Optimal Bandwidth Selection in Heteroskedasticity-Autocorrelation Robust Testing


Optimal Bandwidth Selection in Heteroskedasticity-Autocorrelation Robust Testing

Yixiao Sun, Department of Economics, University of California, San Diego
Peter C. B. Phillips, Cowles Foundation, Yale University, University of Auckland & University of York
Sainan Jin, Guanghua School of Management, Peking University

January 2005.

An earlier version of the paper was presented at the 9th World Congress of the Econometric Society in London. The authors thank Whitney Newey and two anonymous referees for helpful comments. Phillips acknowledges partial research support from the Kelly Foundation and the NSF under Grant No. SES. Jin acknowledges financial support from the NSFC (Grant No. 76).

ABSTRACT

In time series regressions with nonparametrically autocorrelated errors, it is now standard empirical practice to use kernel-based robust standard errors that involve some smoothing function over the sample autocovariances. The underlying smoothing parameter b, which can be defined as the ratio of the bandwidth (or truncation lag) to the sample size, is a tuning parameter that plays a key role in determining the asymptotic properties of the standard errors and associated semiparametric tests. Small-b asymptotics involve standard limit theory such as standard normal or chi-squared limits, whereas fixed-b asymptotics typically lead to nonstandard limit distributions involving Brownian bridge functionals. The present paper shows that the nonstandard fixed-b limit distributions of such nonparametrically studentized tests provide more accurate approximations to the finite sample distributions than the standard small-b limit distribution. In particular, using asymptotic expansions of both the finite sample distribution and the nonstandard limit distribution, we confirm that the second-order corrected critical value based on the expansion of the nonstandard limiting distribution is also second-order correct under the standard small-b asymptotics. We further show that, for typical economic time series, the optimal bandwidth that minimizes a weighted average of type I and type II errors is larger by an order of magnitude than the bandwidth that minimizes the asymptotic mean squared error of the corresponding long-run variance estimator. A plug-in procedure for implementing this optimal bandwidth is suggested and simulations confirm that the new plug-in procedure works well in finite samples.

JEL Classification: C13; C14; C22; C51

Keywords: Asymptotic expansion, bandwidth choice, kernel method, long-run variance, loss function, nonstandard asymptotics, robust standard error, Type I and Type II errors

1 Introduction

In time series regressions with autocorrelation of unknown form, the standard errors of regression coefficients are usually estimated nonparametrically by kernel-based methods that involve some smoothing over the sample autocovariances. The underlying smoothing parameter (b) may be defined as the ratio of the bandwidth to the sample size and is an important tuning parameter that determines the size and power properties of the associated test. It therefore seems sensible that the choice of b should take these properties into account. However, in conventional approaches (e.g., Andrews, 1991, and Newey and West, 1987, 1994) and most practical software, the parameter b is chosen to minimize the asymptotic mean squared error (AMSE) of the long-run variance (LRV) estimator. This approach follows what has long been standard practice in the context of spectral estimation (Grenander and Rosenblatt, 1957; Hannan, 1970), where the focus of attention is the spectrum (or, in the present context, the LRV). Such a choice of the smoothing parameter is designed to be optimal in the AMSE sense for the estimation of the relevant quantity (here, the asymptotic standard error or LRV), but is not necessarily best suited for hypothesis testing and confidence interval construction.

In contrast to the above convention, the present paper develops a new approach to choosing the smoothing parameter. We focus on two-sided tests and consider choosing b to optimize a loss function that involves a weighted average of the type I and type II errors, a criterion that addresses the central concerns of interest in hypothesis testing, balancing possible size distortion against possible power loss in the use of different bandwidths. This new approach to automatic bandwidth selection requires improved measurement of the type I and type II errors, which is provided here by means of asymptotic expansions of both the finite sample distribution of the test statistic and the nonstandard limit distribution.

We first examine the asymptotic properties of the statistical test under different choices of b. Using a Gaussian location model, we show that the distribution of the conventionally constructed t-statistic is closer to the limit distribution derived under fixed-b asymptotics than to that derived under small-b asymptotics. More specifically, when b is fixed, the error in the rejection probability (ERP) of the nonstandard t-test is of order O(T⁻¹), while that of the standard normal test is O(1). This result is related to that of Jansson (2004), who shows that the ERP of the nonstandard test based on the Bartlett kernel with b = 1 is of order O(T⁻¹ log T). Our result strengthens and generalizes Jansson's result in two respects. First, we show that the log(T) factor can be dropped. Second, while Jansson's result applies only to the Bartlett kernel with b = 1, our result applies to more general kernels with both b = 1 and b < 1. On the other hand, when b decreases with the sample size, the ERP of the nonstandard t-test is smaller than that of the standard normal test, although they are of the same order of magnitude. As a consequence, the nonstandard t-test has less size distortion than the standard normal test. Our analytical findings here support earlier simulation results in Kiefer, Vogelsang and Bunzel (2000, hereafter KVB), Kiefer and Vogelsang (2002a, 2005, hereafter KV), Vogelsang (2003), Gonçalves and Vogelsang (2005) and ourselves (2005).

Our theoretical development is based on two higher order asymptotic expansions. The first is the expansion of the nonstandard limiting distribution around its limiting chi-squared form. The second is the expansion of the finite sample distribution of the t-statistic.

The latter expansion is taken under conventional joint limits in which the sample size T → ∞ and b → 0 simultaneously. These higher order expansions enable us to develop improved approximations to the type I and type II errors. More specifically, the type I error is measured by using the first correction term in the asymptotic expansion of the finite sample distribution of the test statistic about its nonstandard limit distribution. This term is of order O((bT)^{-q}), where q is the Parzen characteristic exponent that measures the degree of smoothness of the kernel used. For typical economic time series, this term increases as b decreases for any given T. Similarly, the expansion under the local alternative reveals that in general the type II error decreases as b decreases. Thus, to this order in the asymptotic expansion, decreasing b reduces the type II error but also increases the type I error. Since the desirable effects on these two types of errors generally work in opposing directions, there is an opportunity to trade off these effects. Accordingly, we construct a loss function criterion by taking a weighted average of these two types of errors and show how b may be selected in such a way as to minimize the loss. This method of choosing b is consistent with the Neyman principle under which the probability of the type II error is minimized after controlling for the probability of the type I error (see, for example, Gourieroux and Monfort, 1995).

Our approach gives an optimal b which generally has a shrinking rate of at most b = O(T^{-q/(q+1)}) and which can even be O(1) for certain loss functions, depending on the weights that are chosen. Note that the optimal b that minimizes the asymptotic mean squared error of the corresponding LRV estimator is of order O(T^{-2q/(2q+1)}) (cf. Andrews (1991)). Thus, optimal values of b for LRV estimation are smaller as T → ∞ than those which are most suited for statistical testing. The fixed-b rule is obtained by attaching substantially higher weight to the type I error in the construction of the loss function. This theory therefore provides some insight into the type of loss function for which there is a decision theoretic justification for the use of fixed-b rules in econometric testing.

To implement the optimal bandwidth selection rule for two-sided tests in a location model, users may proceed in the following steps (a code sketch of Steps 4-7 is given after the roadmap below):

1. Specify the null hypothesis H₀: β = β₀ and an alternative hypothesis H₁: |β - β₀| = c > 0, where c may reflect a value of scientific interest or economic significance if such a value is available. (In the absence of such a value, we recommend that the user set the default discrepancy parameter value δ = 2 in (1) below.)

2. Specify the significance level α of the test and the associated two-sided standard normal critical value z_α satisfying Φ(z_α) = 1 - α/2.

3. Specify a weighted average loss function (see (69) below) capturing the relative importance of the type I and II errors and reflecting the relative cost of false acceptance and false rejection. Normalize the weight for the type II error to unity and let w be the weight for the type I error.

4. Estimate the model by OLS and construct the residuals û_t.

5. Fit an AR(1) model to the estimated residuals and compute

d̂ = 2ρ̂/(1 - ρ̂)²,  σ̂² = T⁻¹ Σ_{t=2}^{T} (û_t - ρ̂ û_{t-1})²,  δ = √T c (1 - ρ̂)/σ̂, (1)

where ρ̂ is the OLS estimator of the AR coefficient. Set δ = 2 as a default value if the user is unsure about the alternative hypothesis.

6. Specify the kernel function to be used in HAC standard error estimation. Among the commonly-used positive definite kernels, we recommend a suitable second order kernel (q = 2) such as the Parzen or quadratic spectral (QS) kernel.

7. Compute the automatic bandwidth:

b̂ = { 2g d̂ [w G'₀(z²_α) - G'_δ(z²_α)] / (c₂ z²_α K_δ(z_α)) }^{1/3} T^{-2/3}  if d̂ [w G'₀(z²_α) - G'_δ(z²_α)] > 0,
b̂ = (log T)/T  otherwise, (2)

where

g = 6 for the Parzen kernel and g = 1.42 for the QS kernel; c₂ = 0.539 for the Parzen kernel and c₂ = 1.0 for the QS kernel; (3)

G_δ(·) is the cdf of a (non)central chi-squared variate with one degree of freedom and noncentrality parameter δ², and K_δ(·) is defined in equation (41).

8. Compute the HAC standard error using bandwidth M̂ = b̂T and construct the usual t-statistic t_b̂.

9. Let z_{α,b̂} = z_α + k₃ b̂ + k₄ b̂², where k₃ and k₄ are given in Table II. Reject the null hypothesis if |t_b̂| ≥ z_{α,b̂}.

The rest of the paper is organized as follows. Section 2 reviews the first order limit theory for the t-test as T → ∞ with the parameter b fixed and as T → ∞ with b approaching zero. Section 3 develops an asymptotic expansion of the nonstandard distribution under the null and local alternative hypotheses as b → 0, taken about the usual central and noncentral chi-squared distributions. The higher-order terms in this asymptotic expansion under the null deliver correction terms that can be used to adjust the critical values in the usual chi-squared test. Section 4 develops comparable expansions of the finite sample distribution of the statistic as T → ∞ and b → 0 at the same time. Section 5 compares the accuracy of the nonstandard approximation with that of the standard normal approximation. Section 6 proposes a selection rule for b that is suitable for implementation in semiparametric testing. The subsequent section reports simulation evidence on the performance of the new procedure. The last section provides some concluding discussion. Proofs and additional technical results are given in the Appendix.
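To make Steps 4-7 concrete, here is a minimal Python sketch of the plug-in calculation for a q = 2 kernel. It follows the formulas as reconstructed in (1)-(3) and (41); the helper names, the default choices δ = 2 and w = 10, the simulated data, and the series truncation in K_δ are illustrative assumptions rather than part of the paper, so the constants should be checked against the published version before serious use.

```python
import numpy as np
from scipy.stats import chi2, ncx2, norm, poisson

def ar1_plugin(u_hat, c, T):
    """Step 5: fit an AR(1) to the OLS residuals and return (rho_hat, d_hat, delta)."""
    rho = np.sum(u_hat[1:] * u_hat[:-1]) / np.sum(u_hat[:-1] ** 2)
    sig = np.sqrt(np.mean((u_hat[1:] - rho * u_hat[:-1]) ** 2))
    d_hat = 2.0 * rho / (1.0 - rho) ** 2          # AR(1) plug-in value of d for q = 2
    delta = np.sqrt(T) * c * (1.0 - rho) / sig    # discrepancy parameter in (1)
    return rho, d_hat, delta

def K_delta(z, delta, nterms=60):
    """Series reconstruction of K_delta(z) in (41): a Poisson mixture of chi-square densities."""
    x, js = z ** 2, np.arange(1, nterms)
    pj = poisson.pmf(js, delta ** 2 / 2.0)
    return float(np.sum(pj * js / x * chi2.pdf(x, df=1 + 2 * js)))

def auto_bandwidth(d_hat, delta, T, alpha=0.05, w=10.0, kernel="parzen"):
    """Step 7: automatic bandwidth (2) for a second-order kernel."""
    g, c2 = (6.0, 0.539) if kernel == "parzen" else (1.42, 1.0)   # QS otherwise
    z = norm.ppf(1.0 - alpha / 2.0)
    bracket = w * chi2.pdf(z ** 2, df=1) - ncx2.pdf(z ** 2, df=1, nc=delta ** 2)
    if d_hat * bracket <= 0.0:
        return np.log(T) / T
    return (2.0 * g * d_hat * bracket / (c2 * z ** 2 * K_delta(z, delta))) ** (1 / 3) * T ** (-2 / 3)

# Example with simulated AR(1) data (hypothetical settings)
rng = np.random.default_rng(0)
T, rho0 = 200, 0.6
u = np.zeros(T)
for t in range(1, T):
    u[t] = rho0 * u[t - 1] + rng.standard_normal()
y = 1.0 + u                       # location model with beta = 1
u_hat = y - y.mean()              # Step 4: OLS residuals
rho_hat, d_hat, delta = ar1_plugin(u_hat, c=0.5, T=T)
print(auto_bandwidth(d_hat, delta, T))
```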

2 Heteroskedasticity-Autocorrelation Robust Inference

Throughout the paper, we focus on inference about β in the location model:

y_t = β + u_t, t = 1, 2, ..., T, (4)

where u_t is a zero mean process with a nonparametric autocorrelation structure. The nonstandard limiting distribution in this section and its asymptotic expansion in Section 3 apply to general regression models under certain conditions on the regressors; see KV (2002a, 2002b, 2005). On the other hand, the asymptotic expansion of the finite sample distribution in Section 4 applies only to the location model. Although the location model is of interest in its own right, this is a limitation of the paper and some possible extensions are discussed in Section 8.

OLS estimation of β gives β̂ = Ȳ = T⁻¹ Σ_{t=1}^{T} y_t, and the scaled and centred estimation error is

√T(β̂ - β) = T^{-1/2} S_T, (5)

where S_t = Σ_{τ=1}^{t} u_τ. Let û_t = y_t - β̂ be the demeaned time series and Ŝ_t = Σ_{τ=1}^{t} û_τ be the corresponding partial sum process. The following condition is commonly used to facilitate the limit theory (e.g., KVB and Jansson (2004)).

Assumption 1: S_{[Tr]} satisfies the functional law

T^{-1/2} S_{[Tr]} ⇒ ω W(r), r ∈ [0, 1], (6)

where ω² is the long-run variance of u_t and W(r) is standard Brownian motion.

Under Assumption 1,

T^{-1/2} Ŝ_{[Tr]} ⇒ ω V(r), r ∈ [0, 1], (7)

where V(r) is a standard Brownian bridge process, and

√T(β̂ - β) ⇒ ω W(1) = N(0, ω²), (8)

which provides the usual basis for robust testing about β. It is standard empirical practice to estimate ω² using kernel-based nonparametric estimators that involve some smoothing and possibly truncation of the autocovariances. When u_t is stationary, the long-run variance of u_t is

ω² = γ(0) + 2 Σ_{j=1}^{∞} γ(j), (9)

where γ(j) = E(u_t u_{t-j}). Correspondingly, heteroskedasticity-autocorrelation consistent (HAC) estimates of ω² typically have the form

ω̂²(M) = Σ_{j=-T+1}^{T-1} k(j/M) γ̂(j), with γ̂(j) = T⁻¹ Σ_{t=1}^{T-|j|} û_{t+|j|} û_t, (10)

which involves the sample autocovariances γ̂(j).

In (10), k(·) is some kernel function and M is a bandwidth parameter. Consistency of ω̂²(M) requires M → ∞ and M/T → 0 as T → ∞ (e.g. Andrews (1991), Andrews and Monahan (1992), Hansen (1992), Newey and West (1987, 1994), de Jong and Davidson (2000)). Jansson (2002) provides a recent overview and weak conditions for consistency of such estimates.

To test the null H₀: β = β₀ against H₁: β ≠ β₀, the standard approach relies on a nonparametrically studentized t-ratio statistic of the form

t_{ω̂(M)} = T^{1/2}(β̂ - β₀)/ω̂(M), (11)

which is asymptotically N(0, 1). Use of t_{ω̂(M)} is convenient empirically and therefore widespread in practice, in spite of well-known problems of size distortion in inference.

To reduce size distortion, KVB and KV proposed the use of kernel-based estimators of ω² in which M is set proportional to T, i.e. M = bT for some b ∈ (0, 1]. In this case, the estimator ω̂² becomes

ω̂²_b = Σ_{j=-T+1}^{T-1} k(j/(bT)) γ̂(j), (12)

and the associated t-statistic is given by

t_b = T^{1/2}(β̂ - β₀)/ω̂_b. (13)

When the parameter b is fixed as T → ∞, KV showed that under Assumption 1, ω̂²_b ⇒ ω² Ξ_b, where the limit is random and given by

Ξ_b = ∫₀¹∫₀¹ k_b(r - s) dV(r) dV(s), (14)

with k_b(·) = k(·/b), and the t_b-statistic has a nonstandard limit distribution. Under the null hypothesis,

t_b ⇒ W(1)/Ξ_b^{1/2}, (15)

whereas under the local alternative H₁: β = β₀ + cT^{-1/2},

t_b ⇒ (δ + W(1))/Ξ_b^{1/2}, (16)

where δ = c/ω. Thus, the t_b-statistic has a nonstandard limit distribution arising from the random limit of the LRV estimate ω̂²_b when b is fixed as T → ∞. However, as b decreases, the effect of this randomness diminishes, and when b → 0 the limit distributions under the null and local alternative both approach those of conventional regression tests with consistent LRV estimates.

It is important to point out that in the nonstandard limit distribution W(1) is independent of Ξ_b. This is because W(1) is uncorrelated with V(r) for any r ∈ [0, 1] and both W(1) and V(r) are Gaussian.
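The objects in (10)-(16) are straightforward to compute. The following sketch implements the kernel LRV estimate and the fixed-b statistic t_b for the location model; the Bartlett kernel and the illustrative bandwidth are placeholder choices, and the loop-based autocovariance computation favors clarity over speed.

```python
import numpy as np

def bartlett(x):
    """Bartlett kernel: k(x) = 1 - |x| for |x| <= 1, and 0 otherwise."""
    return np.maximum(1.0 - np.abs(x), 0.0)

def lrv(u_hat, M, kernel=bartlett):
    """Kernel LRV estimate (10): sum over |j| < T of k(j/M) * gamma_hat(j)."""
    T = len(u_hat)
    omega2 = np.sum(u_hat ** 2) / T                      # gamma_hat(0)
    for j in range(1, T):
        w = kernel(j / M)
        if w == 0.0:
            continue
        omega2 += 2.0 * w * np.sum(u_hat[j:] * u_hat[:T - j]) / T
    return omega2

def t_b(y, beta0, b, kernel=bartlett):
    """Fixed-b studentized statistic (13) with M = b*T in the location model."""
    T = len(y)
    beta_hat = y.mean()
    u_hat = y - beta_hat
    return np.sqrt(T) * (beta_hat - beta0) / np.sqrt(lrv(u_hat, b * T, kernel))

# Illustration on white-noise data
rng = np.random.default_rng(0)
y = 0.5 + rng.standard_normal(200)
print(t_b(y, beta0=0.5, b=0.3))
```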

In related work, the present authors (2005a, 2005b, hereafter PSJa, PSJb) propose using an estimator of ω² of the form

ω̂²_ρ = Σ_{j=-T+1}^{T-1} k^ρ(j/T) γ̂(j), (17)

which involves setting M equal to T and taking an arbitrary power ρ of the traditional kernel. Statistical tests based on ω̂²_b and ω̂²_ρ share many of the same properties, which is explained by the fact that 1/b and ρ play similar roles in the construction of the estimates. The present paper focuses on ω̂²_b and tests associated with this estimate. Comparable ideas and methods to those explored in the present paper may be pursued in the context of estimates such as ω̂²_ρ and are reported in other work (PSJc and PSJd).

3 Expansion of the Nonstandard Limit Theory

This section develops asymptotic expansions of the limit distributions given in (15) and (16) as the bandwidth parameter b → 0. These expansions are taken about the relevant central and noncentral chi-squared limit distributions that apply when b → 0, corresponding to the null and local alternative hypotheses. The expansions of the nonstandard limit distributions are of some independent interest. For instance, they can be used to deliver correction terms to the limit distributions under the null, thereby providing a mechanism for adjusting the nominal critical values provided by the usual chi-squared distribution. The latter correspond to the critical values that would be used for tests based on conventional consistent HAC estimates.

The asymptotic expansions and later developments in the paper make use of the following kernel conditions.

Assumption 2: (i) k(x): R → [0, 1] is symmetric, piecewise smooth with k(0) = 1 and ∫₀^∞ k(x) x dx < ∞. (ii) The Parzen characteristic exponent, defined by

q = max{ q₀ : q₀ ∈ Z⁺, g_{q₀} = lim_{x→0} (1 - k(x))/|x|^{q₀} < ∞ }, (18)

is greater than or equal to 1. (iii) k(x) is positive semidefinite, i.e., for any square integrable function f(x), ∫∫ k(s - t) f(s) f(t) ds dt ≥ 0.

Assumption 2 imposes only mild conditions on the kernel function. All the commonly used kernels satisfy (i) and (ii). The condition ∫₀^∞ k(x) x dx < ∞ ensures that the integrals appearing frequently in our later developments are finite; it also enables us to use the Riemann-Lebesgue lemma in our proofs. The positive semidefiniteness in (iii) ensures that the associated LRV estimator is nonnegative. Commonly used kernels that are positive semidefinite include the Bartlett kernel, the Parzen kernel and the QS kernel, which are the main focus of the present paper. For the Bartlett kernel, the Parzen characteristic exponent is 1. For the Parzen and QS kernels, the Parzen characteristic exponent is 2.

We proceed to establish the asymptotic expansion of the nonstandard limiting distribution. Let G_δ = G_δ(·) be the cdf of a noncentral chi-squared variate with one degree of freedom and noncentrality parameter δ². Then

P{(δ + W(1))²/Ξ_b ≤ z²} = E G_δ(z²Ξ_b), (19)

an equality that holds in view of the independence of W(1) and Ξ_b. Set μ = E(Ξ_b). A fourth-order Taylor expansion yields

G_δ(z²Ξ_b) = G_δ(z²μ) + G'_δ(z²μ)z²(Ξ_b - μ) + (1/2)G''_δ(z²μ)z⁴(Ξ_b - μ)² + (1/6)G'''_δ(z²μ)z⁶(Ξ_b - μ)³ + (1/24)G''''_δ(z̃)z⁸(Ξ_b - μ)⁴, (20)

where z̃ lies on the line segment between z²μ and z²Ξ_b. Taking expectations on both sides and using the fact that |G''''_δ(z̃)z⁸| ≤ C for some constant C, we have

E G_δ(z²Ξ_b) = G_δ(z²μ) + (1/2)G''_δ(z²μ)z⁴ E(Ξ_b - μ)² + (1/6)G'''_δ(z²μ)z⁶ E(Ξ_b - μ)³ + O(E(Ξ_b - μ)⁴), (21)

as b → 0, where the O(·) term holds uniformly over z ∈ R⁺.

To characterize the asymptotic behavior of E(Ξ_b - μ)^m as b → 0, we write

Ξ_b = ∫₀¹∫₀¹ k*_b(r, s) dW(r) dW(s), (22)

where k*_b(r, s) is defined by

k*_b(r, s) = k_b(r - s) - ∫₀¹ k_b(r - t) dt - ∫₀¹ k_b(τ - s) dτ + ∫₀¹∫₀¹ k_b(t - τ) dt dτ.

Here, k*_b(r, s) is simply the projection of k_b(r - s) onto the Hilbert subspace

{ f(r, s) ∈ L²([0, 1]²) : ∫₀¹ f(r, s) dr = 0, ∫₀¹ f(r, s) ds = 0, ∫₀¹∫₀¹ f(r, s) dr ds = 0 }.

Since k(r - s) is positive semidefinite, by Mercer's theorem (e.g., see Shorack and Wellner (1986)), k(r - s) can be represented as k(r - s) = Σ_{n=1}^{∞} λ_n f_n(r) f_n(s), where λ_n > 0 are the eigenvalues of the kernel and f_n(x) are the corresponding eigenfunctions, i.e. λ_n f_n(s) = ∫ k(r - s) f_n(r) dr. Since λ_n > 0, k*_b(r, s) is also positive semidefinite because it may be written as

k*_b(r, s) = Σ_{n=1}^{∞} λ_n g_n(r/b) g_n(s/b) for any (r, s) ∈ [0, 1] × [0, 1], (23)

where g_n(r) = f_n(r) - b ∫₀^{1/b} f_n(τ) dτ. In consequence, for any function q(x) ∈ L²(R⁺), we have

∫₀¹∫₀¹ q(r) k*_b(r, s) q(s) dr ds = Σ_{n=1}^{∞} λ_n ( ∫₀¹ g_n(r/b) q(r) dr )² ≥ 0. (24)

Using Mercer's theorem again, we therefore have

k*_b(r, s) = Σ_{n=1}^{∞} λ*_n f*_n(r) f*_n(s), (25)

where λ*_n > 0 are the eigenvalues of k*_b and f*_n(r) are the corresponding eigenfunctions, i.e. λ*_n f*_n(s) = ∫₀¹ k*_b(r, s) f*_n(r) dr. Using the representation (25), we can write Ξ_b as Ξ_b = Σ_{n=1}^{∞} λ*_n Z_n², where the Z_n are iid N(0, 1). Therefore, the characteristic function of Ξ_b - μ is given by

ψ(t) = E exp{it(Ξ_b - μ)} = e^{-itμ} Π_{n=1}^{∞} (1 - 2itλ*_n)^{-1/2}, (26)

and the cumulant generating function is

ln ψ(t) = -itμ - (1/2) Σ_{n=1}^{∞} ln(1 - 2itλ*_n) = Σ_{m=1}^{∞} κ_m (it)^m / m!. (27)

Let κ₁, κ₂, κ₃, ... be the cumulants of Ξ_b - μ. Then κ₁ = 0 and

κ_m = 2^{m-1}(m - 1)! Σ_{n=1}^{∞} (λ*_n)^m for m ≥ 2. (28)

Some algebraic manipulation shows that, for m ≥ 2,

κ_m = 2^{m-1}(m - 1)! ∫_{[0,1]^m} ( Π_{j=1}^{m} k*_b(τ_j, τ_{j+1}) ) dτ₁ ⋯ dτ_m, (29)

where τ_{m+1} = τ₁. With these preliminaries, we are able to find the dominating terms in the moments E(Ξ_b - μ)^m, m = 2, 3, 4, and to develop an asymptotic expansion of P{(δ + W(1))²/Ξ_b ≤ z²} as the bandwidth parameter b → 0. In fact, a full series expansion is possible using this method, but our purpose here requires only the leading terms.

Theorem 1: Let F_b(z) := P{(δ + W(1))²/Ξ_b ≤ z²} be the nonstandard limiting distribution. Then, under Assumption 2,

F_b(z) = G_δ(z²) + p_δ(z²) b + q_δ(z²) b² + o(b²), (30)

where the o(b²) term holds uniformly over z ∈ R⁺ as b → 0,

p_δ(z²) = c₂ G''_δ(z²) z⁴ - c₁ G'_δ(z²) z², (31)

q_δ(z²) is the O(b²) coefficient, a polynomial in z² involving the constants below together with the first three derivatives of G_δ, and

c₁ = ∫ k(x) dx, c₂ = ∫ k²(x) dx, c₃ = ∫ k(x)|x| dx, c₄ = ∫ k²(x)|x| dx. (32)
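The kernel constants c₁-c₄ in (32) and the Parzen quantities (q, g_q) are easy to evaluate numerically. The sketch below does this by quadrature for the Bartlett, Parzen and QS kernels; the integration cutoff for the QS kernel is a numerical approximation, and the tabulated g_q values are the standard ones rather than outputs of the code.

```python
import numpy as np
from scipy.integrate import quad

def bartlett(x):
    return max(1.0 - abs(x), 0.0)

def parzen(x):
    a = abs(x)
    if a <= 0.5:
        return 1.0 - 6.0 * a ** 2 + 6.0 * a ** 3
    if a <= 1.0:
        return 2.0 * (1.0 - a) ** 3
    return 0.0

def qs(x):
    if x == 0.0:
        return 1.0
    a = 6.0 * np.pi * x / 5.0
    return 25.0 / (12.0 * np.pi ** 2 * x ** 2) * (np.sin(a) / a - np.cos(a))

def constants(k, upper):
    """c1 = int k, c2 = int k^2, c3 = int k(x)|x|, c4 = int k^2(x)|x|, over the real line."""
    c1 = 2.0 * quad(k, 0.0, upper, limit=200)[0]
    c2 = 2.0 * quad(lambda x: k(x) ** 2, 0.0, upper, limit=200)[0]
    c3 = 2.0 * quad(lambda x: k(x) * x, 0.0, upper, limit=200)[0]
    c4 = 2.0 * quad(lambda x: k(x) ** 2 * x, 0.0, upper, limit=200)[0]
    return c1, c2, c3, c4

# (kernel, integration cutoff, q, g_q); g_q values from the usual kernel formulas
for name, k, upper, q, gq in [("Bartlett", bartlett, 1.0, 1, 1.0),
                              ("Parzen",   parzen,   1.0, 2, 6.0),
                              ("QS",       qs,      60.0, 2, 1.4212)]:
    print(name, "q =", q, "g_q =", gq, [round(c, 4) for c in constants(k, upper)])
```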

As is apparent from the proof of the theorem, the term c₂ G''_δ(z²) z⁴ in p_δ(z²) arises from the randomness of Ξ_b, whereas the term c₁ G'_δ(z²) z² arises from the asymptotic bias of Ξ_b. Although Ξ_b converges to 1 as b → 0, we have var(Ξ_b) = 2c₂ b (1 + o(1)) and E(Ξ_b) - 1 = -c₁ b (1 + o(1)), as is established in the proof. Ξ_b is not centered exactly at 1 because the regression errors have to be estimated. The terms in q_δ(z²) are due to the first and second order biases of Ξ_b, the variance of Ξ_b and their interactions.

It follows from Theorem 1 that when δ = 0,

F_b(z) = D(z²) + [c₂ D''(z²) z⁴ - c₁ D'(z²) z²] b + q₀(z²) b² + o(b²), (33)

where D(·) = G₀(·) is the cdf of the chi-squared distribution with one degree of freedom. For any α ∈ (0, 1), let z_α ∈ R⁺ and z_{α,b} ∈ R⁺ be such that D(z²_α) = 1 - α and F_b(z_{α,b}) = 1 - α. Then, using a Cornish-Fisher type expansion, we can obtain high-order corrected critical values.

Before presenting the corollary below, we introduce some terminology. We call the critical value that is correct up to the O(b) order the second-order corrected critical value, and the critical value that is correct up to the O(b²) order the third-order corrected critical value. We use this convention throughout the rest of the paper. The following quantities appear in the corollary: the O(b) coefficients are

k₁ = c₁ z²_α + (1/2) c₂ z²_α + (1/2) c₂ z⁴_α, (34)

k₃ = (1/2) c₁ z_α + (1/4) c₂ z_α + (1/4) c₂ z³_α, (36)

while k₂ and k₄ in (35) and (37) are the corresponding O(b²) coefficients, polynomials in z_α whose coefficients depend on c₁, c₂, c₃ and c₄.

Corollary 2: For asymptotic chi-square and normal tests, (i) the second-order corrected critical values are

z²_{α,b} = z²_α + k₁ b + o(b),  z_{α,b} = z_α + k₃ b + o(b); (38)

(ii) the third-order corrected critical values are

z²_{α,b} = z²_α + k₁ b + k₂ b² + o(b²),  z_{α,b} = z_α + k₃ b + k₄ b² + o(b²), (39)

where z_α is the nominal critical value from the standard normal distribution.

Our later developments require only the second-order corrected critical values. The third-order corrected critical values are given here because the second-order correction is not enough to deliver a good approximation to the exact critical values when the Parzen and QS kernels are employed and b is large. For the Bartlett, Parzen and QS kernels, we can compute c₁, c₂, c₃ and c₄ either analytically or numerically. We obtain the high-order corrected critical values in Table I.

Table I. High Order Corrected Critical Values: z²_{α,b} = z²_α + k₁b + k₂b² + o(b²), z_{α,b} = z_α + k₃b + k₄b² + o(b²). The table reports k₁, k₂, k₃ and k₄ for the Bartlett, Parzen and QS kernels at α = 5% (z_α = 1.96) and α = 10% (z_α = 1.645). [Numerical entries not recoverable here.]

We can evaluate the accuracy of the approximate critical values by comparing them with the exact ones obtained via simulations. Comparing the second-order and third-order corrected critical values, we find that the second-order corrected critical value is more accurate for the Bartlett kernel, while the third-order corrected critical value is more accurate for the Parzen and QS kernels. In both cases, the higher order corrected critical values provide excellent approximations to the exact ones when b is smaller than 0.5 and reasonably good approximations for other values of b.

It is important to point out that the exact critical values can be simulated, and the high-order asymptotic expansion is developed here for two reasons. First, the resulting high-order corrected critical values are easy to calculate and convenient to use in practical testing situations. In particular, the high-order corrected critical values will be used in developing the optimal bandwidth selection rule below. Second, the asymptotic expansion helps shed some new light on the accuracy of the nonstandard approximation. In the next section, we will show that the second-order corrected critical values based on the asymptotic expansion of the nonstandard distribution are also second-order correct under the conventional asymptotics. As a result, the nonstandard limiting distribution is closer to the exact distribution than the standard normal distribution even when b → 0; see Section 5.

When δ ≠ 0 and the second-order corrected critical values are used, we can use Theorem 1 to calculate the local asymptotic power, measured by P{(δ + W(1))²/Ξ_b > z²_{α,b}}, as in the following corollary.

Corollary 3: Let Assumption 2 hold. Then, as b → 0, the local asymptotic power satisfies

P{(δ + W(1))²/Ξ_b > z²_{α,b}} = 1 - G_δ(z²_α) - c₂ K_δ(z_α) z⁴_α b + o(b), (40)

where

K_δ(z) = Σ_{j=1}^{∞} [δ^{2j} / (2^j (j-1)!)] e^{-δ²/2} z^{2j-3} e^{-z²/2} / (2^{j+1/2} Γ(j + 1/2)) (41)

is positive for all z and all δ ≠ 0.
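As noted above, the exact fixed-b critical values can also be obtained by simulation. A minimal sketch: apply the fixed-b statistic to iid Gaussian samples with a moderately large T (so that the finite-sample law is close to that of W(1)/Ξ_b^{1/2}) and take the empirical quantile of |t_b|. The sample size, replication count and kernel below are illustrative choices, and the run is kept deliberately small.

```python
import numpy as np

def bartlett(x):
    return np.maximum(1.0 - np.abs(x), 0.0)

def t_b_null(u, b, kernel=bartlett):
    """Fixed-b t-statistic (13) for a pure-noise sample under the null (beta = beta0 = 0)."""
    T = len(u)
    u_hat = u - u.mean()
    omega2 = np.sum(u_hat ** 2) / T
    for j in range(1, T):
        w = kernel(j / (b * T))
        if w == 0.0:
            break                                  # Bartlett weights vanish beyond the bandwidth
        omega2 += 2.0 * w * np.sum(u_hat[j:] * u_hat[:T - j]) / T
    return np.sqrt(T) * u.mean() / np.sqrt(omega2)

def simulated_cv(b, alpha=0.05, T=200, reps=2000, seed=0):
    rng = np.random.default_rng(seed)
    stats = np.abs([t_b_null(rng.standard_normal(T), b) for _ in range(reps)])
    return np.quantile(stats, 1.0 - alpha)

# Compare with the expansion-based value z_alpha + k3*b (Bartlett: c1 = 1, c2 = 2/3)
b, z = 0.3, 1.96
k3 = 0.5 * 1.0 * z + 0.25 * (2.0 / 3.0) * z + 0.25 * (2.0 / 3.0) * z ** 3
print(simulated_cv(b), z + k3 * b)
```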

According to Corollary 3, the local asymptotic test power, as measured by P{(δ + W(1))²/Ξ_b > z²_{α,b}}, decreases monotonically with b, at least when b is small. Fig. 1 graphs the surface f(z_α, δ) = z⁴_α K_δ(z_α) for different values of z_α and δ. For a given critical value, f(z_α, δ) achieves its maximum at an intermediate value of δ, implying that the power increase resulting from the choice of a small b is greatest when the local alternative is in an intermediate neighborhood of the null hypothesis. For any given local alternative, the function is monotonically increasing in z_α. Therefore, the power improvement due to the choice of a small b increases with the confidence level 1 - α. This is expected: when the confidence level is higher, the test is less powerful and the room for power improvement is greater.

[Figure 1: The graph of f(z_α, δ) = z⁴_α K_δ(z_α) as a function of z_α and δ.]

4 Expansions of the Finite Sample Distribution

This section develops a finite sample expansion for the simple location model. This development, like that of Jansson (2004), relies on Gaussianity, which facilitates the derivations. The assumption could be relaxed by taking distributions based (for example) on Gram-Charlier expansions, but at the cost of much greater complexity (see, for example, Phillips (1980), Taniguchi and Puri (1996), Velasco and Robinson (2001)). The following assumption on u_t facilitates the development of the higher order expansion.

Assumption 3: u_t is a mean zero covariance-stationary Gaussian process with Σ_{h=-∞}^{∞} h² |γ(h)| < ∞, where γ(h) = E(u_t u_{t-h}).

We develop an asymptotic expansion of P{|√T(β̂ - β₀)/ω̂_b| ≤ z} for β = β₀ + c/√T. Depending on whether c is zero or not, this expansion can be used to approximate the size and power of the t-test. Since u_t is in general autocorrelated, β̂ and ω̂²_b are statistically dependent, which makes it difficult to write down the finite sample distribution of the t-statistic. To tackle this difficulty, we decompose β̂ and ω̂²_b into statistically independent components.

Let u = (u₁, ..., u_T)', y = (y₁, ..., y_T)', l_T = (1, ..., 1)' and Ω_T = var(u). Then the GLS estimator of β is β̃ = (l_T'Ω_T⁻¹l_T)⁻¹ l_T'Ω_T⁻¹ y and

β̂ = β̃ + (l_T'l_T)⁻¹ l_T'ũ, (42)

where ũ = (I_T - l_T(l_T'Ω_T⁻¹l_T)⁻¹ l_T'Ω_T⁻¹) u, which is statistically independent of β̃. Since ω̂²_b can be written as a quadratic form in ũ, ω̂²_b is also statistically independent of β̃. Next, it is easy to see that

ω²_T := T var(β̂) = T⁻¹ l_T'Ω_T l_T = ω² + O(T⁻¹), (43)

and it follows from Grenander and Rosenblatt (1957) that

ω̃²_T := T var(β̃) = T (l_T'Ω_T⁻¹l_T)⁻¹ = ω² + O(T⁻¹). (44)

Therefore √T(β̂ - β̃) = T^{-1/2} l_T'ũ = N(0, O(T⁻¹)). Combining this result with the independence of β̃ and (ũ, ω̂²_b), we have

P{√T(β̂ - β₀)/ω̂_b ≤ z}
= P{√T(β̃ - β)/ω̂_b + √T(β̂ - β̃)/ω̂_b + c/ω̂_b ≤ z}
= E Φ(zω̂_b/ω̃_T - c/ω̃_T - √T(β̂ - β̃)/ω̃_T)
= E Φ(zω̂_b/ω̃_T - c/ω̃_T) - E φ(zω̂_b/ω̃_T - c/ω̃_T) √T(β̂ - β̃)/ω̃_T + O(T⁻¹)
= P{√T(β̃ - β)/ω̃_T + c/ω̃_T ≤ zω̂_b/ω̃_T} + O(T⁻¹), (45)

uniformly over z ∈ R, where Φ and φ are the cdf and pdf of the standard normal distribution, respectively. The second to last equality follows because ω̂²_b is quadratic in ũ and thus E[φ(zω̂_b/ω̃_T - c/ω̃_T) l_T'ũ] = 0. In a similar fashion we find that

P{√T(β̂ - β₀)/ω̂_b ≥ -z} = P{√T(β̃ - β)/ω̃_T + c/ω̃_T ≥ -zω̂_b/ω̃_T} + O(T⁻¹).

This approximation also holds uniformly over z ∈ R. Therefore

F_{δ,T}(z) := P{|√T(β̂ - β₀)/ω̂_b| ≤ z} = P{|√T(β̃ - β)/ω̃_T + c/ω̃_T| ≤ zω̂_b/ω̃_T} + O(T⁻¹) = E G_δ(z²ζ_{b,T}) + O(T⁻¹), (46)

uniformly over z ∈ R⁺, where ζ_{b,T} := ω̂²_b/ω̃²_T converges weakly to Ξ_b. Setting μ_T = E(ζ_{b,T}) and following the same argument that leads to (21), we have

F_{δ,T}(z) = G_δ(z²μ_T) + (1/2)G''_δ(z²μ_T) z⁴ E(ζ_{b,T} - μ_T)² + (1/6)G'''_δ(z²μ_T) z⁶ E(ζ_{b,T} - μ_T)³ + O(E(ζ_{b,T} - μ_T)⁴) + O(T⁻¹), (47)

where the O(·) terms hold uniformly over z ∈ R⁺. By developing asymptotic expansions of μ_T and E(ζ_{b,T} - μ_T)^m for m = 2, 3, 4, we can establish a higher order expansion of the finite sample distribution for the case where T → ∞ and b → 0 at the same time. This expansion validates for finite samples the use of the second-order corrected critical values given in the previous section, which were derived there on the basis of an expansion of the (nonstandard) limit distribution.

Theorem 4: Let Assumptions 1 and 3 hold. If b → 0 as T → ∞ such that bT → ∞, then

F_{δ,T}(z) = G_δ(z²) + c₂ G''_δ(z²) z⁴ b - c₁ G'_δ(z²) z² b - g_q d_q G'_δ(z²) z² (bT)^{-q} + o(b + (bT)^{-q}) + O(T⁻¹), (48)

where d_q = ω⁻² Σ_{h=-∞}^{∞} |h|^q γ(h), and the o(·) and O(·) terms hold uniformly over z ∈ R⁺.

Under the null hypothesis, δ = 0 and G₀(·) = D(·), so

F_{0,T}(z) = D(z²) + c₂ D''(z²) z⁴ b - c₁ D'(z²) z² b - g_q d_q D'(z²) z² (bT)^{-q} + o(b + (bT)^{-q}) + O(T⁻¹). (49)

The leading two terms (up to order O(b)) in this expansion are the same as those in the corresponding expansion of the limit distribution F_b(z) given in (33) above. Thus, use of the second-order corrected critical values given in (38), which take account of terms up to order O(b), should lead to size improvements when bT^{q/(q+1)} → ∞. The third term in the expansion (49) is O(T^{-q}) when b is fixed. When b decreases with T, this term provides an asymptotic measure of the size distortion in tests based on the use of the first two terms of (49), or equivalently those based on the nonstandard limit theory, at least to order O(b). Thus, the third term of (49) approximately measures how satisfactory the second-order corrected critical values given in (38) are for any given values of b and T.
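The third term of (49) can be used as a quick numerical gauge of the residual size distortion for given b and T. The sketch below evaluates g_q d_q D'(z²_α) z²_α (bT)^{-q} over a grid of b under an AR(1) plug-in for d_q; the AR(1) coefficient, the kernel choice (Parzen, q = 2, g_q = 6) and the grid are illustrative assumptions.

```python
import numpy as np
from scipy.stats import chi2, norm

def size_distortion_term(b, T, rho, alpha=0.05, q=2, g_q=6.0):
    """Leading size-distortion term in (49): g_q * d_q * D'(z^2) * z^2 * (b*T)^(-q),
    with d_q = 2*rho/(1-rho)^2 from the AR(1) plug-in (q = 2)."""
    z = norm.ppf(1.0 - alpha / 2.0)
    d_q = 2.0 * rho / (1.0 - rho) ** 2
    return g_q * d_q * chi2.pdf(z ** 2, df=1) * z ** 2 * (b * T) ** (-q)

for b in (0.02, 0.05, 0.1, 0.3):
    print(b, round(size_distortion_term(b, T=200, rho=0.7), 4))
```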

If critical values from the standard normal are used, then the ERP is approximated by the sum of the O(b) and O((bT)^{-q}) terms in (49), viz.,

e_T(b) = [c₂ D''(z²_α) z⁴_α - c₁ D'(z²_α) z²_α] b - g_q d_q D'(z²_α) z²_α (bT)^{-q}, (50)

which is a simple rational function in b. In (50) the O(b) term is negative, and the O((bT)^{-q}) term is also negative for time series with d_q > 0, as is typical for economic data. In this case, the ERP function e_T(b) is negative, has a negative asymptote at b = 0, and has a simple maximum at

b* = { q g_q d_q D'(z²_α) / [c₁ D'(z²_α) - c₂ D''(z²_α) z²_α] }^{1/(q+1)} T^{-q/(q+1)}. (51)

For this choice of b, e_T(b) is closest to zero, and the resulting best rate of convergence of the ERP to zero is O(T^{-q/(q+1)}). This rate increases with q and can be arbitrarily close to O(T⁻¹) if q is large and the autocovariances decay at an exponential rate. Examples of kernels with q > 2 include the familiar truncated kernel and the flat top kernel proposed by Politis and Romano (1995, 1998); these kernels are not positive semidefinite. We consider only the commonly-used positive semidefinite kernels with q ≤ 2 in this paper and leave the analysis of higher order kernels for future research.

If we construct a two-sided confidence interval based on the asymptotic normality, then the ERP is also the coverage error. Therefore, the optimal b that achieves the greatest coverage accuracy is of order O(T^{-q/(q+1)}) for typical economic time series. As we discuss below, this rate is different from O(T^{-2q/(2q+1)}), the rate that is appropriate for point estimation of the standard error. So, some undersmoothing is required to achieve improved coverage accuracy in confidence intervals.

In the case where d_q < 0, the ERP-optimal b is different from that given in (51). When d_q < 0, the ERP function e_T(b) has a positive asymptote at b = 0, is monotonically decreasing, and passes through the origin as b increases, so the optimal b that achieves the greatest coverage accuracy is the value for which e_T(b) = 0, i.e.,

b° = { -q⁰ g_q d_q D'(z²_α) / [c₁ D'(z²_α) - c₂ D''(z²_α) z²_α] }^{1/(q+1)} T^{-q/(q+1)}, with q⁰ = 1. (52)

This choice of b completely removes the O(b) and O((bT)^{-q}) terms in (49), leading to a coverage error of a smaller order, i.e. o(T^{-q/(q+1)}) instead of O(T^{-q/(q+1)}).

The optimal b given in (51) and (52) is not feasible, as it involves the unknown parameter d_q. In practice, we can estimate d_q using a nonparametric approach; a minimal requirement for this estimator is its consistency. Based on the sign of this estimator, we can decide which of the two formulae to use in practical situations. We leave investigations along these lines to future research.

Velasco and Robinson (2001) established an Edgeworth expansion of P{t ≤ z} for the Gaussian location model. Note that, under the null hypothesis, P{t ≤ z} = (1 + F_{0,T}(z))/2 for z ≥ 0.

Using this relation and the identity

D'(z²) = φ(z)/z,  D''(z²) = -(1/2) φ(z) (z⁻¹ + z⁻³), (54)

we deduce that for z ≥ 0,

P{t ≤ z} = Φ(z) - [(1/4) c₂ (z³ + z) + (1/2) c₁ z] φ(z) b - (1/2) g_q d_q z φ(z) (bT)^{-q} + o(b + (bT)^{-q}) + O(T⁻¹). (55)

Combining this with P{t ≤ -z} = 1 - P{t ≤ z}, we can show that the preceding expansion also holds for z ≤ 0. The expansion (55) is identical to the corresponding equation in Velasco and Robinson (2001) after notational changes and a small correction.¹

Under the local alternative hypothesis, the power of the test based on the second-order corrected critical values is 1 - F_{δ,T}(z_{α,b}). Theorem 4 shows that F_{δ,T}(z_{α,b}) can be approximated by

G_δ(z²_{α,b}) + c₂ G''_δ(z²_{α,b}) z⁴_{α,b} b - c₁ G'_δ(z²_{α,b}) z²_{α,b} b - g_q d_q G'_δ(z²_α) z²_α (bT)^{-q},

with an approximation error of order o((bT)^{-q} + b) + O(T⁻¹). Note that the above approximation depends on δ only through δ², so it is valid under both H₁: β = β₀ + c/√T and H₁: β = β₀ - c/√T. These results on size distortion and local power are formalized in the following corollary.

Corollary 5: Let Assumptions 1 and 3 hold. If b → 0 as T → ∞ and bT → ∞, then:
(a) the size distortion of the two-sided t-test based on the second-order corrected critical values is

(1 - F_{0,T}(z_{α,b})) - α = g_q d_q D'(z²_α) z²_α (bT)^{-q} + o((bT)^{-q} + b) + O(T⁻¹); (56)

¹ There is a mistake in Theorem 5 of Velasco and Robinson (2001). Given their Theorem 4 and the corresponding expansion, the second-order corrected critical value should be w_α = z_α + [(z³_α + z_α) ‖k‖²/4 + z_α K̄(0)/2] (M/N) rather than the expression stated there. Since K̄(0) = c₁ and ‖k‖² = c₂ in their notation, the correction terms of order O(M/N) are then the same as those given in (38) in this paper.

(b) under the local alternative H₁: |β - β₀| = c/√T, the power of the two-sided t-test based on the second-order corrected critical values is

1 - F_{δ,T}(z_{α,b}) = 1 - G_δ(z²_α) - c₂ z⁴_α K_δ(z_α) b + g_q d_q G'_δ(z²_α) z²_α (bT)^{-q} + o((bT)^{-q} + b) + O(T⁻¹). (57)

5 Accuracy of the Nonstandard Approximation

In the previous section, we have shown that when b goes to zero at a certain rate, the ERP of the standard normal or chi-squared test is at least of order O(T^{-q/(q+1)}) for typical economic time series. This section establishes a related result for the nonstandard test when b is fixed and then compares the ERP of these two tests under the same asymptotic specification, i.e. either b fixed or b → 0. As in the previous section, we focus on the Gaussian location model.

In view of (46), the error of the nonstandard approximation is given by

F_{δ,T}(z) - F_b(z) = P{|√T(β̂ - β₀)/ω̂_b| ≤ z} - P{(δ + W(1))²/Ξ_b ≤ z²} = E G_δ(z²ζ_{b,T}) - E G_δ(z²Ξ_b) + O(T⁻¹). (58)

To evaluate the difference E G_δ(z²ζ_{b,T}) - E G_δ(z²Ξ_b), we proceed to compute the cumulants of both ζ_{b,T} and Ξ_b. Since ω̂²_b = û'W_b û/T = u'A_T'W_b A_T u/T, where W_b is the T × T matrix with (j, s)-th element k((j - s)/(bT)) and A_T = I_T - l_T l_T'/T, ζ_{b,T} is a quadratic form in a Gaussian vector. It is easy to show that the characteristic function of ζ_{b,T} - μ_T is given by

ψ_T(t) = | I_T - 2it Ω_T A_T'W_b A_T/(Tω̃²_T) |^{-1/2} exp{-itμ_T}, (59)

where Ω_T = E(uu'), and the cumulant generating function is

ln ψ_T(t) = -(1/2) ln det( I_T - 2it Ω_T A_T'W_b A_T/(Tω̃²_T) ) - itμ_T := Σ_{m=1}^{∞} κ_{m,T} (it)^m/m!, (60)

where the κ_{m,T} are the cumulants of ζ_{b,T} - μ_T. It follows from (60) that κ_{1,T} = 0 and

κ_{m,T} = 2^{m-1}(m - 1)! (Tω̃²_T)^{-m} trace[ (Ω_T A_T'W_b A_T)^m ] for m ≥ 2. (61)

By proving that κ_{m,T} is close to κ_m in the precise sense given in Lemma 1 in the Appendix, we can establish the following theorem, which gives the order of magnitude of the error in the nonstandard limit distribution of t_b as T → ∞ with b fixed. The requirement b < 1/(6 ∫ |k(x)| dx) that appears in the statement of the result is a technical condition in the proof that facilitates the use of a power series expansion. The requirement can be relaxed, but at the cost of more extensive and tedious calculations.

Theorem 6: Let Assumptions 1 and 3 hold. If b < 1/(6 ∫ |k(x)| dx), then, uniformly over z ∈ R⁺ as T → ∞ with b fixed,

F_{δ,T}(z) = F_b(z) + O(T⁻¹). (62)
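The trace formula (61) is the standard expression for the cumulants of a Gaussian quadratic form, so the finite-sample cumulants of ζ_{b,T} can be computed exactly for moderate T and compared with their fixed-b limits κ_m. The sketch below does this for an AR(1) error process and the Bartlett kernel; the parameter values are illustrative.

```python
import math
import numpy as np

def ar1_cov(T, rho, sigma2=1.0):
    """Covariance matrix Omega_T of an AR(1) process with coefficient rho."""
    idx = np.arange(T)
    return sigma2 / (1.0 - rho ** 2) * rho ** np.abs(idx[:, None] - idx[None, :])

def zeta_cumulants(T=100, b=0.3, rho=0.5, m_max=4):
    """Cumulants kappa_{m,T} of zeta_{b,T} = omega_hat_b^2 / omega_tilde_T^2 via (61):
    kappa_m = 2^(m-1) (m-1)! trace(B^m), B = Omega_T A_T' W_b A_T / (T * omega_tilde_T^2)."""
    Omega = ar1_cov(T, rho)
    l = np.ones((T, 1))
    A = np.eye(T) - l @ l.T / T
    idx = np.arange(T)
    Wb = np.maximum(1.0 - np.abs(idx[:, None] - idx[None, :]) / (b * T), 0.0)   # Bartlett weights
    omega_tilde2 = T / (l.T @ np.linalg.solve(Omega, l)).item()
    B = Omega @ A.T @ Wb @ A / (T * omega_tilde2)
    out, Bm = [], np.eye(T)
    for m in range(1, m_max + 1):
        Bm = Bm @ B
        out.append(2 ** (m - 1) * math.factorial(m - 1) * np.trace(Bm))
    return out      # out[0] = E[zeta_{b,T}]; out[1], out[2], ... are the higher cumulants

print([round(k, 4) for k in zeta_cumulants()])
```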

Under the null hypothesis H₀: β = β₀, we have δ = 0. In this case, Theorem 6 indicates that the ERP for tests with fixed b that use critical values obtained from the nonstandard limit distribution of W(1)/Ξ_b^{1/2} is O(T⁻¹). The theorem extends the result of Jansson (2004), who considered only the Bartlett-type kernel with b = 1 and proved that the ERP is of order O(T⁻¹ log T). It is an open question in Jansson (2004) whether the log(T) factor can be omitted. Theorem 6 provides a positive answer to this question.

In the previous section, we showed that when b → 0 and T → ∞ such that bT → ∞,

F_{δ,T}(z) = G_δ(z²) + O(T^{-q/(q+1)}) (63)

for typical economic time series. Comparing (62) with (63), one may conclude that the error of the nonstandard approximation is smaller than that of the standard normal approximation by an order of magnitude. However, the two O(·) terms are obtained under different asymptotic specifications. The O(·) term in (62) holds for fixed b, while the O(·) term in (63) holds for diminishing b. Since the O(·) term in (62) does not hold uniformly over b ∈ (0, 1], the two O(·) terms cannot be directly compared, although they are obviously suggestive of the relative quality of the two approximations when b is small, as it typically will be in practical applications. Indeed, F_b(z) and G_δ(z²) are just different approximations to the same quantity F_{δ,T}(z). To compare the two approximations more formally, we need to evaluate F_{δ,T}(z) - F_b(z) and F_{δ,T}(z) - G_δ(z²) under the same asymptotic specification, i.e. either b fixed or b → 0.

First, when b is fixed, we have

F_{δ,T}(z) - G_δ(z²) = [F_{δ,T}(z) - F_b(z)] + [F_b(z) - G_δ(z²)] = O(1). (64)

This is because F_{δ,T}(z) - F_b(z) = O(T⁻¹), as shown in Theorem 6, and F_b(z) - G_δ(z²) = O(1). Comparing (64) with (62), we conclude that when b is fixed, the error of the nonstandard approximation is smaller than that of the standard approximation by an order of magnitude.

Second, when b = O(T^{-q/(q+1)}), we have

F_{δ,T}(z) - F_b(z) = [F_{δ,T}(z) - G_δ(z²)] - [F_b(z) - G_δ(z²)]
= -g_q d_q G'_δ(z²) z² (bT)^{-q} + o(b + (bT)^{-q}) + O(T⁻¹), (65)

where we have used Theorems 1 and 4. Therefore, when d_q > 0, which is typical for economic time series, the error of the nonstandard approximation is smaller than that of the standard normal approximation, although they are of the same order of magnitude for this choice of b.

We can conclude from the above analysis that the nonstandard distribution provides a more accurate approximation to the finite sample distribution regardless of the asymptotic specification employed. There are two reasons for the better performance.

First, the nonstandard distribution mimics the randomness of the denominator of the t-statistic; second, it accounts for the bias of the LRV estimator resulting from the unobservability of the regression errors. As a result, the critical values from the nonstandard limiting distribution provide a higher order correction on the critical values from the standard normal distribution. However, just as in the standard limiting theory, the nonstandard limiting theory does not deal with another source of bias, i.e. the usual bias that arises in spectral density estimation even when a time series is known to be mean zero and is observed. This second source of bias manifests itself in the error of approximation given in (65).

6 Optimal Bandwidth Choice

It is well known that the optimal choice of b that minimizes the asymptotic mean squared error in LRV estimation has the form b = O(T^{-2q/(2q+1)}). However, there is no reason to expect that such a choice is the most appropriate in statistical testing using nonparametrically studentized statistics. Developing an optimal choice of b for semiparametric testing is not straightforward and involves some conceptual as well as technical challenges. In what follows we provide one possible approach to constructing an optimizing criterion that is based on balancing the type I and type II errors.

In view of the asymptotic expansion (56), we know that the type I error for a two-sided test with nominal size α can be expressed as

1 - F_{0,T}(z_{α,b}) = α + g_q d_q D'(z²_α) z²_α (bT)^{-q} + o((bT)^{-q} + b) + O(T⁻¹). (66)

Similarly, from (57), the type II error has the form

F_{δ,T}(z_{α,b}) = G_δ(z²_α) + c₂ z⁴_α K_δ(z_α) b - g_q d_q G'_δ(z²_α) z²_α (bT)^{-q} + o((bT)^{-q} + b) + O(T⁻¹). (67)

A loss function for the test may be constructed based on the following three factors: (i) the magnitude of the type I error, as measured by the second term of (66); (ii) the magnitude of the type II error, as measured by the O(b) and O((bT)^{-q}) terms in (67); and (iii) the relative importance of the type I and type II errors.

For most economic time series we can expect that d_q > 0, and then both g_q d_q D'(z²_α) z²_α > 0 and g_q d_q G'_δ(z²_α) z²_α > 0. Hence, the type I error increases as b decreases. On the other hand, the (bT)^{-q} term in (67) indicates that there is a corresponding decrease in the type II error as b decreases. Indeed, the decrease in the type II error will generally exceed the increase in the type I error because G'_δ(z²_α) > D'(z²_α) for δ ∈ (0, 7.5) and z_α = 1.645, 1.96 or 2.58. Fig. 2 graphs the ratio G'_δ(z²)/D'(z²) against δ for different values of z, illustrating the relative magnitude of G'_δ(z²) and D'(z²). The situation is further complicated by the fact that there is an additional O(b) term in the type II error. As we have seen earlier, K_δ(z) > 0, so that the second term of (67) leads to a reduction in the type II error as b decreases. Thus, the type II error generally declines as b decreases for two reasons: one from the nonstandard limit theory and the other from the (typical) downward bias in estimating the long-run variance.

The case of d_q < 0 usually arises where there is negative serial correlation in the errors and so tends to be less typical for economic time series.

[Figure 2: The graph of G'_δ(z²)/D'(z²) as a function of δ for z = 1.645, 1.96 and 2.58.]

When d_q < 0, (66) shows that the type I error increases with b, while the type II error may increase or decrease with b depending on which of the two terms in (67) dominates.

These considerations suggest that a loss function may be constructed by taking a suitable weighted average of the type I and type II errors given in (66) and (67). Setting

e_I = α + g_q d_q D'(z²_α) z²_α (bT)^{-q},
e_II = G_δ(z²_α) + c₂ z⁴_α K_δ(z_α) b - g_q d_q G'_δ(z²_α) z²_α (bT)^{-q}, (68)

we define the loss function to be

L(b; δ, T; z_α) = [w_T(δ)/(1 + w_T(δ))] e_I + [1/(1 + w_T(δ))] e_II, (69)

where w_T(δ) is a function that determines the relative weight on the type I and II errors; this function is allowed to depend on the sample size T and on δ. Obviously, the loss L(b; δ, T; z_α) is here specified for a particular value of δ, and this function could be adjusted in a simple way so that the type II error is averaged over a range of values of δ with respect to some (prior) distribution over alternatives. We focus on the case of a fixed local alternative below, in which case we can suppress the dependence of w_T(δ) on T and δ and write w = w_T(δ). To sum up, the loss function we consider is of the form

L(b; δ, T; z_α) = { g_q d_q [w D'(z²_α) - G'_δ(z²_α)] z²_α (bT)^{-q} + c₂ z⁴_α K_δ(z_α) b } / (1 + w) + C_T, (70)

where C_T = [wα + G_δ(z²_α)]/(1 + w), which does not depend on b. In the rest of this section, we consider the case w D'(z²_α) - G'_δ(z²_α) > 0, which holds if the relative weight w is large enough. It turns out that the optimal choice of b depends on whether d_q > 0 or d_q < 0; we consider these two cases in turn.

When d_q > 0, the loss function L(b; δ, T; z_α) is minimized for the following choice of b:

b_opt = { q g_q d_q [w D'(z²_α) - G'_δ(z²_α)] / (c₂ z²_α K_δ(z_α)) }^{1/(q+1)} T^{-q/(q+1)}. (71)

Therefore, the optimal shrinkage rate for b is of order O(T^{-q/(q+1)}) when w is a fixed constant. If w → ∞ as T → ∞, we then have

b_opt = { q g_q d_q D'(z²_α) / (c₂ z²_α K_δ(z_α)) }^{1/(q+1)} (w/T^q)^{1/(q+1)}. (72)

Fixed-b rules may then be interpreted as assigning relative weight w = O(T^q) in the loss function, so that the emphasis in tests based on such rules is a small type I error, at least when we expect the type I error to be larger than the nominal size of the test. This gives us an interpretation of fixed-b rules in terms of the loss perceived by the econometrician using such a rule. Within the more general framework given by (72), b may be fixed or shrink with T up to an O(T^{-q/(q+1)}) rate, corresponding to the relative importance that is placed in the loss function on the type I and type II errors.

Observe that when b = O((w/T^q)^{1/(q+1)}), the size distortion is O((wT)^{-q/(q+1)}) rather than O(T⁻¹), as it is when b is fixed. Thus, the use of b = b_opt for a finite w involves some compromise by allowing the error order in the rejection probability to be somewhat larger in order to achieve higher power. Such compromise is an inevitable consequence of balancing the two elements in the loss function (70). Note that even in this case, the order of the ERP is smaller than O(T^{-q/(2q+1)}), which is the order of the ERP for the conventional procedure in which standard normal critical values are used and b is set to be O(T^{-2q/(2q+1)}).

For the Parzen and QS kernels the optimal rate of b is T^{-2/3}, whereas for the Bartlett kernel the optimal rate is T^{-1/2}. Therefore, in large samples, a smaller b should be used with the Parzen and QS kernels than with the Bartlett kernel. In the former cases the ERP is at most of order T^{-2/3}, and in the latter case the ERP is at most of order T^{-1/2}. The O(T^{-2/3}) rate of the ERP for the quadratic kernels represents an improvement on the Bartlett kernel. Note that for the optimal b, the rate of the ERP is also the rate at which the loss function L(b; δ, T; z_α) approaches C_T from above. Therefore, the loss L(b; δ, T; z_α) is expected to be smaller for quadratic kernels than for the Bartlett kernel in large samples. Finite sample performance may not necessarily follow this ordering, however, and will depend on the sample size and the shape of the spectral density of {u_t} at the origin.

The formula for b_opt involves the unknown parameter d_q, which could be estimated nonparametrically (e.g. Newey and West (1994)) or by a standard plug-in procedure based on a simple model like an AR(1) (e.g. Andrews (1991)); both methods deliver an estimate of the correct order of magnitude.

Either way, the procedure is obviously analogous to conventional data-driven methods for HAC estimation.

When w D'(z²_α) - G'_δ(z²_α) > 0 and d_q < 0, L(b; δ, T; z_α) is an increasing function of b. To minimize loss in this case, we can choose b to be as small as possible. Since the loss function is constructed under the assumption that b → 0 and bT → ∞, the choice of b is required to be compatible with these two rate conditions. These considerations lead to a choice of b of the form b = J_T/T for some J_T that goes to infinity but at a slowly varying rate relative to T. In the simulation study below, we set J_T = log(T), so that b = (log T)/T.

To sum up, for typical economic time series, the value of b which minimizes the weighted type I and type II errors has a shrinkage rate of b = O(T^{-q/(q+1)}). This rate may be compared with the optimal rate of b = O(T^{-2q/(2q+1)}) that applies when minimizing the mean squared error of estimation of the corresponding HAC estimate ω̂²_b itself (Andrews (1991)). Thus, the AMSE optimal values of b for HAC estimation are smaller as T → ∞ than those which are most suited for statistical testing. In effect, optimal HAC estimation tolerates more bias in order to reduce variance in estimation. In contrast, optimal b selection in HAC testing undersmooths the long-run variance estimate to reduce bias and allows for greater variance in long-run variance estimation through higher order adjustments to the nominal asymptotic critical values or by direct use of the nonstandard limit distribution. This conclusion highlights the importance of bias reduction in statistical testing.

7 Simulation Evidence

This section provides some simulation evidence on the finite sample performance of the t-test based on the plug-in procedure that optimizes the loss function constructed in the previous section. We consider the simple location model with Gaussian ARMA(1,1) errors:

y_t = β₀ + c/√T + u_t, (73)

u_t = ρ u_{t-1} + ε_t + ψ ε_{t-1},  ε_t ~ iid N(0, 1). (74)

Under the null, c = 0; under the local alternative, c is nonzero and is chosen such that, when b = 0.3, the local asymptotic power of the t-test is 60%. In other words, c solves P{|(δ + W(1))/Ξ_b^{1/2}| ≥ 1.645} = 60% for b = 0.3, where δ = c/ω. The qualitative results are similar for other nonzero values of c. We consider three sample sizes, T = 50, 100 and 200. For the ARMA(1,1) process,

d₁ = 2(ρ + ψ)(1 + ρψ)/[(1 - ρ²)(1 + ψ)²],  d₂ = 2(ρ + ψ)(1 + ρψ)/[(1 - ρ)²(1 + ψ)²]. (75)
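A compact version of the simulation design in (73)-(74): generate the Gaussian ARMA(1,1) location model, apply the t-test with a given b and a corrected critical value, and record rejection frequencies. The parameter values, the replication count, the crude ARMA initialization, and the critical-value correction used here (second-order, Bartlett, with k₃ as derived in Section 3) are illustrative choices, not the exact settings of the study reported below.

```python
import numpy as np

def simulate_arma11(T, rho, psi, rng):
    """Gaussian ARMA(1,1) errors (74); crude start-up, adequate for a sketch."""
    eps = rng.standard_normal(T + 1)
    u = np.zeros(T)
    u[0] = eps[1] + psi * eps[0]
    for t in range(1, T):
        u[t] = rho * u[t - 1] + eps[t + 1] + psi * eps[t]
    return u

def bartlett_lrv(u_hat, M):
    T = len(u_hat)
    omega2 = np.sum(u_hat ** 2) / T
    for j in range(1, int(np.floor(M)) + 1):
        w = 1.0 - j / M
        if w <= 0.0:
            break
        omega2 += 2.0 * w * np.sum(u_hat[j:] * u_hat[:T - j]) / T
    return omega2

def rejection_rate(T=100, b=0.3, rho=0.5, psi=0.0, c=0.0, reps=2000, seed=1):
    """Empirical rejection frequency of |t_b| >= z_alpha + k3*b at the 10% level."""
    rng = np.random.default_rng(seed)
    z, c1, c2 = 1.645, 1.0, 2.0 / 3.0                 # Bartlett: c1 = int k, c2 = int k^2
    cv = z + (0.5 * c1 * z + 0.25 * c2 * z + 0.25 * c2 * z ** 3) * b   # second-order cv (38)
    rej = 0
    for _ in range(reps):
        u = simulate_arma11(T, rho, psi, rng)
        y = 0.0 + c / np.sqrt(T) + u                  # location model (73) with beta_0 = 0
        beta_hat = y.mean()
        t_stat = np.sqrt(T) * beta_hat / np.sqrt(bartlett_lrv(y - beta_hat, b * T))
        rej += (abs(t_stat) >= cv)
    return rej / reps

print(rejection_rate(c=0.0))   # empirical type I error under the null
```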

To estimate d₁ and d₂, we employ the AR(1) plug-in procedure. More specifically, let

ρ̂ = [ Σ_{t=2}^{T} (û_t - ū)(û_{t-1} - ū) ] / [ Σ_{t=2}^{T} (û_{t-1} - ū)² ] (76)

be the OLS estimate of the AR parameter; then we estimate d₁ and d₂ by

d̂₁ = 2ρ̂/(1 - ρ̂²),  d̂₂ = 2ρ̂/(1 - ρ̂)². (77)

It is easy to see that

ρ̂ → (ρ + ψ)(1 + ρψ)/(1 + 2ρψ + ψ²) as T → ∞. (78)

Therefore, writing ρ̄ for the probability limit in (78),

d₁/d̂₁ → (1 - ρ̄²)(1 + 2ρψ + ψ²)/[(1 - ρ²)(1 + ψ)²] > 0,  d₂/d̂₂ → (1 - ρ̄)²(1 + 2ρψ + ψ²)/[(1 - ρ)²(1 + ψ)²] > 0. (79)

Since both limits are positive when |ρ| < 1 and |ψ| < 1, d̂_q and d_q have the same sign with probability approaching one as T → ∞. We can thus choose b based on the sign of d̂_q. More specifically,

b̂_opt = { q g_q d̂_q [w D'(z²_α) - G'_δ(z²_α)] / (c₂ z²_α K_δ(z_α)) }^{1/(q+1)} T^{-q/(q+1)}  if d̂_q > 0,
b̂_opt = (log T)/T  if d̂_q ≤ 0. (80)

We consider four values of the relative weight w. We set the significance level to α = 10%, so that the corresponding nominal critical value for the two-sided test is z_α = 1.645. For all the DGPs considered, we let δ = 2 in computing the optimal bandwidth parameter. For each choice of w, we obtain b̂_opt and use it to construct the LRV estimate and the corresponding t_b̂-statistic. We reject the null hypothesis if |t_b̂| is larger than the corrected critical values given in (38) and (39). For the Bartlett kernel, the second-order corrected critical value is used; for the Parzen and QS kernels, the third-order corrected critical value is used. Across the simulation replications, we compute the empirical type I error (when c = 0) and type II error (when c ≠ 0). We construct the empirical loss by taking a weighted average of the type I and type II errors, with weights w/(1 + w) and 1/(1 + w), respectively.

For comparative purposes, we also compute the empirical loss function when the bandwidth is the optimal one that minimizes the asymptotic mean squared error of the LRV estimate. This bandwidth rule is given in Andrews (1991), and the AR(1) plug-in version, expressed as a ratio of the bandwidth to the sample size, is

b̂^Bartlett_MSE = 1.1447 [4ρ̂²/((1 - ρ̂)²(1 + ρ̂)²)]^{1/3} T^{-2/3},
b̂^Parzen_MSE = 2.6614 [4ρ̂²/(1 - ρ̂)⁴]^{1/5} T^{-4/5},
b̂^QS_MSE = 1.3221 [4ρ̂²/(1 - ρ̂)⁴]^{1/5} T^{-4/5}. (81)
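For reference, the AMSE bandwidth rules in (81) are easy to code and to compare with the test-optimal rate T^{-q/(q+1)} discussed in Section 6. The sketch below evaluates (81) for a given ρ̂ and several sample sizes; the ρ̂ value and the sample sizes are illustrative. These AMSE values shrink at the faster rates T^{-2/3} (Bartlett) and T^{-4/5} (Parzen, QS), which is the undersmoothing gap that motivates the testing-oriented rule (80).

```python
def b_mse(rho_hat, T, kernel="parzen"):
    """Andrews (1991) AR(1) plug-in AMSE-optimal bandwidths (81), expressed as b = M/T."""
    a1 = 4.0 * rho_hat ** 2 / ((1.0 - rho_hat) ** 2 * (1.0 + rho_hat) ** 2)
    a2 = 4.0 * rho_hat ** 2 / (1.0 - rho_hat) ** 4
    if kernel == "bartlett":
        return 1.1447 * a1 ** (1.0 / 3.0) * T ** (-2.0 / 3.0)
    if kernel == "parzen":
        return 2.6614 * a2 ** (1.0 / 5.0) * T ** (-4.0 / 5.0)
    return 1.3221 * a2 ** (1.0 / 5.0) * T ** (-4.0 / 5.0)     # QS

for T in (50, 100, 200, 1000):
    print(T, {k: round(b_mse(0.7, T, k), 4) for k in ("bartlett", "parzen", "qs")})
```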


More information

FinQuiz Notes

FinQuiz Notes Reading 9 A time series is any series of data that varies over time e.g. the quarterly sales for a company during the past five years or daily returns of a security. When assumptions of the regression

More information

Accurate Asymptotic Approximation in the Optimal GMM Frame. Stochastic Volatility Models

Accurate Asymptotic Approximation in the Optimal GMM Frame. Stochastic Volatility Models Accurate Asymptotic Approximation in the Optimal GMM Framework with Application to Stochastic Volatility Models Yixiao Sun UC San Diego January 21, 2014 GMM Estimation Optimal Weighting Matrix Two-step

More information

Likelihood Ratio Based Test for the Exogeneity and the Relevance of Instrumental Variables

Likelihood Ratio Based Test for the Exogeneity and the Relevance of Instrumental Variables Likelihood Ratio Based est for the Exogeneity and the Relevance of Instrumental Variables Dukpa Kim y Yoonseok Lee z September [under revision] Abstract his paper develops a test for the exogeneity and

More information

1 The Multiple Regression Model: Freeing Up the Classical Assumptions

1 The Multiple Regression Model: Freeing Up the Classical Assumptions 1 The Multiple Regression Model: Freeing Up the Classical Assumptions Some or all of classical assumptions were crucial for many of the derivations of the previous chapters. Derivation of the OLS estimator

More information

Heteroskedasticity- and Autocorrelation-Robust Inference or Three Decades of HAC and HAR: What Have We Learned?

Heteroskedasticity- and Autocorrelation-Robust Inference or Three Decades of HAC and HAR: What Have We Learned? AEA Continuing Education Course ime Series Econometrics Lecture 4 Heteroskedasticity- and Autocorrelation-Robust Inference or hree Decades of HAC and HAR: What Have We Learned? James H. Stock Harvard University

More information

11. Bootstrap Methods

11. Bootstrap Methods 11. Bootstrap Methods c A. Colin Cameron & Pravin K. Trivedi 2006 These transparencies were prepared in 20043. They can be used as an adjunct to Chapter 11 of our subsequent book Microeconometrics: Methods

More information

Simple Estimators for Semiparametric Multinomial Choice Models

Simple Estimators for Semiparametric Multinomial Choice Models Simple Estimators for Semiparametric Multinomial Choice Models James L. Powell and Paul A. Ruud University of California, Berkeley March 2008 Preliminary and Incomplete Comments Welcome Abstract This paper

More information

ECONOMETRICS II (ECO 2401S) University of Toronto. Department of Economics. Spring 2013 Instructor: Victor Aguirregabiria

ECONOMETRICS II (ECO 2401S) University of Toronto. Department of Economics. Spring 2013 Instructor: Victor Aguirregabiria ECONOMETRICS II (ECO 2401S) University of Toronto. Department of Economics. Spring 2013 Instructor: Victor Aguirregabiria SOLUTION TO FINAL EXAM Friday, April 12, 2013. From 9:00-12:00 (3 hours) INSTRUCTIONS:

More information

Parametric Inference on Strong Dependence

Parametric Inference on Strong Dependence Parametric Inference on Strong Dependence Peter M. Robinson London School of Economics Based on joint work with Javier Hualde: Javier Hualde and Peter M. Robinson: Gaussian Pseudo-Maximum Likelihood Estimation

More information

Testing in GMM Models Without Truncation

Testing in GMM Models Without Truncation Testing in GMM Models Without Truncation TimothyJ.Vogelsang Departments of Economics and Statistical Science, Cornell University First Version August, 000; This Version June, 001 Abstract This paper proposes

More information

On Size and Power of Heteroskedasticity and Autocorrelation Robust Tests

On Size and Power of Heteroskedasticity and Autocorrelation Robust Tests On Size and Power of Heteroskedasticity and Autocorrelation Robust Tests David Preinerstorfer and Benedikt M. Pötscher y Department of Statistics, University of Vienna Preliminary version: April 2012 First

More information

Problem set 1 - Solutions

Problem set 1 - Solutions EMPIRICAL FINANCE AND FINANCIAL ECONOMETRICS - MODULE (8448) Problem set 1 - Solutions Exercise 1 -Solutions 1. The correct answer is (a). In fact, the process generating daily prices is usually assumed

More information

Applications of Subsampling, Hybrid, and Size-Correction Methods

Applications of Subsampling, Hybrid, and Size-Correction Methods Applications of Subsampling, Hybrid, and Size-Correction Methods Donald W. K. Andrews Cowles Foundation for Research in Economics Yale University Patrik Guggenberger Department of Economics UCLA November

More information

Nonparametric Identi cation and Estimation of Truncated Regression Models with Heteroskedasticity

Nonparametric Identi cation and Estimation of Truncated Regression Models with Heteroskedasticity Nonparametric Identi cation and Estimation of Truncated Regression Models with Heteroskedasticity Songnian Chen a, Xun Lu a, Xianbo Zhou b and Yahong Zhou c a Department of Economics, Hong Kong University

More information

Inference on a Structural Break in Trend with Fractionally Integrated Errors

Inference on a Structural Break in Trend with Fractionally Integrated Errors Inference on a Structural Break in rend with Fractionally Integrated Errors Seongyeon Chang Boston University Pierre Perron y Boston University November, Abstract Perron and Zhu (5) established the consistency,

More information

The Impact of a Hausman Pretest on the Size of a Hypothesis Test: the Panel Data Case

The Impact of a Hausman Pretest on the Size of a Hypothesis Test: the Panel Data Case The Impact of a Hausman retest on the Size of a Hypothesis Test: the anel Data Case atrik Guggenberger Department of Economics UCLA September 22, 2008 Abstract: The size properties of a two stage test

More information

Heteroskedasticity and Autocorrelation Consistent Standard Errors

Heteroskedasticity and Autocorrelation Consistent Standard Errors NBER Summer Institute Minicourse What s New in Econometrics: ime Series Lecture 9 July 6, 008 Heteroskedasticity and Autocorrelation Consistent Standard Errors Lecture 9, July, 008 Outline. What are HAC

More information

Choice of Spectral Density Estimator in Ng-Perron Test: Comparative Analysis

Choice of Spectral Density Estimator in Ng-Perron Test: Comparative Analysis MPRA Munich Personal RePEc Archive Choice of Spectral Density Estimator in Ng-Perron Test: Comparative Analysis Muhammad Irfan Malik and Atiq-ur- Rehman International Institute of Islamic Economics, International

More information

Introduction to Nonparametric and Semiparametric Estimation. Good when there are lots of data and very little prior information on functional form.

Introduction to Nonparametric and Semiparametric Estimation. Good when there are lots of data and very little prior information on functional form. 1 Introduction to Nonparametric and Semiparametric Estimation Good when there are lots of data and very little prior information on functional form. Examples: y = f(x) + " (nonparametric) y = z 0 + f(x)

More information

Time Series Models and Inference. James L. Powell Department of Economics University of California, Berkeley

Time Series Models and Inference. James L. Powell Department of Economics University of California, Berkeley Time Series Models and Inference James L. Powell Department of Economics University of California, Berkeley Overview In contrast to the classical linear regression model, in which the components of the

More information

Understanding Regressions with Observations Collected at High Frequency over Long Span

Understanding Regressions with Observations Collected at High Frequency over Long Span Understanding Regressions with Observations Collected at High Frequency over Long Span Yoosoon Chang Department of Economics, Indiana University Joon Y. Park Department of Economics, Indiana University

More information

Economics 620, Lecture 19: Introduction to Nonparametric and Semiparametric Estimation

Economics 620, Lecture 19: Introduction to Nonparametric and Semiparametric Estimation Economics 620, Lecture 19: Introduction to Nonparametric and Semiparametric Estimation Nicholas M. Kiefer Cornell University Professor N. M. Kiefer (Cornell University) Lecture 19: Nonparametric Analysis

More information

Economics 241B Estimation with Instruments

Economics 241B Estimation with Instruments Economics 241B Estimation with Instruments Measurement Error Measurement error is de ned as the error resulting from the measurement of a variable. At some level, every variable is measured with error.

More information

Cointegration and the joint con rmation hypothesis

Cointegration and the joint con rmation hypothesis Cointegration and the joint con rmation hypothesis VASCO J. GABRIEL Department of Economics, Birkbeck College, UK University of Minho, Portugal October 2001 Abstract Recent papers by Charemza and Syczewska

More information

Simple and Powerful GMM Over-identi cation Tests with Accurate Size

Simple and Powerful GMM Over-identi cation Tests with Accurate Size Simple and Powerful GMM Over-identi cation ests wit Accurate Size Yixiao Sun and Min Seong Kim Department of Economics, University of California, San Diego is version: August, 2 Abstract e paper provides

More information

Notes on Time Series Modeling

Notes on Time Series Modeling Notes on Time Series Modeling Garey Ramey University of California, San Diego January 17 1 Stationary processes De nition A stochastic process is any set of random variables y t indexed by t T : fy t g

More information

Finite sample distributions of nonlinear IV panel unit root tests in the presence of cross-section dependence

Finite sample distributions of nonlinear IV panel unit root tests in the presence of cross-section dependence Finite sample distributions of nonlinear IV panel unit root tests in the presence of cross-section dependence Qi Sun Department of Economics University of Leicester November 9, 2009 Abstract his paper

More information

Tests for Cointegration, Cobreaking and Cotrending in a System of Trending Variables

Tests for Cointegration, Cobreaking and Cotrending in a System of Trending Variables Tests for Cointegration, Cobreaking and Cotrending in a System of Trending Variables Josep Lluís Carrion-i-Silvestre University of Barcelona Dukpa Kim y Korea University May 4, 28 Abstract We consider

More information

Testing for Regime Switching: A Comment

Testing for Regime Switching: A Comment Testing for Regime Switching: A Comment Andrew V. Carter Department of Statistics University of California, Santa Barbara Douglas G. Steigerwald Department of Economics University of California Santa Barbara

More information

Weak - Convergence: Theory and Applications

Weak - Convergence: Theory and Applications Weak - Convergence: Theory and Applications Jianning Kong y, Peter C. B. Phillips z, Donggyu Sul x October 26, 2018 Abstract The concept of relative convergence, which requires the ratio of two time series

More information

Econ 423 Lecture Notes: Additional Topics in Time Series 1

Econ 423 Lecture Notes: Additional Topics in Time Series 1 Econ 423 Lecture Notes: Additional Topics in Time Series 1 John C. Chao April 25, 2017 1 These notes are based in large part on Chapter 16 of Stock and Watson (2011). They are for instructional purposes

More information

Comparing Nested Predictive Regression Models with Persistent Predictors

Comparing Nested Predictive Regression Models with Persistent Predictors Comparing Nested Predictive Regression Models with Persistent Predictors Yan Ge y and ae-hwy Lee z November 29, 24 Abstract his paper is an extension of Clark and McCracken (CM 2, 25, 29) and Clark and

More information

A Modified Confidence Set for the S Title Date in Linear Regression Models.

A Modified Confidence Set for the S Title Date in Linear Regression Models. A Modified Confidence Set for the S Title Date in Linear Regression Models Author(s) Yamamoto, Yohei Citation Issue 24-5-7 Date Type Technical Report Text Version publisher URL http://hdl.handle.net/86/26678

More information

HAC Corrections for Strongly Autocorrelated Time Series

HAC Corrections for Strongly Autocorrelated Time Series HAC Corrections for Strongly Autocorrelated Time Series Ulrich K. Müller Princeton University Department of Economics Princeton, NJ, 08544 December 2012 Abstract Applied work routinely relies on heteroskedasticity

More information

SIMILAR-ON-THE-BOUNDARY TESTS FOR MOMENT INEQUALITIES EXIST, BUT HAVE POOR POWER. Donald W. K. Andrews. August 2011

SIMILAR-ON-THE-BOUNDARY TESTS FOR MOMENT INEQUALITIES EXIST, BUT HAVE POOR POWER. Donald W. K. Andrews. August 2011 SIMILAR-ON-THE-BOUNDARY TESTS FOR MOMENT INEQUALITIES EXIST, BUT HAVE POOR POWER By Donald W. K. Andrews August 2011 COWLES FOUNDATION DISCUSSION PAPER NO. 1815 COWLES FOUNDATION FOR RESEARCH IN ECONOMICS

More information

A Bootstrap Test for Conditional Symmetry

A Bootstrap Test for Conditional Symmetry ANNALS OF ECONOMICS AND FINANCE 6, 51 61 005) A Bootstrap Test for Conditional Symmetry Liangjun Su Guanghua School of Management, Peking University E-mail: lsu@gsm.pku.edu.cn and Sainan Jin Guanghua School

More information

Heteroskedasticity-Robust Inference in Finite Samples

Heteroskedasticity-Robust Inference in Finite Samples Heteroskedasticity-Robust Inference in Finite Samples Jerry Hausman and Christopher Palmer Massachusetts Institute of Technology December 011 Abstract Since the advent of heteroskedasticity-robust standard

More information

A New Asymptotic Theory for Heteroskedasticity-Autocorrelation Robust Tests

A New Asymptotic Theory for Heteroskedasticity-Autocorrelation Robust Tests A New Asymptotic Theory for Heteroskedasticity-Autocorrelation Robust Tests Nicholas M. Kiefer and Timothy J. Vogelsang Departments of Economics and Statistical Science, Cornell University April 2002,

More information

Econometrics Midterm Examination Answers

Econometrics Midterm Examination Answers Econometrics Midterm Examination Answers March 4, 204. Question (35 points) Answer the following short questions. (i) De ne what is an unbiased estimator. Show that X is an unbiased estimator for E(X i

More information

GMM estimation of spatial panels

GMM estimation of spatial panels MRA Munich ersonal ReEc Archive GMM estimation of spatial panels Francesco Moscone and Elisa Tosetti Brunel University 7. April 009 Online at http://mpra.ub.uni-muenchen.de/637/ MRA aper No. 637, posted

More information

Supplemental Material 1 for On Optimal Inference in the Linear IV Model

Supplemental Material 1 for On Optimal Inference in the Linear IV Model Supplemental Material 1 for On Optimal Inference in the Linear IV Model Donald W. K. Andrews Cowles Foundation for Research in Economics Yale University Vadim Marmer Vancouver School of Economics University

More information

Economics 620, Lecture 13: Time Series I

Economics 620, Lecture 13: Time Series I Economics 620, Lecture 13: Time Series I Nicholas M. Kiefer Cornell University Professor N. M. Kiefer (Cornell University) Lecture 13: Time Series I 1 / 19 AUTOCORRELATION Consider y = X + u where y is

More information

Cointegration Tests Using Instrumental Variables Estimation and the Demand for Money in England

Cointegration Tests Using Instrumental Variables Estimation and the Demand for Money in England Cointegration Tests Using Instrumental Variables Estimation and the Demand for Money in England Kyung So Im Junsoo Lee Walter Enders June 12, 2005 Abstract In this paper, we propose new cointegration tests

More information

LECTURE 12 UNIT ROOT, WEAK CONVERGENCE, FUNCTIONAL CLT

LECTURE 12 UNIT ROOT, WEAK CONVERGENCE, FUNCTIONAL CLT MARCH 29, 26 LECTURE 2 UNIT ROOT, WEAK CONVERGENCE, FUNCTIONAL CLT (Davidson (2), Chapter 4; Phillips Lectures on Unit Roots, Cointegration and Nonstationarity; White (999), Chapter 7) Unit root processes

More information

1. The Multivariate Classical Linear Regression Model

1. The Multivariate Classical Linear Regression Model Business School, Brunel University MSc. EC550/5509 Modelling Financial Decisions and Markets/Introduction to Quantitative Methods Prof. Menelaos Karanasos (Room SS69, Tel. 08956584) Lecture Notes 5. The

More information

Modeling Expectations with Noncausal Autoregressions

Modeling Expectations with Noncausal Autoregressions Modeling Expectations with Noncausal Autoregressions Markku Lanne * Pentti Saikkonen ** Department of Economics Department of Mathematics and Statistics University of Helsinki University of Helsinki Abstract

More information

UNIVERSITY OF NOTTINGHAM. Discussion Papers in Economics

UNIVERSITY OF NOTTINGHAM. Discussion Papers in Economics UNIVERSITY OF NOTTINGHAM Discussion Papers in Economics Discussion Paper No. 06/11 Simple, Robust and Powerful Tests of the Breaking Trend Hypothesis* by David I. Harvey, Stephen J. Leybourne and A.M.

More information

A Fixed-b Perspective on the Phillips-Perron Unit Root Tests

A Fixed-b Perspective on the Phillips-Perron Unit Root Tests A Fixed-b Perspective on the Phillips-Perron Unit Root Tests Timothy J. Vogelsang Department of Economics Michigan State University Martin Wagner Department of Economics and Finance Institute for Advanced

More information

Models, Testing, and Correction of Heteroskedasticity. James L. Powell Department of Economics University of California, Berkeley

Models, Testing, and Correction of Heteroskedasticity. James L. Powell Department of Economics University of California, Berkeley Models, Testing, and Correction of Heteroskedasticity James L. Powell Department of Economics University of California, Berkeley Aitken s GLS and Weighted LS The Generalized Classical Regression Model

More information

Panel Data. March 2, () Applied Economoetrics: Topic 6 March 2, / 43

Panel Data. March 2, () Applied Economoetrics: Topic 6 March 2, / 43 Panel Data March 2, 212 () Applied Economoetrics: Topic March 2, 212 1 / 43 Overview Many economic applications involve panel data. Panel data has both cross-sectional and time series aspects. Regression

More information

SIMILAR-ON-THE-BOUNDARY TESTS FOR MOMENT INEQUALITIES EXIST, BUT HAVE POOR POWER. Donald W. K. Andrews. August 2011 Revised March 2012

SIMILAR-ON-THE-BOUNDARY TESTS FOR MOMENT INEQUALITIES EXIST, BUT HAVE POOR POWER. Donald W. K. Andrews. August 2011 Revised March 2012 SIMILAR-ON-THE-BOUNDARY TESTS FOR MOMENT INEQUALITIES EXIST, BUT HAVE POOR POWER By Donald W. K. Andrews August 2011 Revised March 2012 COWLES FOUNDATION DISCUSSION PAPER NO. 1815R COWLES FOUNDATION FOR

More information

ECONOMETRICS FIELD EXAM Michigan State University May 9, 2008

ECONOMETRICS FIELD EXAM Michigan State University May 9, 2008 ECONOMETRICS FIELD EXAM Michigan State University May 9, 2008 Instructions: Answer all four (4) questions. Point totals for each question are given in parenthesis; there are 00 points possible. Within

More information

Inference for Parameters Defined by Moment Inequalities: A Recommended Moment Selection Procedure. Donald W.K. Andrews and Panle Jia

Inference for Parameters Defined by Moment Inequalities: A Recommended Moment Selection Procedure. Donald W.K. Andrews and Panle Jia Inference for Parameters Defined by Moment Inequalities: A Recommended Moment Selection Procedure By Donald W.K. Andrews and Panle Jia September 2008 Revised August 2011 COWLES FOUNDATION DISCUSSION PAPER

More information

x i = 1 yi 2 = 55 with N = 30. Use the above sample information to answer all the following questions. Show explicitly all formulas and calculations.

x i = 1 yi 2 = 55 with N = 30. Use the above sample information to answer all the following questions. Show explicitly all formulas and calculations. Exercises for the course of Econometrics Introduction 1. () A researcher is using data for a sample of 30 observations to investigate the relationship between some dependent variable y i and independent

More information

HAR Inference: Recommendations for Practice

HAR Inference: Recommendations for Practice HAR Inference: Recommendations for Practice May 16, 2018 Eben Lazarus Department of Economics, Harvard University Daniel J. Lewis Department of Economics, Harvard University James H. Stock Department of

More information

Economic modelling and forecasting

Economic modelling and forecasting Economic modelling and forecasting 2-6 February 2015 Bank of England he generalised method of moments Ole Rummel Adviser, CCBS at the Bank of England ole.rummel@bankofengland.co.uk Outline Classical estimation

More information

HAR Inference: Recommendations for Practice

HAR Inference: Recommendations for Practice HAR Inference: Recommendations for Practice July 14, 2018 Eben Lazarus Department of Economics, Harvard University Daniel J. Lewis Department of Economics, Harvard University James H. Stock Department

More information

On the Long-Run Variance Ratio Test for a Unit Root

On the Long-Run Variance Ratio Test for a Unit Root On the Long-Run Variance Ratio Test for a Unit Root Ye Cai and Mototsugu Shintani Vanderbilt University May 2004 Abstract This paper investigates the effects of consistent and inconsistent long-run variance

More information

GMM, HAC estimators, & Standard Errors for Business Cycle Statistics

GMM, HAC estimators, & Standard Errors for Business Cycle Statistics GMM, HAC estimators, & Standard Errors for Business Cycle Statistics Wouter J. Den Haan London School of Economics c Wouter J. Den Haan Overview Generic GMM problem Estimation Heteroskedastic and Autocorrelation

More information

Simple Estimators for Monotone Index Models

Simple Estimators for Monotone Index Models Simple Estimators for Monotone Index Models Hyungtaik Ahn Dongguk University, Hidehiko Ichimura University College London, James L. Powell University of California, Berkeley (powell@econ.berkeley.edu)

More information

13 Endogeneity and Nonparametric IV

13 Endogeneity and Nonparametric IV 13 Endogeneity and Nonparametric IV 13.1 Nonparametric Endogeneity A nonparametric IV equation is Y i = g (X i ) + e i (1) E (e i j i ) = 0 In this model, some elements of X i are potentially endogenous,

More information

When is it really justifiable to ignore explanatory variable endogeneity in a regression model?

When is it really justifiable to ignore explanatory variable endogeneity in a regression model? Discussion Paper: 2015/05 When is it really justifiable to ignore explanatory variable endogeneity in a regression model? Jan F. Kiviet www.ase.uva.nl/uva-econometrics Amsterdam School of Economics Roetersstraat

More information

Simple Examples. Let s look at a few simple examples of OI analysis.

Simple Examples. Let s look at a few simple examples of OI analysis. Simple Examples Let s look at a few simple examples of OI analysis. Example 1: Consider a scalar prolem. We have one oservation y which is located at the analysis point. We also have a ackground estimate

More information

Estimation and Inference with Weak Identi cation

Estimation and Inference with Weak Identi cation Estimation and Inference with Weak Identi cation Donald W. K. Andrews Cowles Foundation Yale University Xu Cheng Department of Economics University of Pennsylvania First Draft: August, 2007 Revised: March

More information

(c) i) In ation (INFL) is regressed on the unemployment rate (UNR):

(c) i) In ation (INFL) is regressed on the unemployment rate (UNR): BRUNEL UNIVERSITY Master of Science Degree examination Test Exam Paper 005-006 EC500: Modelling Financial Decisions and Markets EC5030: Introduction to Quantitative methods Model Answers. COMPULSORY (a)

More information

LECTURE 13: TIME SERIES I

LECTURE 13: TIME SERIES I 1 LECTURE 13: TIME SERIES I AUTOCORRELATION: Consider y = X + u where y is T 1, X is T K, is K 1 and u is T 1. We are using T and not N for sample size to emphasize that this is a time series. The natural

More information

Generalized Least Squares

Generalized Least Squares Generalized Least Squares Upuntilnow,wehaveassumedthat Euu = ¾ I Now we generalize to let Euu = where is restricted to be a positive de nite, symmetric matrix ommon Examples Autoregressive Models AR()

More information

Testing Linear Restrictions: cont.

Testing Linear Restrictions: cont. Testing Linear Restrictions: cont. The F-statistic is closely connected with the R of the regression. In fact, if we are testing q linear restriction, can write the F-stastic as F = (R u R r)=q ( R u)=(n

More information

Review of Classical Least Squares. James L. Powell Department of Economics University of California, Berkeley

Review of Classical Least Squares. James L. Powell Department of Economics University of California, Berkeley Review of Classical Least Squares James L. Powell Department of Economics University of California, Berkeley The Classical Linear Model The object of least squares regression methods is to model and estimate

More information

A Conditional-Heteroskedasticity-Robust Con dence Interval for the Autoregressive Parameter

A Conditional-Heteroskedasticity-Robust Con dence Interval for the Autoregressive Parameter A Conditional-Heteroskedasticity-Robust Con dence Interval for the Autoregressive Parameter Donald W. K. Andrews Cowles Foundation for Research in Economics Yale University Patrik Guggenberger Department

More information

Department of Economics, UCSD UC San Diego

Department of Economics, UCSD UC San Diego Department of Economics, UCSD UC San Diego itle: Spurious Regressions with Stationary Series Author: Granger, Clive W.J., University of California, San Diego Hyung, Namwon, University of Seoul Jeon, Yongil,

More information

This chapter reviews properties of regression estimators and test statistics based on

This chapter reviews properties of regression estimators and test statistics based on Chapter 12 COINTEGRATING AND SPURIOUS REGRESSIONS This chapter reviews properties of regression estimators and test statistics based on the estimators when the regressors and regressant are difference

More information

A New Approach to the Asymptotics of Heteroskedasticity-Autocorrelation Robust Testing

A New Approach to the Asymptotics of Heteroskedasticity-Autocorrelation Robust Testing A New Approach to the Asymptotics of Heteroskedasticity-Autocorrelation Robust Testing Nicholas M. Kiefer Timothy J. Vogelsang August, 2 Abstract Asymptotic theory for heteroskedasticity autocorrelation

More information

A Panel Unit Root Test in the Presence of a Multifactor Error Structure

A Panel Unit Root Test in the Presence of a Multifactor Error Structure A Panel Unit Root est in the Presence of a Multifactor Error Structure M. Hashem Pesaran a L. Vanessa Smith b akashi Yamagata c a University of Cambridge and USC b CFAP, University of Cambridge c Department

More information