Local linear multiple regression with variable bandwidth in the presence of heteroscedasticity


Local linear multiple regression with variable bandwidth in the presence of heteroscedasticity

Azhong Ye (1), Rob J Hyndman (2), Zinai Li (3)

23 January 2006

Abstract: We present a local linear estimator with variable bandwidth for multivariate nonparametric regression. We prove its consistency and asymptotic normality in the interior of the sample space, and obtain its rates of convergence. For heteroscedastic models, the local linear estimator with variable bandwidth has better goodness of fit than the local linear estimator with constant bandwidth.

JEL Classification: C12; C15; C52

Keywords: heteroscedasticity; kernel smoothing; local linear regression; variable bandwidth.

(1) Azhong Ye, College of Management, Fuzhou University, Fuzhou, China. ye2004@fzu.edu.cn
(2) Rob J Hyndman, Department of Econometrics and Business Statistics, Monash University, Clayton, Victoria 3800, Australia. Rob.Hyndman@buseco.monash.edu.au
(3) Zinai Li, School of Economics and Management, Tsinghua University, Beijing, China. lizinai@mail.tsinghua.edu.cn

1 Introduction

We are interested in the problem of heteroscedasticity in nonparametric regression, especially when applied to economic data. Heteroscedasticity is very common in economics, and heteroscedasticity in linear regression is covered in almost every econometrics textbook. Applications of nonparametric regression in economics are growing, and Yatchew (1998) argued that it will become an indispensable tool for every economist because it typically assumes little about the shape of the regression function. Consequently, we believe that heteroscedasticity in nonparametric regression is an important problem that has received limited attention to date.

We seek to develop new estimators that have better goodness of fit than the common estimators in nonparametric econometric models. In particular, we are interested in using heteroscedasticity to improve nonparametric regression estimation.

There have been a few papers on related topics. Testing for heteroscedasticity in nonparametric regression has been discussed by Eubank and Thomas (1993) and Dette and Munk (1998). Ruppert and Wand (1994) discussed multivariate locally weighted least squares regression when the variances of the disturbances are not constant. Ruppert, Wand, Holst and Hössjer (1997) presented the local polynomial estimator of the conditional variance function in a heteroscedastic, nonparametric regression model using linear smoothing of squared residuals. Sharp-optimal and adaptive estimators for heteroscedastic nonparametric regression using the classical trigonometric Fourier basis are given by Efromovich and Pinsker (1996).

Our approach is to exploit the heteroscedasticity by using variable bandwidths in local linear regression. Müller and Stadtmüller (1987) discussed variable bandwidth kernel estimators of regression curves. Fan and Gijbels (1992, 1995, 1996) discussed the local linear estimator with variable bandwidth for nonparametric regression models with a single covariate.
In this paper, we extend these papers by presenting a local linear estimator with variable bandwidth for nonparametric multiple regression models. We demonstrate that the local linear estimator has optimal conditional mean squared error when

its variable bandwidth is a function of the density of the explanatory variables and the conditional variances. Numerical simulation shows that the local linear estimator with this variable bandwidth has better goodness of fit than the local linear estimator with constant bandwidth for heteroscedastic models.

2 Local linear regression with a variable bandwidth

Suppose we have a univariate response variable Y and a d-dimensional set of covariates X, and that we observe the random vectors (X_1, Y_1), ..., (X_n, Y_n), which are independent and identically distributed. It is assumed that each variable in X has been scaled so that they have similar measures of spread. Our aim is to estimate the regression function m(x) = E(Y | X = x). We can regard the data as being generated from the model

Y = m(X) + u,  (1)

where E(u | X) = 0, Var(u | X = x) = σ²(x), and the marginal density of X is denoted by f(x). We assume that the second-order derivatives of m(x) are continuous, that f(x) is bounded away from 0, and that σ²(x) is continuous and bounded.

Let K be a d-variate kernel which is symmetric, nonnegative and compactly supported, with ∫ K(u) du = 1 and ∫ uu^T K(u) du = μ₂(K) I, where μ₂(K) ≠ 0 and I is the d × d identity matrix. In addition, all odd-order moments of K vanish; that is, ∫ u_1^{l_1} ⋯ u_d^{l_d} K(u) du = 0 for all nonnegative integers l_1, ..., l_d whose sum is odd. Let K_h(u) = h^{-d} K(u/h). Then the local linear estimator of m(x) with variable bandwidth is

m̂_n(x, h_n, α) = e_1^T (X_x^T W_{x,α} X_x)^{-1} X_x^T W_{x,α} Y,  (2)

where h_n = c n^{-1/(d+4)}, c is a constant that depends only on K, α(x) is the variable bandwidth function, e_1^T = (1, 0, ..., 0), X_x = (X_{x,1}, ..., X_{x,n})^T, X_{x,i} = (1, (X_i − x)^T)^T, and

W_{x,α} = diag( K_{h_n α(X_1)}(X_1 − x), ..., K_{h_n α(X_n)}(X_n − x) ).

We assume α(x) is continuously differentiable. We now state our main result.

Theorem 1. Let x be a fixed point in the interior of {x : f(x) > 0}, let H_m(x) = ( ∂²m(x)/∂x_i∂x_j )_{d×d}, and let s(H_m(x)) be the sum of the elements of H_m(x). Then

1. E[ m̂_n(x, h_n, α) − m(x) | X_1, ..., X_n ] = 0.5 h_n² μ₂(K) α²(x) s(H_m(x)) + o_p(h_n²);

2. Var[ m̂_n(x, h_n, α) | X_1, ..., X_n ] = n^{-1} h_n^{-d} R(K) α^{-d}(x) σ²(x) f^{-1}(x) + o_p(n^{-1} h_n^{-d}), where R(K) = ∫ K²(u) du; and

3. n^{2/(d+4)} [ m̂_n(x, h_n, α) − m(x) ] →_d N( 0.5 c² μ₂(K) α²(x) s(H_m(x)), c^{-d} R(K) α^{-d}(x) σ²(x) f^{-1}(x) ).

When α(x) ≡ 1, results 1 and 2 coincide with Theorem 2.1 of Ruppert and Wand (1994). By result 2 and the law of large numbers, we find that m̂_n(x, h_n, α) is consistent. From result 3 we know that the rate of convergence of m̂_n(x, h_n, α) at interior points is O(n^{-2/(d+4)}), which, according to Stone (1980, 1982), is the optimal rate of convergence for estimating a nonparametric regression function.

3 Using heteroscedasticity to improve local linear regression

Although Fan and Gijbels (1992) and Ruppert and Wand (1994) discuss the local linear estimator of m(x), nobody has previously developed an improved estimator using the information of heteroscedasticity. We now show how this can be achieved.
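At each point x, the estimator (2) is just a weighted least squares fit with per-observation weights K_{h_n α(X_i)}(X_i − x). A minimal sketch (our own illustration, not the authors' code) is below; a Gaussian product kernel stands in for the compactly supported K of the theory, and all function names are assumptions.

```python
import numpy as np

def local_linear(x0, X, Y, h, alpha=None):
    """Local linear estimate of m(x0), as in equation (2), with
    per-observation bandwidths h * alpha(X_i).  alpha=None gives the
    constant-bandwidth estimator.  A Gaussian product kernel is used
    here instead of the paper's compactly supported K."""
    Y = np.asarray(Y, float)
    X = np.asarray(X, float).reshape(len(Y), -1)
    x0 = np.asarray(x0, float).reshape(1, -1)
    n, d = X.shape
    a = np.ones(n) if alpha is None else np.apply_along_axis(alpha, 1, X)
    U = (X - x0) / (h * a[:, None])
    # K_{h alpha(X_i)}(X_i - x0); the (h*a)^{-d} normalisation matters
    # because it varies across observations.
    w = np.exp(-0.5 * (U ** 2).sum(axis=1)) / (h * a) ** d
    Xd = np.hstack([np.ones((n, 1)), X - x0])    # design matrix X_x
    WX = w[:, None] * Xd
    beta = np.linalg.solve(Xd.T @ WX, WX.T @ Y)  # weighted least squares
    return beta[0]                               # e_1' beta = m-hat(x0)
```

Because a local linear fit reproduces linear functions exactly for any positive weights, the estimator returns the true value when Y is exactly linear in X, which gives a simple sanity check of an implementation.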

Using Theorem 1, we can give an expression for the conditional mean squared error of the local linear estimator with variable bandwidth:

MSE(x) = E{ [ m̂_n(x, h_n, α) − m(x) ]² | X_1, ..., X_n }
       = (1/4) [h_n α(x)]⁴ μ₂²(K) s²(H_m(x)) + R(K) σ²(x) / ( n [h_n α(x)]^d f(x) ) + o_p(h_n⁴) + o_p(n^{-1} h_n^{-d}).  (3)

Minimizing MSE(x), ignoring the higher order terms, we obtain the optimal variable bandwidth

α_opt(x) = c^{-1} { d R(K) σ²(x) / [ μ₂²(K) f(x) s²(H_m(x)) ] }^{1/(d+4)}.

Note that the constant c cancels out in the bandwidth expression h_n α_opt(x). Therefore, without loss of generality, we can set c = { d R(K) / μ₂²(K) }^{1/(d+4)}, which simplifies the above expression to

α_opt(x) = { σ²(x) / [ f(x) s²(H_m(x)) ] }^{1/(d+4)}.  (4)

To apply this new estimator, we need to replace f(x), H_m(x) and σ²(x) with estimators. We can use the kernel estimator of f(x) and the local polynomial estimator of H_m(x). To estimate σ²(x) = E(u² | X = x), we first obtain the estimator of u, û_i = Y_i − Ŷ_i, where Ŷ_i = m̂_n(X_i, h_n, 1). Then we estimate σ²(x) using local polynomial regression with the model û_i² = σ²(X_i) + v_i, where the v_i are iid with zero mean.

In our examples, we measure relative goodness of fit by the mean of standard squared errors:

MSSE = { n^{-1} Σ_{i=1}^n [ m̂_n(X_i, h_n, α) − m(X_i) ]² }^{1/2}.  (5)
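Equations (4) and (5) translate directly into code. The sketch below is our own illustration; the plug-in estimates σ̂²(x), f̂(x) and s(Ĥ_m(x)) are assumed to have been computed elsewhere and are passed in as numbers.

```python
import numpy as np

def alpha_opt(sigma2, f, s_H, d):
    """Optimal variable-bandwidth factor, equation (4):
    alpha(x) = { sigma^2(x) / [ f(x) * s(H_m(x))^2 ] }^{1/(d+4)}.
    Arguments are plug-in estimates evaluated at the point x."""
    return (sigma2 / (f * s_H ** 2)) ** (1.0 / (d + 4))

def msse(m_hat, m_true):
    """Mean of standard squared errors, equation (5): the root of the
    average squared estimation error over the design points."""
    m_hat = np.asarray(m_hat, float)
    m_true = np.asarray(m_true, float)
    return float(np.sqrt(np.mean((m_hat - m_true) ** 2)))
```

For d = 1 and σ²(x) = f(x) s²(H_m(x)), equation (4) gives α_opt(x) = 1, so the variable bandwidth reduces to the constant bandwidth exactly when the heteroscedasticity is balanced by the design density and curvature.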

4 Numerical studies with univariate regression

This section examines the performance of the proposed variable bandwidth method on several univariate regression data sets generated from known functions, and compares it with the performance of the constant bandwidth method. As the true regression function is known in each case, the performance of the variable and constant bandwidth methods is measured by MSSE. We consider the five simulated models given in the following subsection. In each of these models, Y = m(X) + σ(X)u with u ~ N(0, 1), and the covariate X has a Uniform(−2, 2) distribution.

4.1 True regression models

Model A: m_A(x) = x² + x, σ²_A(x) =

Model B: m_B(x) = (1 + x) sin(1.5x), σ²_B(x) =

Model C: m_C(x) = x + 2 exp(−2x²), σ²_C(x) = 16( ) I(x² > 0.01)

Model D: m_D(x) = sin(2x) + 2 exp(−2x²),

σ²_D(x) = 16( ) I(x² > 0.01)

Model E: m_E(x) = exp(−(x + 1)²) + 2 exp(−2x²), σ²_E(x) = 32( ) I(x² > 0.01)

4.2 Bandwidth selectors

For the constant bandwidth method, we use direct plug-in methodology to select the bandwidth of a local linear Gaussian kernel regression estimate, as described by Ruppert, Sheather and Wand (1995).

For the variable bandwidth method, we use direct plug-in methodology to choose the variable bandwidth. First, we use direct plug-in methodology to select the bandwidth of a kernel density estimate for f(x). Second, we estimate σ²(x) = E(u² | X = x) by using local linear or kernel regression with the model û_i² = σ²(X_i) + v_i, where û_i = Y_i − m̂_n(X_i, ĥ_n, 1), the v_i are iid with zero mean, and ĥ_n is chosen by the direct plug-in methodology. Third, we use three methods to estimate m''(x). For the first method, we assume that m(x) = α_1 + α_2 x + α_3 x² + α_4 x³ + α_5 x⁴, so that m''(x) = 2α_3 + 6α_4 x + 12α_5 x². For the second method, we use a revised rule of thumb for bandwidth selection (Fan and Gijbels, 1996), replacing the rule-of-thumb bandwidth selector (formula (4.2) in their book) with the ideal choice of bandwidth (formula (3.20) in their book). For the third method, we average the first and second estimators of m''(x) from the previous methods. Finally, we

have the direct plug-in bandwidth

ĥ(x) = { σ̂²(x) / ( 2n√π f̂(x) [m̂''(x)]² ) }^{1/5}.

4.3 Simulated results

We draw 1000 random samples of size 200 from each model for each method. Table 1 presents a summary of the results of the variable and constant bandwidth methods. It shows that the variable bandwidth method has a smaller mean of standard squared errors than the constant bandwidth method. Among the 1000 samples, the percentage of cases in which the variable bandwidth method is better than the constant bandwidth method is always more than half.

We plot the true regression functions (the solid line) and four typical estimated curves in Figures 1 and 2. These correspond to the 10th, 30th, 70th and 90th percentiles. The dashed line is for the constant bandwidth method and the dotted line is for the variable bandwidth method. We find that, for the same percentile, the dotted line (the variable bandwidth method) is almost always closer to the solid line (the true regression function) than the dashed line (the constant bandwidth method). Therefore, we conclude that for heteroscedastic models, the local linear estimator with variable bandwidth has better goodness of fit than the local linear estimator with constant bandwidth.

5 Numerical studies with bivariate regression

This section examines the performance of the proposed variable bandwidth method on several bivariate regression data sets generated from known functions, and compares it with the performance of the constant bandwidth method. We consider the four simulated models given in the following subsection. In each of these models, Y = m(X_1, X_2) + σ(X_1)u with u ~ N(0, 1), and the covariates X_1 and X_2 are independent and have Uniform(−2, 2) distributions. We draw 200 random samples of size 400 from each model.
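For concreteness, one sample from a design in the style of Model A of Section 4.1 can be drawn as follows. The mean function x² + x is from the text; the variance function below is a stand-in of our own, because the printed variance coefficients did not survive in this copy.

```python
import numpy as np

def simulate_model_a(n, rng):
    """Draw one univariate sample of size n in the style of Model A:
    Y = m(X) + sigma(X) * u, with X ~ Uniform(-2, 2) and u ~ N(0, 1).
    The variance function is an assumed illustrative choice, not the
    paper's (whose coefficients were lost in extraction)."""
    x = rng.uniform(-2.0, 2.0, n)
    m = x ** 2 + x                       # mean function from the text
    sigma2 = 0.25 + 0.5 * x ** 2         # assumed heteroscedastic variance
    y = m + np.sqrt(sigma2) * rng.standard_normal(n)
    return x, y, m

# usage: x, y, m = simulate_model_a(200, np.random.default_rng(1))
```

Returning the true mean values m alongside (x, y) makes it easy to compute the MSSE of equation (5) for any fitted curve on the same sample.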

5.1 True regression models

Model A: m_A(x_1, x_2) = x_1 x_2, σ²_A(x_1, x_2) = ( ) I(x_1² > 0.04)

Model B: m_B(x_1, x_2) = x_1 exp(−2x_2²), σ²_B(x_1, x_2) = 2.5( ) I(x_1² > 0.04)

Model C: m_C(x_1, x_2) = sin(1.5x_2), σ²_C(x_1, x_2) = ( ) I(x_1² > 0.04)

Model D: m_D(x_1, x_2) = sin( ) + 2 exp(−2x_2²), σ²_D(x_1, x_2) = 3( ) I(x_1² > 0.04)

5.2 Bandwidth selectors

To estimate the second derivatives of m(x_1, x_2), we assume that

m(x_1, x_2) = α_1 + α_2 x_1 + α_3 x_2 + α_4 x_1² + α_5 x_1 x_2 + α_6 x_2² + α_7 x_1³ + α_8 x_1² x_2 + α_9 x_1 x_2²

+ α_10 x_2³ + α_11 x_1⁴ + α_12 x_1³ x_2 + α_13 x_1² x_2² + α_14 x_1 x_2³ + α_15 x_2⁴,

and then we can obtain the estimators of the α_j. Hence, we easily get the estimators of the second derivatives of m(x_1, x_2) using

∂²m/∂x_1² = 2α_4 + 6α_7 x_1 + 2α_8 x_2 + 12α_11 x_1² + 6α_12 x_1 x_2 + 2α_13 x_2²,
∂²m/∂x_1∂x_2 = α_5 + 2α_8 x_1 + 2α_9 x_2 + 3α_12 x_1² + 4α_13 x_1 x_2 + 3α_14 x_2²,
∂²m/∂x_2² = 2α_6 + 2α_9 x_1 + 6α_10 x_2 + 2α_13 x_1² + 6α_14 x_1 x_2 + 12α_15 x_2².

For the constant bandwidth method, we use direct plug-in methodology to select the bandwidth of a local linear Gaussian kernel regression estimate. To estimate σ², we first obtain the estimator of u, û_i = Y_i − Ŷ_i, where Ŷ_i = m̂_n(X_i, ĥ_n, 1), ĥ_n = min(ĥ_1, ĥ_2), and ĥ_1, ĥ_2 are chosen respectively by the direct plug-in methodology for Y = m_1(X_1) + u_1 and Y = m_2(X_2) + u_2. Then σ̂² = n^{-1} Σ_{i=1}^n û_i². Finally, we have the direct plug-in bandwidth

ĥ = { σ̂² / ( 2πn · n^{-1} Σ_{i=1}^n s²(Ĥ_m(X_i)) ) }^{1/6}.

For the variable bandwidth method, we also use direct plug-in methodology to choose the variable bandwidth. First, we use direct plug-in methodology to select the bandwidth of a kernel density estimate for X_1 and X_2 respectively, as described by Wand and Jones (1995). Second, we estimate σ²(x_1) using local linear regression with the model û_i² = σ²(X_{1i}) + v_i, where the v_i are iid with zero mean. Finally, we have the direct plug-in bandwidth

ĥ(x) = { σ̂²(x_1) / ( 2πn f̂(x) s²(Ĥ_m(x)) ) }^{1/6}.
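The quartic least squares step just described can be sketched as follows. This is our own illustration, not the authors' code; the ordering of the 15 monomials differs from the α_j indexing in the text, but only the set of monomials matters.

```python
import numpy as np

def hessian_via_quartic(X1, X2, Y):
    """Fit a full quartic polynomial in (x1, x2) by least squares and
    return a function giving the three second derivatives, as in the
    plug-in step of Section 5.2."""
    X1, X2, Y = (np.asarray(v, float).ravel() for v in (X1, X2, Y))
    # all monomials x1^i * x2^j with i + j <= 4 (15 terms)
    powers = [(i, s - i) for s in range(5) for i in range(s + 1)]
    B = np.column_stack([X1 ** i * X2 ** j for i, j in powers])
    coef, *_ = np.linalg.lstsq(B, Y, rcond=None)

    def second_derivs(x1, x2):
        # differentiate each monomial a * x1^i * x2^j twice
        d11 = d12 = d22 = 0.0
        for a, (i, j) in zip(coef, powers):
            if i >= 2:
                d11 += a * i * (i - 1) * x1 ** (i - 2) * x2 ** j
            if i >= 1 and j >= 1:
                d12 += a * i * j * x1 ** (i - 1) * x2 ** (j - 1)
            if j >= 2:
                d22 += a * j * (j - 1) * x1 ** i * x2 ** (j - 2)
        return d11, d12, d22

    return second_derivs
```

The quantity s(Ĥ_m(x)) needed for the bandwidth formulas is then the sum of the elements of the 2 × 2 matrix assembled from these three derivatives.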

5.3 Simulated results

We draw 200 random samples of size 200 from each model. Table 2 presents a summary of the results of the variable and constant bandwidth methods. It shows that the variable bandwidth method has a smaller mean of standard squared errors than the constant bandwidth method. Among the 200 samples, the percentage of cases in which the variable bandwidth method is better than the constant bandwidth method is always more than half.

We plot the true regression functions with one fixed variable (the solid line) and their four typical estimated curves in Figures 3-6. These correspond to the 10th, 30th, 70th and 90th percentiles. The dashed line is for the constant bandwidth method and the dotted line is for the variable bandwidth method. We find that, for the same percentile, the dotted line (the variable bandwidth method) is almost always closer to the solid line (the true regression function) than the dashed line (the constant bandwidth method). Therefore, we conclude that for heteroscedastic models, the local linear estimator with variable bandwidth has better goodness of fit than the local linear estimator with constant bandwidth.

6 Summary

We have presented a local linear nonparametric estimator with variable bandwidth for multivariate regression models. We have shown that the estimator is consistent and asymptotically normal in the interior of the sample space. We have also shown that its convergence rate is optimal for nonparametric regression (Stone, 1980, 1982). By minimizing the conditional mean squared error of the estimator, we have derived the optimal variable bandwidth as a function of the density of the explanatory variables and the conditional variance. We have also provided a plug-in algorithm for computing the estimator. Numerical simulation shows that our local linear estimator with variable bandwidth has better goodness of fit than the local linear estimator with constant bandwidth for heteroscedastic models.

Acknowledgements

We thank X. Zhang for his helpful comments.

References

Dette, H., Munk, A., 1998. Testing heteroscedasticity in nonparametric regression. Journal of the Royal Statistical Society, Series B 60(4).

Efromovich, S., Pinsker, M., 1996. Sharp-optimal and adaptive estimation for heteroscedastic nonparametric regression. Statistica Sinica 6.

Eubank, R.L., Thomas, W., 1993. Detecting heteroscedasticity in nonparametric regression. Journal of the Royal Statistical Society, Series B 55(1).

Fan, J., Gijbels, I., 1992. Variable bandwidth and local linear regression smoothers. The Annals of Statistics 20(4).

Fan, J., Gijbels, I., 1995. Data-driven bandwidth selection in local polynomial fitting: variable bandwidth and spatial adaptation. Journal of the Royal Statistical Society, Series B 57(2).

Fan, J., Gijbels, I., 1996. Local polynomial modelling and its applications. Chapman and Hall.

Härdle, W., 1990. Applied nonparametric regression. Cambridge University Press.

Müller, H.G., Stadtmüller, U., 1987. Variable bandwidth kernel estimators of regression curves. The Annals of Statistics 15(1).

Opsomer, J., Yang, Y., Wang, Y., 2001. Nonparametric regression with correlated errors. Statistical Science 16(2).

Pagan, A., Ullah, A., 1999. Nonparametric econometrics. Cambridge University Press.

Ruppert, D., Wand, M.P., 1994. Multivariate locally weighted least squares regression. The Annals of Statistics 22(3).

Ruppert, D., Wand, M.P., Holst, U., Hössjer, O., 1997. Local polynomial variance-function

estimation. Technometrics 39.

Stone, C.J., 1980. Optimal convergence rates for nonparametric estimators. The Annals of Statistics 8.

Stone, C.J., 1982. Optimal global rates of convergence for nonparametric regression. The Annals of Statistics 10.

White, H., 1984. Asymptotic theory for econometricians. Academic Press.

Xia, Y., Tong, H., Li, W.K., Zhu, L.X., 2002. An adaptive estimation of dimension reduction space. Journal of the Royal Statistical Society, Series B 64(3).

Yatchew, A., 1998. Nonparametric regression techniques in economics. Journal of Economic Literature 34, June.

Appendix A: Proof of Theorem 1

Note that

e_1^T (X_x^T W_{x,α} X_x)^{-1} X_x^T W_{x,α} X_x ( m(x), D_m(x)^T )^T = m(x),  (6)

where D_m(x) = [ ∂m(x)/∂x_1, ..., ∂m(x)/∂x_d ]^T. Therefore,

m̂_n(x, h_n, α) − m(x) = e_1^T (X_x^T W_{x,α} X_x)^{-1} X_x^T W_{x,α} [ 0.5 Q_m(x) + U ],  (7)

where Q_m(x) = [Q_{m,1}(x), ..., Q_{m,n}(x)]^T, Q_{m,i}(x) = (X_i − x)^T H_m(z_i(x, X_i)) (X_i − x), H_m(x) = ( ∂²m(x)/∂x_i∂x_j )_{d×d}, z_i(x, X_i) lies between x and X_i, and U = (u_1, ..., u_n)^T. We can deduce that {z_i(x, X_i)}_{i=1}^n are independent because {X_i}_{i=1}^n are independent. It is obvious that

n^{-1} X_x^T W_{x,α} X_x =
[ n^{-1} Σ_i K_{h_n α(X_i)}(X_i − x)              n^{-1} Σ_i K_{h_n α(X_i)}(X_i − x)(X_i − x)^T ]
[ n^{-1} Σ_i K_{h_n α(X_i)}(X_i − x)(X_i − x)     n^{-1} Σ_i K_{h_n α(X_i)}(X_i − x)(X_i − x)(X_i − x)^T ].  (8)

We begin with a lemma, using the notation of (2) and Theorem 1.

Lemma 1.

n^{-1} Σ_{i=1}^n K_{h_n α(X_i)}(X_i − x) = f(x) + o_p(1)  (9)

n^{-1} Σ_{i=1}^n K_{h_n α(X_i)}(X_i − x)(X_i − x) = h_n² α³(x) G(α, f, x) + o_p(h_n² 1)  (10)

n^{-1} Σ_{i=1}^n K_{h_n α(X_i)}(X_i − x)(X_i − x)(X_i − x)^T = μ₂(K) h_n² α²(x) f(x) I + o_p(h_n² 1)  (11)

where

G(α, f, x) = f(x) ∫_{supp(K)} u D_α(x)^T u u^T D_K(u) du + μ₂(K) [ d f(x) D_α(x) + α^{-1}(x) D_f(x) ]

and 1 is a generic matrix having each entry equal to 1, the dimensions of which will be clear from the context. Further,

(n^{-1} X_x^T W_{x,α} X_x)^{-1} =
[ f^{-1}(x) + o_p(1)       B^T(x, α) + o_p(1) ]
[ B(x, α) + o_p(1)         μ₂(K)^{-1} h_n^{-2} α^{-2}(x) f^{-1}(x) I + o_p(h_n^{-2} 1) ]  (12)

n^{-1} X_x^T W_{x,α} Q_m(x) = h_n² ( f(x) μ₂(K) α²(x) s(H_m(x)), 0^T )^T + o_p(h_n² 1)  (13)

(n h_n^d)^{1/2} ( n^{-1} X_x^T W_{x,α} U ) →_d ( N(0, R(K) α^{-d}(x) σ²(x) f(x)), 0^T )^T,  (14)

where B(x, α) = μ₂(K)^{-1} f(x)^{-2} α(x) G(α, f, x)^T.

We only prove results (10), (13) and (14), as the other results can be proved similarly.

Proof of (10). It is easy to show that

n^{-1} Σ_{i=1}^n K_{h_n α(X_i)}(X_i − x)(X_i − x) = E[ K_{h_n α(X_1)}(X_1 − x)(X_1 − x) ] + O_p( (n^{-1} Ψ)^{1/2} ),  (15)

where Ψ = ( Var(K_{h_n α(X_1)}(X_1 − x)(X_{11} − x_1)), ..., Var(K_{h_n α(X_1)}(X_1 − x)(X_{d1} − x_d)) )^T.

Because x is a fixed point in the interior of supp(f) = {x : f(x) > 0}, we have supp(K) ⊂ {z : x + h_n α(x) z ∈ supp(f)}, provided that the bandwidth h_n is small enough.

Due to the continuity of f, K and α, we have

E[ K_{h_n α(X_1)}(X_1 − x)(X_1 − x) ]
= ∫_{supp(f)} h_n^{-d} α(X_1)^{-d} K( h_n^{-1} α(X_1)^{-1} (X_1 − x) ) (X_1 − x) f(X_1) dX_1
= ∫_{Ω_n} ( α(x + h_n Q) )^{-d} K( Q ( α(x + h_n Q) )^{-1} ) f(x + h_n Q) h_n Q dQ
= h_n² α³(x) G(α, f, x) + o(h_n² 1),  (16)

where Ω_n = {Q : x + h_n Q ∈ supp(f)}. It is easy to see that

Var[ K_{h_n α(X_1)}(X_1 − x)(X_1 − x) ]
= E{ [K_{h_n α(X_1)}(X_1 − x)(X_1 − x)] [K_{h_n α(X_1)}(X_1 − x)(X_1 − x)]^T }
− E[ K_{h_n α(X_1)}(X_1 − x)(X_1 − x) ] E[ K_{h_n α(X_1)}(X_1 − x)(X_1 − x) ]^T.  (17)

Again by the continuity of f, K and α, we have

E{ [K_{h_n α(X_1)}(X_1 − x)(X_1 − x)] [K_{h_n α(X_1)}(X_1 − x)(X_1 − x)]^T }
= E[ K_{h_n α(X_1)}(X_1 − x)² (X_1 − x)(X_1 − x)^T ]
= ∫_{supp(f)} [ h_n^{-d} α(X_1)^{-d} K( h_n^{-1} α(X_1)^{-1} (X_1 − x) ) ]² (X_1 − x)(X_1 − x)^T f(X_1) dX_1
= h_n^{2−d} ∫_{Ω_n} [ ( α(x + h_n Q) )^{-d} K( Q ( α(x + h_n Q) )^{-1} ) ]² f(x + h_n Q) QQ^T dQ
= h_n^{2−d} ∫_{Ω_n} [ ( α(x) )^{-d} K( Q ( α(x) )^{-1} ) ]² f(x) QQ^T dQ + O(h_n^{2−d} 1)
= O(h_n^{2−d} 1).  (18)

Therefore we have

O_p( (n^{-1} Ψ)^{1/2} ) = o_p(h_n² 1).  (19)

Then (10) follows from (15)-(19).

Proof of (13). It is straightforward to show that

n^{-1} X_x^T W_{x,α} Q_m(x) =
[ n^{-1} Σ_i K_{h_n α(X_i)}(X_i − x) (X_i − x)^T H_m(z_i(x, X_i)) (X_i − x) ]
[ n^{-1} Σ_i K_{h_n α(X_i)}(X_i − x) (X_i − x)^T H_m(z_i(x, X_i)) (X_i − x) (X_i − x) ].  (20)

It is easy to prove that

n^{-1} Σ_{i=1}^n K_{h_n α(X_i)}(X_i − x) (X_i − x)^T H_m(z_i(x, X_i)) (X_i − x)
= h_n² f(x) μ₂(K) α²(x) s(H_m(x)) + o_p(h_n²),  (21)

n^{-1} Σ_{i=1}^n K_{h_n α(X_i)}(X_i − x) (X_i − x)^T H_m(z_i(x, X_i)) (X_i − x) (X_i − x) = O_p(h_n³ 1).  (22)

Therefore (13) holds.

Proof of (14). It is obvious that

E[ n^{-1} X_x^T W_{x,α} U ] = E{ E[ n^{-1} X_x^T W_{x,α} U | X_1, ..., X_n ] } = 0,

n^{-1} X_x^T W_{x,α} U =
[ n^{-1} Σ_i K_{h_n α(X_i)}(X_i − x) u_i ]
[ n^{-1} Σ_i K_{h_n α(X_i)}(X_i − x)(X_i − x) u_i ].  (23)

By the continuity of f, K, σ² and α, we have

Var[ n^{-1} Σ_{i=1}^n K_{h_n α(X_i)}(X_i − x) u_i ] = n^{-1} Var[ K_{h_n α(X_1)}(X_1 − x) u_1 ]
= n^{-1} ∫_{supp(f)} [ h_n^{-d} α(X_1)^{-d} K( h_n^{-1} α(X_1)^{-1} (X_1 − x) ) ]² σ²(X_1) f(X_1) dX_1
= n^{-1} h_n^{-d} ∫_{Ω_n} [ ( α(x + h_n Q) )^{-d} K( Q ( α(x + h_n Q) )^{-1} ) ]² σ²(x + h_n Q) f(x + h_n Q) dQ
= n^{-1} h_n^{-d} ∫_{Ω_n} [ ( α(x) )^{-d} K( Q ( α(x) )^{-1} ) ]² σ²(x) f(x) dQ + o(n^{-1} h_n^{-d})

= n^{-1} h_n^{-d} R(K) α^{-d}(x) σ²(x) f(x) + o(n^{-1} h_n^{-d}),  (24)

Var[ n^{-1} Σ_{i=1}^n K_{h_n α(X_i)}(X_i − x)(X_i − x) u_i ] = o(n^{-1} h_n^{-d+2} 1).  (25)

Then (14) holds.

Proof of Theorem 1

Proof of Theorem 1(1). By (12) and (13), we have

E[ m̂_n(x, h_n, α) − m(x) | X_1, ..., X_n ] = 0.5 e_1^T (X_x^T W_{x,α} X_x)^{-1} X_x^T W_{x,α} Q_m(x)
= 0.5 h_n² μ₂(K) α²(x) s(H_m(x)) + o_p(h_n²).  (26)

Therefore Theorem 1(1) holds.

Proof of Theorem 1(2). Denote V = diag{ σ²(X_1), ..., σ²(X_n) }. It is obvious that

Var[ m̂_n(x, h_n, α) | X_1, ..., X_n ]
= e_1^T (X_x^T W_{x,α} X_x)^{-1} X_x^T W_{x,α} V W_{x,α} X_x (X_x^T W_{x,α} X_x)^{-1} e_1,  (27)

and

n^{-1} X_x^T W_{x,α} V W_{x,α} X_x =
[ a_11(x, h_n, α)      a_21(x, h_n, α)^T ]
[ a_21(x, h_n, α)      a_22(x, h_n, α) ],  (28)

where

a_11(x, h_n, α) = n^{-1} Σ_{i=1}^n ( K_{h_n α(X_i)}(X_i − x) )² σ²(X_i),
a_21(x, h_n, α) = n^{-1} Σ_{i=1}^n ( K_{h_n α(X_i)}(X_i − x) )² (X_i − x) σ²(X_i),
a_22(x, h_n, α) = n^{-1} Σ_{i=1}^n ( K_{h_n α(X_i)}(X_i − x) )² (X_i − x)(X_i − x)^T σ²(X_i).

It is easy to prove that

n^{-1} Σ_{i=1}^n ( K_{h_n α(X_i)}(X_i − x) )² σ²(X_i) = h_n^{-d} R(K) α^{-d}(x) σ²(x) f(x) + o_p(h_n^{-d}),
n^{-1} Σ_{i=1}^n ( K_{h_n α(X_i)}(X_i − x) )² (X_i − x)^T σ²(X_i) = O_p(h_n^{-d+1} 1),
n^{-1} Σ_{i=1}^n ( K_{h_n α(X_i)}(X_i − x) )² (X_i − x)(X_i − x)^T σ²(X_i) = O_p(h_n^{-d+2} 1).  (29)

By (27)-(29) and (12), we have

Var[ m̂_n(x, h_n, α) | X_1, ..., X_n ] = n^{-1} h_n^{-d} R(K) α^{-d}(x) σ²(x) f^{-1}(x) + o_p(n^{-1} h_n^{-d}).  (30)

Therefore Theorem 1(2) holds.

Proof of Theorem 1(3). By (13), (14) and the central limit theorem, we have

n^{2/(d+4)} ( n^{-1} X_x^T W_{x,α} [ 0.5 Q_m(x) + U ] ) →_d
( N( 0.5 c² μ₂(K) α²(x) f(x) s(H_m(x)), c^{-d} R(K) α^{-d}(x) σ²(x) f(x) ), 0^T )^T.  (31)

Applying White (1984, Proposition 2.26) and (12), we can easily deduce Theorem 1(3).

Table 1: The percentage of 1000 samples in which the variable bandwidth method is better than the constant bandwidth method, and the mean of standard squared errors of the two methods, for each of models A-E and each of the three methods of estimating m''(x).

Table 2: The percentage of 200 samples in which the variable bandwidth method is better than the constant bandwidth method, and the mean of standard squared errors of the two methods, for each of models A-D.

Figure 1: The results for the simulated univariate regression data of models A, B and C. The true regression functions (the solid line) and four typical estimated curves are presented, corresponding to the 10th, 30th, 70th and 90th percentiles. The dashed line is for the constant bandwidth method and the dotted line is for the variable bandwidth method. Panels show each model under each of the three methods of estimating m''(x).

Figure 2: The results for the simulated univariate regression data of models D and E. The true regression functions (the solid line) and four typical estimated curves are presented, corresponding to the 10th, 30th, 70th and 90th percentiles. The dashed line is for the constant bandwidth method and the dotted line is for the variable bandwidth method. Panels show each model under each of the three methods of estimating m''(x).

Figure 3: The results for the simulated bivariate data of model A. The true regression functions (the solid line) and four typical estimated curves are presented, corresponding to the 10th, 30th, 70th and 90th percentiles. The dashed line is for the constant bandwidth method and the dotted line is for the variable bandwidth method. Panels show cross-sections m(x_1, x_2) at x_2 = 1, 0, −1 (left column) and at x_1 = 1, 0, −1 (right column).

Figure 4: The results for the simulated bivariate data of model B. The true regression functions (the solid line) and four typical estimated curves are presented, corresponding to the 10th, 30th, 70th and 90th percentiles. The dashed line is for the constant bandwidth method and the dotted line is for the variable bandwidth method. Panels show cross-sections m(x_1, x_2) at x_2 = 1, 0, −1 (left column) and at x_1 = 1, 0, −1 (right column).

Figure 5: The results for the simulated bivariate data of model C. The true regression functions (the solid line) and four typical estimated curves are presented, corresponding to the 10th, 30th, 70th and 90th percentiles. The dashed line is for the constant bandwidth method and the dotted line is for the variable bandwidth method. Panels show cross-sections m(x_1, x_2) at x_2 = 1, 0, −1 (left column) and at x_1 = 1, 0, −1 (right column).

Figure 6: The results for the simulated bivariate data of model D. The true regression functions (the solid line) and four typical estimated curves are presented, corresponding to the 10th, 30th, 70th and 90th percentiles. The dashed line is for the constant bandwidth method and the dotted line is for the variable bandwidth method. Panels show cross-sections m(x_1, x_2) at x_2 = 1, 0, −1 (left column) and at x_1 = 1, 0, −1 (right column).

ISSN 1440-771X. Department of Econometrics and Business Statistics, Monash University, working paper series: http://www.buseco.monash.edu.au/depts/ebs/pubs/wpapers/


More information

Minimax Rate of Convergence for an Estimator of the Functional Component in a Semiparametric Multivariate Partially Linear Model.

Minimax Rate of Convergence for an Estimator of the Functional Component in a Semiparametric Multivariate Partially Linear Model. Minimax Rate of Convergence for an Estimator of the Functional Component in a Semiparametric Multivariate Partially Linear Model By Michael Levine Purdue University Technical Report #14-03 Department of

More information

Nonparametric Methods

Nonparametric Methods Nonparametric Methods Michael R. Roberts Department of Finance The Wharton School University of Pennsylvania July 28, 2009 Michael R. Roberts Nonparametric Methods 1/42 Overview Great for data analysis

More information

Local Adaptive Smoothing in Kernel Regression Estimation

Local Adaptive Smoothing in Kernel Regression Estimation Clemson University TigerPrints All Theses Theses 8-009 Local Adaptive Smoothing in Kernel Regression Estimation Qi Zheng Clemson University, qiz@clemson.edu Follow this and additional works at: https://tigerprints.clemson.edu/all_theses

More information

Nonparametric Density Estimation

Nonparametric Density Estimation Nonparametric Density Estimation Econ 690 Purdue University Justin L. Tobias (Purdue) Nonparametric Density Estimation 1 / 29 Density Estimation Suppose that you had some data, say on wages, and you wanted

More information

Nonparametric Regression Härdle, Müller, Sperlich, Werwarz, 1995, Nonparametric and Semiparametric Models, An Introduction

Nonparametric Regression Härdle, Müller, Sperlich, Werwarz, 1995, Nonparametric and Semiparametric Models, An Introduction Härdle, Müller, Sperlich, Werwarz, 1995, Nonparametric and Semiparametric Models, An Introduction Tine Buch-Kromann Univariate Kernel Regression The relationship between two variables, X and Y where m(

More information

Introduction. Linear Regression. coefficient estimates for the wage equation: E(Y X) = X 1 β X d β d = X β

Introduction. Linear Regression. coefficient estimates for the wage equation: E(Y X) = X 1 β X d β d = X β Introduction - Introduction -2 Introduction Linear Regression E(Y X) = X β +...+X d β d = X β Example: Wage equation Y = log wages, X = schooling (measured in years), labor market experience (measured

More information

41903: Introduction to Nonparametrics

41903: Introduction to Nonparametrics 41903: Notes 5 Introduction Nonparametrics fundamentally about fitting flexible models: want model that is flexible enough to accommodate important patterns but not so flexible it overspecializes to specific

More information

Nonparametric Modal Regression

Nonparametric Modal Regression Nonparametric Modal Regression Summary In this article, we propose a new nonparametric modal regression model, which aims to estimate the mode of the conditional density of Y given predictors X. The nonparametric

More information

Bickel Rosenblatt test

Bickel Rosenblatt test University of Latvia 28.05.2011. A classical Let X 1,..., X n be i.i.d. random variables with a continuous probability density function f. Consider a simple hypothesis H 0 : f = f 0 with a significance

More information

Modelling Non-linear and Non-stationary Time Series

Modelling Non-linear and Non-stationary Time Series Modelling Non-linear and Non-stationary Time Series Chapter 2: Non-parametric methods Henrik Madsen Advanced Time Series Analysis September 206 Henrik Madsen (02427 Adv. TS Analysis) Lecture Notes September

More information

Integral approximation by kernel smoothing

Integral approximation by kernel smoothing Integral approximation by kernel smoothing François Portier Université catholique de Louvain - ISBA August, 29 2014 In collaboration with Bernard Delyon Topic of the talk: Given ϕ : R d R, estimation of

More information

A New Procedure for Multiple Testing of Econometric Models

A New Procedure for Multiple Testing of Econometric Models A New Procedure for Multiple Testing of Econometric Models Maxwell L. King 1, Xibin Zhang, and Muhammad Akram Department of Econometrics and Business Statistics Monash University, Australia April 2007

More information

Nonparametric Function Estimation with Infinite-Order Kernels

Nonparametric Function Estimation with Infinite-Order Kernels Nonparametric Function Estimation with Infinite-Order Kernels Arthur Berg Department of Statistics, University of Florida March 15, 2008 Kernel Density Estimation (IID Case) Let X 1,..., X n iid density

More information

DESIGN-ADAPTIVE MINIMAX LOCAL LINEAR REGRESSION FOR LONGITUDINAL/CLUSTERED DATA

DESIGN-ADAPTIVE MINIMAX LOCAL LINEAR REGRESSION FOR LONGITUDINAL/CLUSTERED DATA Statistica Sinica 18(2008), 515-534 DESIGN-ADAPTIVE MINIMAX LOCAL LINEAR REGRESSION FOR LONGITUDINAL/CLUSTERED DATA Kani Chen 1, Jianqing Fan 2 and Zhezhen Jin 3 1 Hong Kong University of Science and Technology,

More information

Statistica Sinica Preprint No: SS

Statistica Sinica Preprint No: SS Statistica Sinica Preprint No: SS-017-0013 Title A Bootstrap Method for Constructing Pointwise and Uniform Confidence Bands for Conditional Quantile Functions Manuscript ID SS-017-0013 URL http://wwwstatsinicaedutw/statistica/

More information

Does k-th Moment Exist?

Does k-th Moment Exist? Does k-th Moment Exist? Hitomi, K. 1 and Y. Nishiyama 2 1 Kyoto Institute of Technology, Japan 2 Institute of Economic Research, Kyoto University, Japan Email: hitomi@kit.ac.jp Keywords: Existence of moments,

More information

On a Nonparametric Notion of Residual and its Applications

On a Nonparametric Notion of Residual and its Applications On a Nonparametric Notion of Residual and its Applications Bodhisattva Sen and Gábor Székely arxiv:1409.3886v1 [stat.me] 12 Sep 2014 Columbia University and National Science Foundation September 16, 2014

More information

University, Tempe, Arizona, USA b Department of Mathematics and Statistics, University of New. Mexico, Albuquerque, New Mexico, USA

University, Tempe, Arizona, USA b Department of Mathematics and Statistics, University of New. Mexico, Albuquerque, New Mexico, USA This article was downloaded by: [University of New Mexico] On: 27 September 2012, At: 22:13 Publisher: Taylor & Francis Informa Ltd Registered in England and Wales Registered Number: 1072954 Registered

More information

A Design Unbiased Variance Estimator of the Systematic Sample Means

A Design Unbiased Variance Estimator of the Systematic Sample Means American Journal of Theoretical and Applied Statistics 2015; 4(3): 201-210 Published online May 29, 2015 (http://www.sciencepublishinggroup.com/j/ajtas) doi: 10.1148/j.ajtas.20150403.27 ISSN: 232-8999

More information

NADARAYA WATSON ESTIMATE JAN 10, 2006: version 2. Y ik ( x i

NADARAYA WATSON ESTIMATE JAN 10, 2006: version 2. Y ik ( x i NADARAYA WATSON ESTIMATE JAN 0, 2006: version 2 DATA: (x i, Y i, i =,..., n. ESTIMATE E(Y x = m(x by n i= ˆm (x = Y ik ( x i x n i= K ( x i x EXAMPLES OF K: K(u = I{ u c} (uniform or box kernel K(u = u

More information

Preface. 1 Nonparametric Density Estimation and Testing. 1.1 Introduction. 1.2 Univariate Density Estimation

Preface. 1 Nonparametric Density Estimation and Testing. 1.1 Introduction. 1.2 Univariate Density Estimation Preface Nonparametric econometrics has become one of the most important sub-fields in modern econometrics. The primary goal of this lecture note is to introduce various nonparametric and semiparametric

More information

Function of Longitudinal Data

Function of Longitudinal Data New Local Estimation Procedure for Nonparametric Regression Function of Longitudinal Data Weixin Yao and Runze Li Abstract This paper develops a new estimation of nonparametric regression functions for

More information

A Bootstrap Test for Conditional Symmetry

A Bootstrap Test for Conditional Symmetry ANNALS OF ECONOMICS AND FINANCE 6, 51 61 005) A Bootstrap Test for Conditional Symmetry Liangjun Su Guanghua School of Management, Peking University E-mail: lsu@gsm.pku.edu.cn and Sainan Jin Guanghua School

More information

Transformation and Smoothing in Sample Survey Data

Transformation and Smoothing in Sample Survey Data Scandinavian Journal of Statistics, Vol. 37: 496 513, 2010 doi: 10.1111/j.1467-9469.2010.00691.x Published by Blackwell Publishing Ltd. Transformation and Smoothing in Sample Survey Data YANYUAN MA Department

More information

Discussion of the paper Inference for Semiparametric Models: Some Questions and an Answer by Bickel and Kwon

Discussion of the paper Inference for Semiparametric Models: Some Questions and an Answer by Bickel and Kwon Discussion of the paper Inference for Semiparametric Models: Some Questions and an Answer by Bickel and Kwon Jianqing Fan Department of Statistics Chinese University of Hong Kong AND Department of Statistics

More information

O Combining cross-validation and plug-in methods - for kernel density bandwidth selection O

O Combining cross-validation and plug-in methods - for kernel density bandwidth selection O O Combining cross-validation and plug-in methods - for kernel density selection O Carlos Tenreiro CMUC and DMUC, University of Coimbra PhD Program UC UP February 18, 2011 1 Overview The nonparametric problem

More information

A review of some semiparametric regression models with application to scoring

A review of some semiparametric regression models with application to scoring A review of some semiparametric regression models with application to scoring Jean-Loïc Berthet 1 and Valentin Patilea 2 1 ENSAI Campus de Ker-Lann Rue Blaise Pascal - BP 37203 35172 Bruz cedex, France

More information

Model Specification Testing in Nonparametric and Semiparametric Time Series Econometrics. Jiti Gao

Model Specification Testing in Nonparametric and Semiparametric Time Series Econometrics. Jiti Gao Model Specification Testing in Nonparametric and Semiparametric Time Series Econometrics Jiti Gao Department of Statistics School of Mathematics and Statistics The University of Western Australia Crawley

More information

Local Modal Regression

Local Modal Regression Local Modal Regression Weixin Yao Department of Statistics, Kansas State University, Manhattan, Kansas 66506, U.S.A. wxyao@ksu.edu Bruce G. Lindsay and Runze Li Department of Statistics, The Pennsylvania

More information

A simple bootstrap method for constructing nonparametric confidence bands for functions

A simple bootstrap method for constructing nonparametric confidence bands for functions A simple bootstrap method for constructing nonparametric confidence bands for functions Peter Hall Joel Horowitz The Institute for Fiscal Studies Department of Economics, UCL cemmap working paper CWP4/2

More information

Single Index Quantile Regression for Heteroscedastic Data

Single Index Quantile Regression for Heteroscedastic Data Single Index Quantile Regression for Heteroscedastic Data E. Christou M. G. Akritas Department of Statistics The Pennsylvania State University SMAC, November 6, 2015 E. Christou, M. G. Akritas (PSU) SIQR

More information

SEMIPARAMETRIC ESTIMATION OF CONDITIONAL HETEROSCEDASTICITY VIA SINGLE-INDEX MODELING

SEMIPARAMETRIC ESTIMATION OF CONDITIONAL HETEROSCEDASTICITY VIA SINGLE-INDEX MODELING Statistica Sinica 3 (013), 135-155 doi:http://dx.doi.org/10.5705/ss.01.075 SEMIPARAMERIC ESIMAION OF CONDIIONAL HEEROSCEDASICIY VIA SINGLE-INDEX MODELING Liping Zhu, Yuexiao Dong and Runze Li Shanghai

More information

Bayesian estimation of bandwidths for a nonparametric regression model with a flexible error density

Bayesian estimation of bandwidths for a nonparametric regression model with a flexible error density ISSN 1440-771X Australia Department of Econometrics and Business Statistics http://www.buseco.monash.edu.au/depts/ebs/pubs/wpapers/ Bayesian estimation of bandwidths for a nonparametric regression model

More information

Goodness-of-fit tests for the cure rate in a mixture cure model

Goodness-of-fit tests for the cure rate in a mixture cure model Biometrika (217), 13, 1, pp. 1 7 Printed in Great Britain Advance Access publication on 31 July 216 Goodness-of-fit tests for the cure rate in a mixture cure model BY U.U. MÜLLER Department of Statistics,

More information

Some Theories about Backfitting Algorithm for Varying Coefficient Partially Linear Model

Some Theories about Backfitting Algorithm for Varying Coefficient Partially Linear Model Some Theories about Backfitting Algorithm for Varying Coefficient Partially Linear Model 1. Introduction Varying-coefficient partially linear model (Zhang, Lee, and Song, 2002; Xia, Zhang, and Tong, 2004;

More information

Proceedings of the 2016 Winter Simulation Conference T. M. K. Roeder, P. I. Frazier, R. Szechtman, E. Zhou, T. Huschka, and S. E. Chick, eds.

Proceedings of the 2016 Winter Simulation Conference T. M. K. Roeder, P. I. Frazier, R. Szechtman, E. Zhou, T. Huschka, and S. E. Chick, eds. Proceedings of the 2016 Winter Simulation Conference T. M. K. Roeder, P. I. Frazier, R. Szechtman, E. Zhou, T. Huschka, and S. E. Chick, eds. THE EFFECTS OF ESTIMATION OF HETEROSCEDASTICITY ON STOCHASTIC

More information

Local linear hazard rate estimation and bandwidth selection

Local linear hazard rate estimation and bandwidth selection Ann Inst Stat Math 2011) 63:1019 1046 DOI 10.1007/s10463-010-0277-6 Local linear hazard rate estimation and bandwidth selection Dimitrios Bagkavos Received: 23 February 2009 / Revised: 27 May 2009 / Published

More information

Convergence rates for uniform confidence intervals based on local polynomial regression estimators

Convergence rates for uniform confidence intervals based on local polynomial regression estimators Journal of Nonparametric Statistics ISSN: 1048-5252 Print) 1029-0311 Online) Journal homepage: http://www.tandfonline.com/loi/gnst20 Convergence rates for uniform confidence intervals based on local polynomial

More information

A Note on Data-Adaptive Bandwidth Selection for Sequential Kernel Smoothers

A Note on Data-Adaptive Bandwidth Selection for Sequential Kernel Smoothers 6th St.Petersburg Workshop on Simulation (2009) 1-3 A Note on Data-Adaptive Bandwidth Selection for Sequential Kernel Smoothers Ansgar Steland 1 Abstract Sequential kernel smoothers form a class of procedures

More information

Optimal Estimation of Derivatives in Nonparametric Regression

Optimal Estimation of Derivatives in Nonparametric Regression Journal of Machine Learning Research 17 (2016) 1-25 Submitted 12/15; Revised 4/16; Published 8/16 Optimal Estimation of Derivatives in Nonparametric Regression Wenlin Dai CEMSE Division King Abdullah University

More information

Quantitative Economics for the Evaluation of the European Policy. Dipartimento di Economia e Management

Quantitative Economics for the Evaluation of the European Policy. Dipartimento di Economia e Management Quantitative Economics for the Evaluation of the European Policy Dipartimento di Economia e Management Irene Brunetti 1 Davide Fiaschi 2 Angela Parenti 3 9 ottobre 2015 1 ireneb@ec.unipi.it. 2 davide.fiaschi@unipi.it.

More information

4 Nonparametric Regression

4 Nonparametric Regression 4 Nonparametric Regression 4.1 Univariate Kernel Regression An important question in many fields of science is the relation between two variables, say X and Y. Regression analysis is concerned with the

More information

Optimal bandwidth selection for differences of nonparametric estimators with an application to the sharp regression discontinuity design

Optimal bandwidth selection for differences of nonparametric estimators with an application to the sharp regression discontinuity design Optimal bandwidth selection for differences of nonparametric estimators with an application to the sharp regression discontinuity design Yoichi Arai Hidehiko Ichimura The Institute for Fiscal Studies Department

More information

Some Monte Carlo Evidence for Adaptive Estimation of Unit-Time Varying Heteroscedastic Panel Data Models

Some Monte Carlo Evidence for Adaptive Estimation of Unit-Time Varying Heteroscedastic Panel Data Models Some Monte Carlo Evidence for Adaptive Estimation of Unit-Time Varying Heteroscedastic Panel Data Models G. R. Pasha Department of Statistics, Bahauddin Zakariya University Multan, Pakistan E-mail: drpasha@bzu.edu.pk

More information

ECON 721: Lecture Notes on Nonparametric Density and Regression Estimation. Petra E. Todd

ECON 721: Lecture Notes on Nonparametric Density and Regression Estimation. Petra E. Todd ECON 721: Lecture Notes on Nonparametric Density and Regression Estimation Petra E. Todd Fall, 2014 2 Contents 1 Review of Stochastic Order Symbols 1 2 Nonparametric Density Estimation 3 2.1 Histogram

More information

Multiscale Exploratory Analysis of Regression Quantiles using Quantile SiZer

Multiscale Exploratory Analysis of Regression Quantiles using Quantile SiZer Multiscale Exploratory Analysis of Regression Quantiles using Quantile SiZer Cheolwoo Park Thomas C. M. Lee Jan Hannig April 6, 2 Abstract The SiZer methodology proposed by Chaudhuri & Marron (999) is

More information

Uniform Convergence Rates for Nonparametric Estimation

Uniform Convergence Rates for Nonparametric Estimation Uniform Convergence Rates for Nonparametric Estimation Bruce E. Hansen University of Wisconsin www.ssc.wisc.edu/~bansen October 2004 Preliminary and Incomplete Abstract Tis paper presents a set of rate

More information

Adaptive Nonparametric Density Estimators

Adaptive Nonparametric Density Estimators Adaptive Nonparametric Density Estimators by Alan J. Izenman Introduction Theoretical results and practical application of histograms as density estimators usually assume a fixed-partition approach, where

More information

CALCULATING DEGREES OF FREEDOM IN MULTIVARIATE LOCAL POLYNOMIAL REGRESSION. 1. Introduction

CALCULATING DEGREES OF FREEDOM IN MULTIVARIATE LOCAL POLYNOMIAL REGRESSION. 1. Introduction CALCULATING DEGREES OF FREEDOM IN MULTIVARIATE LOCAL POLYNOMIAL REGRESSION NADINE MCCLOUD AND CHRISTOPHER F. PARMETER Abstract. The matrix that transforms the response variable in a regression to its predicted

More information

Estimation and Testing for Varying Coefficients in Additive Models with Marginal Integration

Estimation and Testing for Varying Coefficients in Additive Models with Marginal Integration Estimation and Testing for Varying Coefficients in Additive Models with Marginal Integration Lijian Yang Byeong U. Park Lan Xue Wolfgang Härdle September 6, 2005 Abstract We propose marginal integration

More information

A NOTE ON TESTING THE SHOULDER CONDITION IN LINE TRANSECT SAMPLING

A NOTE ON TESTING THE SHOULDER CONDITION IN LINE TRANSECT SAMPLING A NOTE ON TESTING THE SHOULDER CONDITION IN LINE TRANSECT SAMPLING Shunpu Zhang Department of Mathematical Sciences University of Alaska, Fairbanks, Alaska, U.S.A. 99775 ABSTRACT We propose a new method

More information

Histogram Härdle, Müller, Sperlich, Werwatz, 1995, Nonparametric and Semiparametric Models, An Introduction

Histogram Härdle, Müller, Sperlich, Werwatz, 1995, Nonparametric and Semiparametric Models, An Introduction Härdle, Müller, Sperlich, Werwatz, 1995, Nonparametric and Semiparametric Models, An Introduction Tine Buch-Kromann Construction X 1,..., X n iid r.v. with (unknown) density, f. Aim: Estimate the density

More information

Teruko Takada Department of Economics, University of Illinois. Abstract

Teruko Takada Department of Economics, University of Illinois. Abstract Nonparametric density estimation: A comparative study Teruko Takada Department of Economics, University of Illinois Abstract Motivated by finance applications, the objective of this paper is to assess

More information

Single Index Quantile Regression for Heteroscedastic Data

Single Index Quantile Regression for Heteroscedastic Data Single Index Quantile Regression for Heteroscedastic Data E. Christou M. G. Akritas Department of Statistics The Pennsylvania State University JSM, 2015 E. Christou, M. G. Akritas (PSU) SIQR JSM, 2015

More information

On Asymptotic Normality of the Local Polynomial Regression Estimator with Stochastic Bandwidths 1. Carlos Martins-Filho.

On Asymptotic Normality of the Local Polynomial Regression Estimator with Stochastic Bandwidths 1. Carlos Martins-Filho. On Asymptotic Normality of the Local Polynomial Regression Estimator with Stochastic Bandwidths 1 Carlos Martins-Filho Department of Economics IFPRI University of Colorado 2033 K Street NW Boulder, CO

More information

Nonparametric Estimation of Regression Functions In the Presence of Irrelevant Regressors

Nonparametric Estimation of Regression Functions In the Presence of Irrelevant Regressors Nonparametric Estimation of Regression Functions In the Presence of Irrelevant Regressors Peter Hall, Qi Li, Jeff Racine 1 Introduction Nonparametric techniques robust to functional form specification.

More information

A Bayesian approach to parameter estimation for kernel density estimation via transformations

A Bayesian approach to parameter estimation for kernel density estimation via transformations A Bayesian approach to parameter estimation for kernel density estimation via transformations Qing Liu,, David Pitt 2, Xibin Zhang 3, Xueyuan Wu Centre for Actuarial Studies, Faculty of Business and Economics,

More information

Error distribution function for parametrically truncated and censored data

Error distribution function for parametrically truncated and censored data Error distribution function for parametrically truncated and censored data Géraldine LAURENT Jointly with Cédric HEUCHENNE QuantOM, HEC-ULg Management School - University of Liège Friday, 14 September

More information

Nonparametric Inference via Bootstrapping the Debiased Estimator

Nonparametric Inference via Bootstrapping the Debiased Estimator Nonparametric Inference via Bootstrapping the Debiased Estimator Yen-Chi Chen Department of Statistics, University of Washington ICSA-Canada Chapter Symposium 2017 1 / 21 Problem Setup Let X 1,, X n be

More information

Time Series and Forecasting Lecture 4 NonLinear Time Series

Time Series and Forecasting Lecture 4 NonLinear Time Series Time Series and Forecasting Lecture 4 NonLinear Time Series Bruce E. Hansen Summer School in Economics and Econometrics University of Crete July 23-27, 2012 Bruce Hansen (University of Wisconsin) Foundations

More information

Additional results for model-based nonparametric variance estimation for systematic sampling in a forestry survey

Additional results for model-based nonparametric variance estimation for systematic sampling in a forestry survey Additional results for model-based nonparametric variance estimation for systematic sampling in a forestry survey J.D. Opsomer Colorado State University M. Francisco-Fernández Universidad de A Coruña July

More information

Do Markov-Switching Models Capture Nonlinearities in the Data? Tests using Nonparametric Methods

Do Markov-Switching Models Capture Nonlinearities in the Data? Tests using Nonparametric Methods Do Markov-Switching Models Capture Nonlinearities in the Data? Tests using Nonparametric Methods Robert V. Breunig Centre for Economic Policy Research, Research School of Social Sciences and School of

More information

Relative error prediction via kernel regression smoothers

Relative error prediction via kernel regression smoothers Relative error prediction via kernel regression smoothers Heungsun Park a, Key-Il Shin a, M.C. Jones b, S.K. Vines b a Department of Statistics, Hankuk University of Foreign Studies, YongIn KyungKi, 449-791,

More information

Testing Equality of Nonparametric Quantile Regression Functions

Testing Equality of Nonparametric Quantile Regression Functions International Journal of Statistics and Probability; Vol. 3, No. 1; 2014 ISSN 1927-7032 E-ISSN 1927-7040 Published by Canadian Center of Science and Education Testing Equality of Nonparametric Quantile

More information

Nonparametric Regression. Badr Missaoui

Nonparametric Regression. Badr Missaoui Badr Missaoui Outline Kernel and local polynomial regression. Penalized regression. We are given n pairs of observations (X 1, Y 1 ),...,(X n, Y n ) where Y i = r(x i ) + ε i, i = 1,..., n and r(x) = E(Y

More information

Logistic Kernel Estimator and Bandwidth Selection. for Density Function

Logistic Kernel Estimator and Bandwidth Selection. for Density Function International Journal of Contemporary Matematical Sciences Vol. 13, 2018, no. 6, 279-286 HIKARI Ltd, www.m-ikari.com ttps://doi.org/10.12988/ijcms.2018.81133 Logistic Kernel Estimator and Bandwidt Selection

More information

Package NonpModelCheck

Package NonpModelCheck Type Package Package NonpModelCheck April 27, 2017 Title Model Checking and Variable Selection in Nonparametric Regression Version 3.0 Date 2017-04-27 Author Adriano Zanin Zambom Maintainer Adriano Zanin

More information

New Local Estimation Procedure for Nonparametric Regression Function of Longitudinal Data

New Local Estimation Procedure for Nonparametric Regression Function of Longitudinal Data ew Local Estimation Procedure for onparametric Regression Function of Longitudinal Data Weixin Yao and Runze Li The Pennsylvania State University Technical Report Series #0-03 College of Health and Human

More information

PREWHITENING-BASED ESTIMATION IN PARTIAL LINEAR REGRESSION MODELS: A COMPARATIVE STUDY

PREWHITENING-BASED ESTIMATION IN PARTIAL LINEAR REGRESSION MODELS: A COMPARATIVE STUDY REVSTAT Statistical Journal Volume 7, Number 1, April 2009, 37 54 PREWHITENING-BASED ESTIMATION IN PARTIAL LINEAR REGRESSION MODELS: A COMPARATIVE STUDY Authors: Germán Aneiros-Pérez Departamento de Matemáticas,

More information

Regression Discontinuity Design Econometric Issues

Regression Discontinuity Design Econometric Issues Regression Discontinuity Design Econometric Issues Brian P. McCall University of Michigan Texas Schools Project, University of Texas, Dallas November 20, 2009 1 Regression Discontinuity Design Introduction

More information

Optimal Bandwidth Choice for the Regression Discontinuity Estimator

Optimal Bandwidth Choice for the Regression Discontinuity Estimator Optimal Bandwidth Choice for the Regression Discontinuity Estimator Guido Imbens and Karthik Kalyanaraman First Draft: June 8 This Draft: September Abstract We investigate the choice of the bandwidth for

More information

On the Robust Modal Local Polynomial Regression

On the Robust Modal Local Polynomial Regression International Journal of Statistical Sciences ISSN 683 5603 Vol. 9(Special Issue), 2009, pp 27-23 c 2009 Dept. of Statistics, Univ. of Rajshahi, Bangladesh On the Robust Modal Local Polynomial Regression

More information

Plot of λ(x) against x x. SiZer map based on local likelihood (n=500) x

Plot of λ(x) against x x. SiZer map based on local likelihood (n=500) x Local Likelihood SiZer Map Runze Li Department of Statistics Pennsylvania State University University Park, PA 682-2 Email: rli@stat.psu.edu J. S. Marron Department of Statistics University of North Carolina

More information

Titolo Smooth Backfitting with R

Titolo Smooth Backfitting with R Rapporto n. 176 Titolo Smooth Backfitting with R Autori Alberto Arcagni, Luca Bagnato ottobre 2009 Dipartimento di Metodi Quantitativi per le Scienze Economiche ed Aziendali Università degli Studi di Milano

More information

Introduction to Regression

Introduction to Regression Introduction to Regression p. 1/97 Introduction to Regression Chad Schafer cschafer@stat.cmu.edu Carnegie Mellon University Introduction to Regression p. 1/97 Acknowledgement Larry Wasserman, All of Nonparametric

More information

The Priestley-Chao Estimator - Bias, Variance and Mean-Square Error

The Priestley-Chao Estimator - Bias, Variance and Mean-Square Error The Priestley-Chao Estimator - Bias, Variance and Mean-Square Error Bias, variance and mse properties In the previous section we saw that the eact mean and variance of the Pristley-Chao estimator ˆm()

More information

ECON The Simple Regression Model

ECON The Simple Regression Model ECON 351 - The Simple Regression Model Maggie Jones 1 / 41 The Simple Regression Model Our starting point will be the simple regression model where we look at the relationship between two variables In

More information

Rejoinder. 1 Phase I and Phase II Profile Monitoring. Peihua Qiu 1, Changliang Zou 2 and Zhaojun Wang 2

Rejoinder. 1 Phase I and Phase II Profile Monitoring. Peihua Qiu 1, Changliang Zou 2 and Zhaojun Wang 2 Rejoinder Peihua Qiu 1, Changliang Zou 2 and Zhaojun Wang 2 1 School of Statistics, University of Minnesota 2 LPMC and Department of Statistics, Nankai University, China We thank the editor Professor David

More information

LOCAL POLYNOMIAL AND PENALIZED TRIGONOMETRIC SERIES REGRESSION

LOCAL POLYNOMIAL AND PENALIZED TRIGONOMETRIC SERIES REGRESSION Statistica Sinica 24 (2014), 1215-1238 doi:http://dx.doi.org/10.5705/ss.2012.040 LOCAL POLYNOMIAL AND PENALIZED TRIGONOMETRIC SERIES REGRESSION Li-Shan Huang and Kung-Sik Chan National Tsing Hua University

More information

Smooth functions and local extreme values

Smooth functions and local extreme values Smooth functions and local extreme values A. Kovac 1 Department of Mathematics University of Bristol Abstract Given a sample of n observations y 1,..., y n at time points t 1,..., t n we consider the problem

More information