Bias Correction and Higher Order Kernel Functions


Tien-Chung Hu^1
Department of Mathematics
National Tsing-Hua University
Hsinchu, Taiwan

Jianqing Fan
Department of Statistics
University of North Carolina
Chapel Hill, N.C.

Abstract. Kernel density estimates are frequently used, based on a second order kernel. Thus, the bias inherent to the estimates has an order of $O(h_n^2)$. In this note, a method of correcting the bias in the kernel density estimates is provided, which reduces the bias to a smaller order. Effectively, this method produces a higher order kernel based on a second order kernel. For a kernel function $K$, the functions
$$\sum_{l=0}^{k-1} \binom{k}{l+1} \frac{z^l K^{(l)}(z)}{l!} \qquad\text{and}\qquad \frac{K^{(k-1)}(z)}{z\int_{-\infty}^{\infty} K^{(k-1)}(z)/z\,dz}$$
are kernels of order $k$, under some mild conditions.

^1 Completed while visiting the Department of Statistics, University of North Carolina, Chapel Hill.

Abbreviated title. Bias Correction.

AMS 1980 subject classification. Primary 62G05. Secondary 62G20.

Key words and phrases. Bias correction, higher order kernel, kernel density estimate, nonparametrics.

1 Introduction

Consider data that can be thought of as a random sample from a distribution having an unknown density. It is common practice to summarize the data with some kinds of statistics. Unless the form of the density is known, it is also very helpful to examine graphical representations and overall structures of the data. Kernel density estimates provide a useful tool for these purposes. See Silverman (1986), Eubank (1988), Müller (1988), Härdle (1990) and Wahba (1990) for many examples of this, and good introductions to the general subject area.

Great efforts have been made to select a bandwidth for a kernel density estimate based on a second order kernel, because such an estimate is easily explainable. A large amount of recent progress has been made on data-based smoothing parameter selection; see Rice (1984), Marron (1988), Hall et al. (1990), Jones, Marron and Park (1990), Chiu (1990) and Fan and Marron (1990), among others. Most of these bandwidth selectors have extremely fast rates of convergence to their theoretical optima. However, since a second order kernel is used in the density estimate, the bias inherent to the estimate is always of order $n^{-2/5}$, no matter how good an automatic bandwidth selector is. This amount of bias may sometimes obscure interesting features such as the number of modes and the height of the underlying density at the modes. In such cases, bias correction to the kernel density estimate is desirable. The discussion of this issue forms the core of this paper.

For a random sample $X_1, \ldots, X_n$, a kernel density estimator is defined by (see Rosenblatt (1956))
$$\hat f_n(x) = \frac{1}{n h_n}\sum_{j=1}^{n} K\Big(\frac{x - X_j}{h_n}\Big) \qquad (1.1)$$
where $K$ is a kernel function and $h_n$ is a bandwidth. We will concentrate on how the bias of (1.1) can be estimated for a nonrandom bandwidth $h_n$. A method of correcting the bias is given in Section 2. Effectively, we give a method for constructing a higher order kernel based on a second order kernel. This provides new insight into the effect of a higher order kernel.
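For concreteness, the following is a minimal numerical sketch of the estimator (1.1) with a Gaussian kernel; the function name, the simulated sample, the bandwidth and the evaluation grid are illustrative choices and not part of the paper.

```python
import numpy as np

def kde(x_grid, sample, h):
    """Kernel density estimate (1.1) with the Gaussian kernel: (n h)^{-1} sum_j K((x - X_j)/h)."""
    u = (x_grid[:, None] - sample[None, :]) / h                    # (x - X_j)/h_n on the grid
    return np.exp(-u**2 / 2).sum(axis=1) / (np.sqrt(2 * np.pi) * h * len(sample))

rng = np.random.default_rng(0)
sample = rng.normal(size=200)                                      # illustrative data
x_grid = np.linspace(-4.0, 4.0, 201)
f_hat = kde(x_grid, sample, h=0.4)                                 # bias of order h^2 for this second order kernel
```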

There are several methods of constructing a higher order kernel. Schucany and Sommers (1977) propose a method based on the generalized jackknife to construct higher order kernels. A useful class of higher order kernels based on the Gaussian density can be found in Wand and Schucany (1990). Berlinet (1990) uses the idea of reproducing kernels in a Hilbert space to construct a class of higher order kernels and discusses its consequences. The optimality properties of higher order kernels are discussed in Müller (1984), Gasser et al. (1985), and Granovsky and Müller (1990). Mathematically, most of the methods above are directly targeted at finding a function $K_r$ satisfying
$$\int_{-\infty}^{\infty} K_r(z)\,dz = 1, \qquad \int_{-\infty}^{\infty} z^q K_r(z)\,dz = 0, \quad q = 1, \ldots, r-1, \qquad\text{and}\qquad \int_{-\infty}^{\infty} |z^r K_r(z)|\,dz < \infty.$$
We take a different approach from the pioneering work, by correcting the bias directly. As a result of the bias correction, a class of higher order kernels is constructed.

Section 2 gives a precise formulation, and discussion, of the main results. Proofs are in Section 3.
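As an aside, the moment conditions displayed above are easy to check numerically for any candidate kernel. The following is a minimal sketch, assuming the kernel decays fast enough for scipy's improper-integral quadrature; the helper name, the Gaussian example and the tolerance are illustrative.

```python
import numpy as np
from scipy.integrate import quad

def is_kernel_of_order(K, r, tol=1e-6):
    """Check int K = 1 and int z^q K(z) dz = 0 for q = 1, ..., r-1."""
    mass = quad(K, -np.inf, np.inf)[0]
    moments = [quad(lambda z, q=q: z**q * K(z), -np.inf, np.inf)[0] for q in range(1, r)]
    return abs(mass - 1.0) < tol and all(abs(m) < tol for m in moments)

gauss = lambda z: np.exp(-z**2 / 2) / np.sqrt(2 * np.pi)
print(is_kernel_of_order(gauss, 2))   # True: the Gaussian is a second order kernel
print(is_kernel_of_order(gauss, 4))   # False: its second moment equals 1, not 0
```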

2 Main Results

Let's illustrate how the bias of the kernel density estimate can be corrected. Mathematical justifications are given in Section 3. Observe that the kernel density estimator (1.1) is an unbiased estimator of $\int_{-\infty}^{\infty} f(x - h_n y)K(y)\,dy$:
$$E\hat f_n(x) = \int_{-\infty}^{\infty} f(x - h_n y)K(y)\,dy. \qquad (2.1)$$
Taking derivatives $j$ times with respect to $h_n$ yields an unbiased estimate of the functional (viewing $x$ as a fixed point)
$$\theta_j(x) = \frac{\partial^j}{\partial h_n^j}\int_{-\infty}^{\infty} f(x - h_n y)K(y)\,dy, \qquad (2.2)$$
and the unbiased estimate of $\theta_j$ is given by
$$\hat\theta_j(x) = \frac{\partial^j}{\partial h_n^j}\hat f_n(x), \qquad (2.3)$$
where $\hat f_n$ is the kernel density estimate defined by (1.1).

Let's assume that the unknown density has $k$ bounded continuous derivatives. Now, the Taylor expansion of $f(x)$ yields
$$f(x) = f(x - h_n y + h_n y) = \sum_{j=0}^{k-1}\frac{1}{j!} f^{(j)}(x - h_n y)(h_n y)^j + O(h_n^k). \qquad (2.4)$$
Multiplying both sides of (2.4) by $K(y)$ and integrating with respect to $y$, we have
$$f(x) = \sum_{j=0}^{k-1}\frac{(-h_n)^j}{j!}\,\theta_j(x) + O(h_n^k),$$
where the fact $\int_{-\infty}^{\infty} K(y)\,dy = 1$ is used. Thus, one can use $\hat\theta_j(x)$ to correct the bias
$$\theta_0(x) - f(x) = -\sum_{j=1}^{k-1}\frac{(-h_n)^j}{j!}\,\theta_j(x) + O(h_n^k)$$
of the kernel density estimate (1.1). In other words, a bias-corrected estimate is defined as
$$\hat f_b(x) = \sum_{j=0}^{k-1}\frac{(-h_n)^j}{j!}\,\hat\theta_j(x). \qquad (2.5)$$

Let's give a simpler formula for the bias-corrected estimate (2.5).

Lemma 2.1. If $K(\cdot)$ has a bounded $k$th derivative, then for $j = 0, \ldots, k-1$,
$$\hat\theta_j(x) = \frac{(-1)^j j!}{n h_n^{j+1}}\sum_{i=1}^{n}\sum_{l=0}^{j}\binom{j}{l}\frac{1}{l!}\,K_l\Big(\frac{x - X_i}{h_n}\Big),$$
where $K_l(z) = z^l K^{(l)}(z)$, and $\hat\theta_j$ was defined by (2.3).

By Lemma 2.1, the bias-corrected estimate (2.5) can be written as
$$\hat f_b(x) = \frac{1}{n h_n}\sum_{j=1}^{n} W_k\Big(\frac{x - X_j}{h_n}\Big) \qquad (2.6)$$
with
$$W_k(z) = \sum_{l=0}^{k-1}\binom{k}{l+1}\frac{K_l(z)}{l!}, \qquad (2.7)$$

where the identity
$$\sum_{j=l}^{k-1}\binom{j}{l} = \binom{k}{l+1}$$
was used. Thus, effectively the effort of bias correction of the kernel density estimate produces another kernel function $W_k(\cdot)$, defined by (2.7). As intuitively expected, $W_k(x)$ is a $k$th order kernel, which is justified by

Theorem 1. If the kernel function $K(\cdot)$ satisfies
$$\int_{-\infty}^{\infty} K(y)\,dy = 1 \qquad\text{and}\qquad \int_{-\infty}^{\infty} |y^{2k} K^{(k)}(y)|\,dy < \infty,$$
then the function $W_k(\cdot)$ is a $k$th order kernel:
$$\int_{-\infty}^{\infty} W_k(x)\,dx = 1, \qquad \int_{-\infty}^{\infty} x^s W_k(x)\,dx = 0 \quad\text{for } s = 1, \ldots, k-1,$$
and
$$\int_{-\infty}^{\infty} x^k W_k(x)\,dx = (-1)^{k+1}\int_{-\infty}^{\infty} x^k K(x)\,dx. \qquad (2.8)$$

Since $W_k(x)$ is a $k$th order kernel, it follows that

Theorem 2. Let $K$ satisfy the conditions of Theorem 1 and let $f(\cdot)$ have a bounded continuous $k$th derivative. Then,
$$E\hat f_b(x) = f(x) - \frac{f^{(k)}(x)}{k!}\int_{-\infty}^{\infty} x^k K(x)\,dx\; h_n^k\,(1 + o(1)).$$

Thus, the bias-corrected estimate does have the expected order of bias, $O(h_n^k)$. Since $W_k$ is a $k$th order kernel, a similar conclusion holds for the Mean Integrated Squared Error (MISE).

Remark 1. When $K$ is symmetric, the kernel function (2.7) is also symmetric. In such a case, if $k = 2r-1$ is an odd integer, then $W_{2r-1}$ is in fact a kernel of order $2r$, which can easily be justified by (2.8), and satisfies
$$\int_{-\infty}^{\infty} x^{2r} W_{2r-1}(x)\,dx = (2r-1)\int_{-\infty}^{\infty} x^{2r} K(x)\,dx.$$
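As a numerical sanity check on Theorem 1, one can construct $W_k$ from the standard normal kernel and verify the moment conditions directly. The sketch below uses the representation (2.7) with $K_l(z) = z^l K^{(l)}(z)$ and the Hermite-polynomial form of the Gaussian derivatives; the choice $k = 4$ and the quadrature are illustrative.

```python
import numpy as np
from math import comb, factorial
from scipy.special import eval_hermitenorm   # probabilists' Hermite polynomials He_l
from scipy.integrate import quad

phi = lambda z: np.exp(-z**2 / 2) / np.sqrt(2 * np.pi)

def W(k, z):
    """W_k(z) = sum_{l<k} C(k, l+1) K_l(z)/l!, with K_l(z) = z^l phi^(l)(z) = (-z)^l He_l(z) phi(z)."""
    return sum(comb(k, l + 1) * (-z)**l * eval_hermitenorm(l, z) * phi(z) / factorial(l)
               for l in range(k))

k = 4
for s in range(k + 1):
    m = quad(lambda z: z**s * W(k, z), -np.inf, np.inf)[0]
    print(s, round(m, 6))   # expect 1 for s = 0, zeros for s = 1,...,k-1, and a nonzero k-th moment
```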

Example 1. Let's take the standard normal density as the kernel function. Then
$$K(x) = \phi(x) = \frac{1}{\sqrt{2\pi}}\exp(-x^2/2), \qquad \phi^{(l)}(x) = (-1)^l H_l(x)\phi(x),$$
where $H_l(x)$ is the Hermite polynomial of order $l$. Thus, by (2.7),
$$W_k(x) = \sum_{l=0}^{k-1}\binom{k}{l+1}\frac{(-x)^l H_l(x)\phi(x)}{l!} \qquad (2.9)$$
is a $k$th order kernel; since $\phi$ is symmetric, it is in fact a kernel of order $2r$ both when $k = 2r-1$ (by Remark 1) and when $k = 2r$. Note also that $W_{2r-1}(x)$ is a kernel of order $2r$ with
$$\int_{-\infty}^{\infty} x^{2r} W_{2r-1}(x)\,dx = (2r-1)\int_{-\infty}^{\infty} x^{2r}\phi(x)\,dx = (2r-1)\,(2r-1)!!\,.$$
These kernel functions are different from the kernel functions derived by Wand and Schucany (1990). The following table lists the first few of the kernel functions (2.9), computed by a computer program.

Table 1: Gaussian based kernels W_k, k = 2, ..., 7

k   W_k(x)
2   (-x^2 + 2)φ(x)
3   (x^4 - 7x^2 + 6)φ(x)/2
4   (-x^6 + 15x^4 - 48x^2 + 24)φ(x)/6
5   (x^8 - 26x^6 + 183x^4 - 360x^2 + 120)φ(x)/24
6   (-x^10 + 40x^8 - 495x^6 + 2190x^4 - 3000x^2 + 720)φ(x)/120
7   (x^12 - 57x^10 + 1095x^8 - 8625x^6 + 27090x^4 - 27720x^2 + 5040)φ(x)/720

It appears that the higher order kernels produced by (2.9) are quite complicated, which makes them less useful.
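The polynomial factors in Table 1 can be regenerated mechanically from (2.9). The short sketch below uses numpy's polynomial arithmetic; the range of $k$ is an illustrative choice, and the recursion for the Gaussian derivatives ($\phi^{(l)} = p_l\,\phi$ with $p_{l+1} = p_l' - x p_l$) is the only ingredient.

```python
import numpy as np
from numpy.polynomial import Polynomial
from math import comb, factorial

x = Polynomial([0, 1])
# phi^(l)(x) = p_l(x) * phi(x), with p_0 = 1 and p_{l+1} = p_l' - x * p_l
p_list = [Polynomial([1])]
for _ in range(6):
    p_list.append(p_list[-1].deriv() - x * p_list[-1])

# W_k(x) = sum_{l=0}^{k-1} C(k, l+1) x^l phi^(l)(x) / l!, i.e. the representation (2.9)
for k in range(2, 8):
    w = sum(comb(k, l + 1) * x**l * p_list[l] / factorial(l) for l in range(k))
    print(k, (factorial(k - 1) * w).coef)   # polynomial factor of W_k times (k-1)!, ascending powers
```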

However, a simpler method is possible. Observe that for $l \ge 1$,
$$\int_{-\infty}^{\infty} K^{(l)}(x)\,dx = 0.$$
By integration by parts, we obtain
$$\int_{-\infty}^{\infty} x^l K^{(k-1)}(x)\,dx = \begin{cases} 0, & \text{if } l = 0, \ldots, k-2,\\ (-1)^{k-1}(k-1)!, & \text{if } l = k-1.\end{cases} \qquad (2.10)$$
Thus, if $\int_{-\infty}^{\infty} |K^{(k-1)}(x)/x|\,dx < \infty$, then by (2.10),
$$K_{k-1}(x) = \frac{K^{(k-1)}(x)}{x\int_{-\infty}^{\infty} K^{(k-1)}(z)/z\,dz} \qquad (2.11)$$
is a kernel of order $k$.

Theorem 3. Let $K(x)$ be a kernel function satisfying
$$\int_{-\infty}^{\infty} K(x)\,dx = 1, \qquad \int_{-\infty}^{\infty} |K^{(k-1)}(x)/x|\,dx < \infty,$$
and
$$x^l K^{(l-1)}(x) \to 0 \quad\text{as } |x| \to \infty, \text{ for } l = 1, \ldots, k-1.$$
Then $K_{k-1}$ defined by (2.11) is a kernel function satisfying $\int_{-\infty}^{\infty} K_{k-1}(x)\,dx = 1$ and
$$\int_{-\infty}^{\infty} x^l K_{k-1}(x)\,dx = \begin{cases} 0, & \text{if } l = 1, \ldots, k-1,\\ (-1)^{k-1}(k-1)!\Big/\int_{-\infty}^{\infty} K^{(k-1)}(z)/z\,dz, & \text{if } l = k.\end{cases}$$

Remark 2. When $K(\cdot)$ is an even function, $K^{(2r-1)}(0) = 0$. Thus,
$$\lim_{x\to 0}\frac{K^{(2r-1)}(x)}{x} = K^{(2r)}(0)$$
if $K^{(2r)}(0)$ exists. In other words, the function $K^{(2r-1)}(x)/x$ is well defined at the point $x = 0$. Consequently, if $\int_{-\infty}^{\infty} |K^{(2r-1)}(x)|\,dx < \infty$, then $\int_{-\infty}^{\infty} |K^{(2r-1)}(x)|/|x|\,dx < \infty$ and $K_{2r-1}(x)$ is a kernel of order $2r$, provided the other conditions of Theorem 3 are satisfied.
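As a numerical illustration of Theorem 3, the sketch below builds the even-order kernel (2.11) from the standard normal density and checks its moments. It relies on the reconstructed form of (2.11) above and on the fact that $\phi^{(2r-1)}(x) = -\mathrm{He}_{2r-1}(x)\phi(x)$ with $\mathrm{He}_{2r-1}(x)/x$ a polynomial; the choice $r = 2$ and the function names are illustrative.

```python
import numpy as np
from numpy.polynomial import hermite_e as H, polynomial as P
from scipy.integrate import quad

phi = lambda x: np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

def gaussian_kernel_2r(r):
    """Kernel of order 2r built from phi via (2.11): phi^(2r-1)(x) / [x * int phi^(2r-1)(z)/z dz]."""
    c = H.herme2poly([0] * (2 * r - 1) + [1])      # power-series coefficients of He_{2r-1}
    g = lambda x: -P.polyval(x, c[1:]) * phi(x)    # phi^(2r-1)(x)/x, well defined at x = 0
    norm = quad(g, -np.inf, np.inf)[0]             # the denominator integral in (2.11)
    return lambda x: g(x) / norm

K4 = gaussian_kernel_2r(2)                         # a fourth order kernel
for s in range(5):
    m = quad(lambda x: x**s * K4(x), -np.inf, np.inf)[0]
    print(s, round(m, 6))                          # 1, 0, 0, 0, then a nonzero fourth moment
```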

Example 1 (continued). If $K(x) = \phi(x)$, then
$$\frac{\phi^{(2r-1)}(x)}{x\int_{-\infty}^{\infty}\phi^{(2r-1)}(z)/z\,dz} = \frac{(-1)^r\,\phi^{(2r-1)}(x)}{2^{r-1}(r-1)!\,x}$$
is a kernel function of order $2r$. The result is found in Wand and Schucany (1990). See also Wand and Schucany (1990) for the kernel functions $K_{2r-1}(x)$, $r = 1, \ldots, 5$.

Example 2. Let $K(x) = \frac{1}{\pi}\frac{1}{1+x^2}$ be the standard Cauchy density. Then
$$(1+x^2)K^{(2r-1)}(x) + 2(2r-1)x\,K^{(2r-2)}(x) + (2r-1)(2r-2)K^{(2r-3)}(x) = 0.$$
This recursive formula is used to compute the higher order kernels. The following table gives the higher order kernel functions resulting from (2.11); the renormalization constants and the values of $\int_{-\infty}^{\infty} K_{2r-1}^2(x)\,dx$ are computed by numerical integration.

Table 2: Cauchy density based kernels of order 2-8

order 2r   K_{2r-1}(x)
2          1/[π(1 + x^2)]
4          -4(x^2 - 1)/[π(1 + x^2)^4]
6          2(3x^4 - 10x^2 + 3)/[π(1 + x^2)^6]
8          -8(x^6 - 7x^4 + 7x^2 - 1)/[π(1 + x^2)^8]

Example 3. Let $K_n(x) = c_n(1-x^2)_+^n$ be a kernel function, where $c_n$ is a normalization constant. Then, by (2.11),
$$K_{n,2r-1}(x) = c_{n,r}^{-1}\sum_{j=r}^{n}(-1)^j\binom{n}{j}\frac{(2j)!}{(2j-2r+1)!}\,x^{2j-2r}\,1_{[|x|\le 1]}, \qquad\text{for } r = 1, \ldots, [n/2],$$
where
$$c_{n,r} = 2\sum_{j=r}^{n}(-1)^j\binom{n}{j}\frac{(2j)!}{(2j-2r+1)!\,(2j-2r+1)}.$$
The following table gives the result for $K_{8,2r-1}(x)$, $x \in [-1,1]$.

Table 3: Polynomial based kernels of order 4-8

order 2r   K_{8,2r-1}(x)
4          (9009/4096)(1 - x^2)^5 (1 - 5x^2)
6          (3465/4096)(1 - x^2)^3 (3 - 26x^2 + 39x^4)
8          (315/4096)(1 - x^2)(35 - 385x^2 + 1001x^4 - 715x^6)

If one is interested in finding a fourth order kernel, a simpler one would be
$$K_{4,3}(x) = \frac{15}{32}(1-x^2)(3-7x^2)\,1_{[|x|\le 1]}.$$
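The entries of Table 2, together with the $\int K_{2r-1}^2$ values computed there by numerical integration, can be reproduced with a few lines of code. The sketch below is illustrative: it builds the Cauchy derivatives from the Leibniz recursion obtained by differentiating $(1+x^2)K(x) = 1/\pi$ repeatedly (the paper displays only the odd-order case), applies the reconstructed construction (2.11), and integrates numerically; the function names and the orders checked are assumptions of the sketch.

```python
import numpy as np
from scipy.integrate import quad

# Derivatives of the Cauchy density K(x) = 1/(pi (1 + x^2)) via the recursion obtained by
# differentiating (1 + x^2) K(x) = 1/pi repeatedly (assumed here for every m; the paper
# states the odd-order case m = 2r - 1):
# (1 + x^2) K^(m)(x) + 2 m x K^(m-1)(x) + m (m - 1) K^(m-2)(x) = 0.
def cauchy_derivatives(m_max):
    d = [lambda x: 1.0 / (np.pi * (1 + x**2)),
         lambda x: -2 * x / (np.pi * (1 + x**2)**2)]
    for m in range(2, m_max + 1):
        d.append(lambda x, m=m, f1=d[m - 1], f2=d[m - 2]:
                 -(2 * m * x * f1(x) + m * (m - 1) * f2(x)) / (1 + x**2))
    return d

d = cauchy_derivatives(7)

def cauchy_kernel_2r(r):
    """Cauchy based kernel of order 2r from (2.11): K^(2r-1)(x) / [x * int K^(2r-1)(z)/z dz]."""
    g = lambda x: d[2 * r - 1](x) / x              # even function; removable singularity at 0
    norm = 2 * quad(g, 0, np.inf)[0]               # integrate on (0, inf) and double, away from x = 0
    return lambda x: g(x) / norm

for r in (2, 3, 4):
    K = cauchy_kernel_2r(r)
    sq = 2 * quad(lambda x: K(x)**2, 0, np.inf)[0]
    print(2 * r, round(sq, 4))                     # the int K^2 values for the higher order rows of Table 2
```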

3 Proofs

3.1 Proof of Lemma 2.1

Since differentiation is a linear operator, we need only show the result for the case $n = 1$. We use induction on $j$. Note that Lemma 2.1 holds for $j = 0$. Assume that Lemma 2.1 holds for $j = m$. Then, using $z K_l'(z) = l K_l(z) + K_{l+1}(z)$,
$$\hat\theta_{m+1}(x) = \frac{\partial}{\partial h_n}\hat\theta_m(x) = \frac{(-1)^{m+1} m!}{h_n^{m+2}}\sum_{l=0}^{m}\binom{m}{l}\frac{1}{l!}\left[K_{l+1}\Big(\frac{x-X_1}{h_n}\Big) + (l+m+1)\,K_l\Big(\frac{x-X_1}{h_n}\Big)\right].$$
Collecting the coefficient of each $K_l$, $l = 0, \ldots, m+1$, and using the identity
$$l\binom{m}{l-1} + (l+m+1)\binom{m}{l} = (m+1)\binom{m+1}{l},$$
the last display becomes
$$\hat\theta_{m+1}(x) = \frac{(-1)^{m+1}(m+1)!}{h_n^{m+2}}\sum_{l=0}^{m+1}\binom{m+1}{l}\frac{1}{l!}\,K_l\Big(\frac{x-X_1}{h_n}\Big).$$
Thus, Lemma 2.1 holds for $j = m+1$.

3.2 Proof of Theorem 1

Let's give two simple lemmas, which will be used in the proof of Theorem 1.

Lemma 3.1. $\displaystyle\sum_{i=\max(0,\,k-s)}^{\min(r,\,k)}\binom{r}{i}\binom{s}{k-i} = \binom{r+s}{k}$.

Proof. Think of a collection consisting of $r$ good and $s$ bad products. Choosing $k$ products from the collection is equivalent to selecting $i$ good products and $k-i$ bad products, for all possible $i$.

Lemma 3.2. Under the conditions of Theorem 1,
$$\int_{-\infty}^{\infty} x^s K_l(x)\,dx = (-1)^l\,\frac{(s+l)!}{s!}\,\tau_s,$$
where $\tau_s = \int_{-\infty}^{\infty} x^s K(x)\,dx$.

Proof. Integration by parts $l$ times yields the result.

Proof of Theorem 1. By Lemma 3.2 and the definition of $W_k$, we have, for $1 \le s \le k$,
$$\int_{-\infty}^{\infty} x^s W_k(x)\,dx = \tau_s\sum_{l=1}^{k}(-1)^{l-1}\binom{k}{l}\binom{s+l-1}{l-1}. \qquad (3.1)$$
By Lemma 3.1, the summation in (3.1) can be written as
$$\sum_{l=1}^{k}(-1)^{l-1}\binom{k}{l}\sum_{i=1}^{\min(l,s)}\binom{l}{i}\binom{s-1}{i-1}
= \sum_{i=1}^{s}\sum_{l=i}^{k}(-1)^{l-1}\binom{k}{i}\binom{k-i}{l-i}\binom{s-1}{i-1}
= \sum_{i=1}^{s}\left[\sum_{l=0}^{k-i}(-1)^{l+i-1}\binom{k-i}{l}\right]\binom{k}{i}\binom{s-1}{i-1}. \qquad (3.2)$$
Note that
$$\sum_{l=0}^{k-i}(-1)^{l}\binom{k-i}{l} = \begin{cases} 0, & \text{if } i < k,\\ 1, & \text{if } i = k.\end{cases}$$

Thus, by (3.1) and (3.2),
$$\int_{-\infty}^{\infty} x^s W_k(x)\,dx = \begin{cases} 0, & \text{if } s < k,\\ (-1)^{k+1}\,\tau_k, & \text{if } s = k.\end{cases}$$
Similarly, by (3.1) we have
$$\int_{-\infty}^{\infty} W_k(x)\,dx = \sum_{l=1}^{k}(-1)^{l-1}\binom{k}{l} = 1.$$
This completes the proof.

ACKNOWLEDGEMENTS

We would like to express our sincere thanks to Professor J. S. Marron for many helpful discussions.

References

[1] Berlinet, A. (1990). Reproducing kernels and finite order kernels. Manuscript.
[2] Chiu, S. T. (1990). Bandwidth selection for kernel density estimation. Ann. Statist., to appear.
[3] Eubank, R. L. (1988). Spline Smoothing and Nonparametric Regression. Dekker, New York.
[4] Fan, J. and Marron, J. S. (1990). Best possible constant for bandwidth selection. Institute of Statistics Mimeo Series #2041, University of North Carolina, Chapel Hill.
[5] Gasser, T., Müller, H.-G., and Mammitzsch, V. (1985). Kernels for nonparametric curve estimation. J. Roy. Statist. Soc. Ser. B, 47.
[6] Härdle, W. (1990). Applied Nonparametric Regression. Cambridge University Press, Boston.
[7] Hall, P., Sheather, S. J., Jones, M. C. and Marron, J. S. (1990). On optimal data-based bandwidth selection in kernel density estimation. Biometrika, to appear.
[8] Jones, M. C., Marron, J. S. and Park, B. U. (1990). A simple root n bandwidth selector. Ann. Statist., to appear.
[9] Marron, J. S. (1988). Automatic smoothing parameter selection: a survey. Empirical Economics, 13.

[10] Müller, H.-G. (1984). Smooth optimum kernel estimators of densities, regression curves and modes. Ann. Statist., 12.
[11] Müller, H.-G. (1988). Nonparametric Analysis of Longitudinal Data. Springer-Verlag, Berlin.
[12] Granovsky, B. and Müller, H.-G. (1990). Optimizing kernel methods for the nonparametric estimation of functions and characteristic points: a unifying variational principle. Manuscript.
[13] Rice, J. (1984). Bandwidth choice for nonparametric regression. Ann. Statist., 12.
[14] Rosenblatt, M. (1956). Remarks on some nonparametric estimates of a density function. Ann. Math. Statist., 27.
[15] Silverman, B. W. (1986). Density Estimation for Statistics and Data Analysis. Chapman and Hall, London.
[16] Wahba, G. (1990). Spline Models for Observational Data. SIAM, Philadelphia.
[17] Wand, M. P. and Schucany, W. R. (1990). Gaussian-based kernels. Canadian J. Statist., 18.
[18] Schucany, W. R. and Sommers, J. P. (1977). Improvement of kernel type density estimators. J. Amer. Statist. Assoc., 72.
