Bull. Korean Math. Soc. 36 (1999), No. 3, pp. 451-457

THE STRONG CONSISTENCY OF NONLINEAR REGRESSION QUANTILES ESTIMATORS

Seung Hoe Choi and Hae Kyung Kim


Abstract. This paper provides sufficient conditions which ensure the strong consistency of regression quantiles estimators of nonlinear regression models. The main result is supported by the application of an asymptotic property of the least absolute deviation estimators as a special case of the proposed estimators. An example is given to illustrate the application of the main result.

Received June 5, 1998. 1991 Mathematics Subject Classification: 62J02. Key words and phrases: nonlinear regression model, strong consistency, regression quantiles estimator.

1. Introduction

In this paper we consider the following nonlinear regression model

(1.1)   $y_t = f(x_t, \theta_o) + \varepsilon_t, \quad t = 1, 2, \ldots, n,$

where $f$ is a known response function, $x_t$, which belongs to a bounded subspace $\mathcal{X}$ of $R^q$, is an observed input vector, and the error terms $\varepsilon_t$ are independent and identically distributed (i.i.d.) random variables with finite variance. The parameter vector $\theta_o$, which is an interior point of $\Theta$, is unknown and to be estimated.

The most common technique used to estimate the true parameters in model (1.1) is the method of least squares developed by Jennrich (1969) and Wu (1981). However, when the error terms contain some outliers or depart from the normal distribution, the least squares method yields poor estimators because of the extreme sensitivity of the least squares estimators to outliers. To overcome this defect, the search for robust procedures alternative to the least squares method has generated considerable interest in statistical inference.
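To make the sensitivity of least squares concrete, the following Python sketch (added for illustration, not part of the paper) simulates model (1.1) with a hypothetical response function $f(x, \theta) = \theta_1 e^{\theta_2 x}$ and an error distribution contaminated by a few gross outliers; the parameter values and the contamination rate are illustrative assumptions.

import numpy as np
from scipy.optimize import least_squares

# Model (1.1) with a hypothetical response f(x, theta) = theta1 * exp(theta2 * x).
rng = np.random.default_rng(1)
n, theta_o = 200, np.array([2.0, 0.5])
x = rng.uniform(0.0, 2.0, size=n)                # bounded input values
eps = rng.normal(0.0, 0.3, size=n)
eps[rng.random(n) < 0.05] += 20.0                # contaminate a few errors with gross outliers
y = theta_o[0] * np.exp(theta_o[1] * x) + eps

resid = lambda th: y - th[0] * np.exp(th[1] * x)
ls = least_squares(resid, x0=[1.0, 0.1])         # ordinary nonlinear least squares
print("least squares estimate:", ls.x)           # typically pulled well away from theta_o

Such distortions are what motivate the robust alternatives discussed below.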

The Least Absolute Deviation (LAD) estimators, based on the sample median, are defined as any vector minimizing the sum of absolute deviations

$D_n(\theta) = \sum_{t=1}^{n} |y_t - f_t(\theta)|, \qquad \text{where } f_t(\theta) = f(x_t, \theta).$

Oberhofer (1982) gave sufficient conditions for the weak consistency of the LAD estimators in nonlinear regression models. Wang (1995) derived the asymptotic normality of the nonlinear LAD estimators, and Kim and Choi (1995) investigated the asymptotic properties of the nonlinear LAD estimators and explained that the relative efficiency of the LAD estimators to the least squares estimators is the same as the relative efficiency of the sample median to the sample mean.

Meanwhile, in case the distribution function of the errors is positively skewed (or negatively skewed), quantiles other than the median (50th quantile) may reveal information about the unknown parameter $\theta_o$ in model (1.1). Regression quantiles, which provide a natural generalization of the notion of the sample quantile to the general regression model, were proposed by Koenker and Bassett (1982). Quantile-based estimators have long been known in the statistical literature as `L-estimators' for their relative efficiency for heavy-tailed error distributions. The $\tau$-th regression quantiles estimator ($0 < \tau < 1$) of the true parameter $\theta_o$ based on $(y_t, x_t)$, denoted by $\hat{\theta}_n(\tau)$, is a parameter which minimizes the objective function

(1.2)   $S_n(\tau, \theta) = \sum_{t=1}^{n} \rho_\tau\bigl(y_t - f_t(\theta)\bigr),$

where the ``check'' function is

$\rho_\tau(x) = \begin{cases} \tau x & \text{if } x \ge 0, \\ (\tau - 1)x & \text{if } x < 0. \end{cases}$

Since the check function $\rho_\tau(x)$ rotates the absolute value function $|x|/2$ by some angle in the clockwise direction ($\tau \ne 1/2$), the least absolute deviation estimator ($\tau = 1/2$) is an obviously important special case of the regression quantiles estimators.
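As a concrete illustration of (1.2), the following Python sketch (added for illustration, not part of the paper) computes $\hat{\theta}_n(\tau)$ for the same hypothetical response function $f(x, \theta) = \theta_1 e^{\theta_2 x}$ by minimizing $S_n(\tau, \theta)$ directly; the derivative-free Nelder-Mead method is an illustrative choice, since the check-function objective is not differentiable everywhere.

import numpy as np
from scipy.optimize import minimize

def check(u, tau):
    # check function rho_tau(u) = u * (tau - I(u < 0))
    return u * (tau - (u < 0))

def S_n(theta, tau, x, y):
    # objective (1.2) for the hypothetical response f(x, theta) = theta1 * exp(theta2 * x)
    resid = y - theta[0] * np.exp(theta[1] * x)
    return np.sum(check(resid, tau))

rng = np.random.default_rng(0)
n, theta_o = 500, np.array([2.0, 0.5])
x = rng.uniform(0.0, 2.0, size=n)
eps = rng.standard_t(df=3, size=n)               # heavy-tailed, symmetric i.i.d. errors
y = theta_o[0] * np.exp(theta_o[1] * x) + eps

tau = 0.5                                        # tau = 1/2 gives the LAD estimator;
                                                 # symmetric errors then satisfy G(0) = tau
fit = minimize(S_n, x0=np.array([1.0, 0.1]), args=(tau, x, y), method="Nelder-Mead")
print("regression quantile estimate:", fit.x)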

In some recent papers, analysis of linear models using quantile estimation has been published by many authors: Bassett and Koenker (1982, 1986) and Portnoy (1991). Bassett and Koenker (1986) established the strong consistency of regression quantile statistics in linear models with i.i.d. errors. Portnoy (1991) discussed the asymptotic behavior of regression quantiles under more general heteroscedasticity and dependence assumptions in linear models.

The main object of this paper is to provide simple and sufficient conditions for the strong consistency of the regression quantiles estimators $\hat{\theta}_n(\tau)$ in the nonlinear regression model (1.1).

2. Strong Consistency

We start this section by introducing some conditions which ensure the strong consistency of the regression quantile estimator in the nonlinear regression model (1.1). Let $P$ be a probability measure on $R^q$ and let $H$ denote the distribution function of the input vector $x_t$. Let $\nabla f_t(\theta) = [\partial f(x_t, \theta)/\partial \theta_i]_{(p \times 1)}$.

Assumption A. The parameter space $\Theta$ is a compact subspace of $R^p$.

Assumption B.
B$_1$: The response function $f(x_t, \theta)$ and the partial derivatives $\nabla f_t(\theta)$ are continuous on $\Theta$.
B$_2$: The distribution function $G(x)$ of the errors is continuously differentiable with density $g(x)$ which is strictly positive at $G^{-1}(\tau) = 0$.
B$_3$: $V_n(\theta_o) = \frac{1}{n}\sum_{t=1}^{n} \nabla f_t(\theta_o) \nabla^T f_t(\theta_o)$ converges to a positive definite matrix $V(\theta_o)$ as $n \to \infty$.
B$_4$: $P\{x \in \mathcal{X} : f(x, \theta_o) \ne f(x, \theta)\} > 0$ for each $\theta \ne \theta_o$.
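A worked instance of Assumption B$_2$ (added for illustration, not part of the paper): if the raw errors $\eta_t$ have a continuous distribution function $F_\eta$ with an everywhere positive density $f_\eta$, center them at their $\tau$-th quantile and set

$\varepsilon_t = \eta_t - F_\eta^{-1}(\tau), \qquad G(x) = P\{\varepsilon_t \le x\} = F_\eta\bigl(x + F_\eta^{-1}(\tau)\bigr).$

Then $G(0) = \tau$, that is $G^{-1}(\tau) = 0$, and $g(0) = f_\eta\bigl(F_\eta^{-1}(\tau)\bigr) > 0$, so B$_2$ holds; for symmetric errors and $\tau = 1/2$ no centering is needed.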

Modifying (1.2), we have another objective function for the nonlinear regression quantiles estimators,

(2.1)   $Q_n(\tau, \theta) = S_n(\tau, \theta) - S_n(\tau, \theta_o).$

Since $S_n(\tau, \theta_o)$ is independent of $\theta$, the regression quantiles estimator $\hat{\theta}_n(\tau)$ defined in (1.2) is equivalent to the minimizer of (2.1). Before we proceed to consider the main result, we present the following lemma needed in the proof of the main theorem.

Lemma 2.1. Suppose that model (1.1) satisfies Assumptions A and B. Then for any $\theta \in \Theta$,

$\frac{1}{n}\bigl[Q_n(\tau, \theta) - E\{Q_n(\tau, \theta)\}\bigr] = o_p(1),$

where $o_p(1)$ denotes convergence in probability to zero.

Proof. Define the random variable $\delta_t$ as follows:

$\delta_t = \begin{cases} 1 & \text{if } y_t \le f_t(\theta), \\ 0 & \text{otherwise}. \end{cases}$

Then we can rewrite

$Q_n(\tau, \theta) = \sum_{t=1}^{n} \bigl[\rho_\tau(r_t) - \rho_\tau(\varepsilon_t)\bigr] = \sum_{t=1}^{n} \bigl[(\tau - 1)\bigl(r_t I_{\{r_t \le 0\}} - \varepsilon_t I_{\{\varepsilon_t \le 0\}}\bigr) + \tau\bigl(r_t I_{\{r_t > 0\}} - \varepsilon_t I_{\{\varepsilon_t > 0\}}\bigr)\bigr] = \sum_{t=1}^{n} \bigl[(\tau - \delta_t) r_t + \bigl(\delta_t(\theta_o) - \tau\bigr)\varepsilon_t\bigr],$

where $r_t = y_t - f_t(\theta)$, $\delta_t = \delta_t(\theta)$ and $\delta_t(\theta_o) = I_{\{\varepsilon_t \le 0\}}$. Let $X_t = (\tau - \delta_t) r_t + (\delta_t(\theta_o) - \tau)\varepsilon_t$. According to Hölder's inequality, we get

$|X_t| \le (\tau + 2)|\varepsilon_t| + (\tau + 1)\|\nabla f_t(\xi)\| \, \|\theta - \theta_o\|,$

where $\|\cdot\|$ denotes the Euclidean norm and $\xi = \alpha\theta_o + (1 - \alpha)\theta$ for some $0 \le \alpha \le 1$. On the other hand, Chebyshev's inequality gives

$P\bigl\{|Q_n(\tau, \theta) - E\{Q_n(\tau, \theta)\}| > n\epsilon\bigr\} \le \frac{\max_t \operatorname{Var} X_t}{n\epsilon^2}.$

Since Assumptions A and B keep $\max_t \operatorname{Var} X_t$ bounded, the right-hand side tends to zero, and the proof follows.
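The rewriting of $Q_n(\tau, \theta)$ rests on the identity $\rho_\tau(r_t) - \rho_\tau(\varepsilon_t) = (\tau - \delta_t) r_t + (\delta_t(\theta_o) - \tau)\varepsilon_t$, which can be checked numerically; the following Python sketch (added for illustration, not part of the paper) verifies it on random draws.

import numpy as np

def rho(u, tau):
    # check function from (1.2)
    return u * (tau - (u < 0))

rng = np.random.default_rng(2)
tau = 0.3
eps = rng.normal(size=1000)            # errors epsilon_t
d = rng.normal(size=1000)              # d_t = f_t(theta) - f_t(theta_o)
r = eps - d                            # residuals r_t = y_t - f_t(theta)

delta = (r <= 0).astype(float)         # delta_t = I{y_t <= f_t(theta)}
delta_o = (eps <= 0).astype(float)     # delta_t(theta_o) = I{epsilon_t <= 0}

lhs = rho(r, tau) - rho(eps, tau)
rhs = (tau - delta) * r + (delta_o - tau) * eps
print(np.max(np.abs(lhs - rhs)))       # approximately 0, up to floating point error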

The following theorem is the main result of this section, which provides sufficient conditions for the strong consistency of regression quantiles estimators.

Theorem 2.1. For the model (1.1), suppose that Assumptions A and B are fulfilled. Then the regression quantiles estimator $\hat{\theta}_n(\tau)$ defined in (1.2) is strongly consistent for $\theta_o$.

Proof. For any $\epsilon > 0$, it is sufficient to show that

(2.2)   $\liminf_{n \to \infty} \inf_{\|\theta - \theta_o\| \ge \epsilon} \frac{1}{n} Q_n(\tau, \theta) > 0 \quad a.e.$

From Lemma 2.1 we have

$\frac{1}{n} Q_n(\tau, \theta) = \frac{1}{n}\sum_{t=1}^{n} E X_t + o_p(1),$

where $E$ denotes the expectation taken with respect to the error term $\varepsilon_t$. Note that

$E X_t = \int \bigl[(\tau - I_{\{s \le d_t\}})(s - d_t) + (I_{\{s \le 0\}} - \tau)s\bigr]\, dG(s) = -\int_0^{d_t} s\, dG(s) - \tau d_t + d_t G(d_t) = \int_0^{d_t} \bigl(G(s) - \tau\bigr)\, ds,$

where $d_t = f_t(\theta) - f_t(\theta_o)$. First we prove that $\theta_o$ is a local minimizer of $Q(\tau, \theta) = \lim_{n \to \infty} \frac{1}{n}\sum_{t=1}^{n} E X_t$. By simple calculation, we get

$Q(\tau, \theta) = \lim_{n \to \infty} \frac{1}{n}\sum_{t=1}^{n} \Bigl[-\int_0^{d_t} s\, dG(s) + d_t\bigl(G(d_t) - \tau\bigr)\Bigr],$

$\nabla Q(\tau, \theta) = \lim_{n \to \infty} \frac{1}{n}\sum_{t=1}^{n} \bigl[\nabla f_t(\theta)\bigl(G(d_t) - \tau\bigr)\bigr],$

$\nabla^2 Q(\tau, \theta) = \lim_{n \to \infty} \frac{1}{n}\sum_{t=1}^{n} \bigl[\nabla^2 f_t(\theta)\bigl(G(d_t) - \tau\bigr) + g(d_t)\nabla f_t(\theta)\nabla^T f_t(\theta)\bigr].$

Furthermore, since $d_t = 0$ at $\theta = \theta_o$ and $G(0) = \tau$ by Assumption B$_2$, we have $\nabla Q(\tau, \theta_o) = 0$ and $\nabla^2 Q(\tau, \theta_o) = g(0) V(\theta_o)$, which is a positive definite matrix by Assumptions B$_2$ and B$_3$. Hence $Q(\tau, \theta)$ attains a local minimum at $\theta_o$.

Next we show that this local minimizer $\theta_o$ is indeed the global minimizer. Let $N_\epsilon(\theta_o) = \{\theta : \|\theta - \theta_o\| < \epsilon\}$. Since $R = N_\epsilon^c(\theta_o) \cap \Theta$ is compact, there exists $\theta_1 \in R$ such that

$Q(\tau, \theta_1) = \inf_{\theta \in R} Q(\tau, \theta).$

We consider

$Q(\tau, \theta_1) = \lim_{n \to \infty} \frac{1}{n}\sum_{t=1}^{n} \Bigl[-\int_0^{d_t} s\, dG(s) + d_t\bigl(G(d_t) - \tau\bigr)\Bigr] = \lim_{n \to \infty} \frac{1}{n}\sum_{t=1}^{n} \int_0^{d_t} \bigl(G(s) - \tau\bigr)\, ds.$

If $d_t < 0$, then $\tau - G(s)$ is positive on $(d_t, 0)$. Thus there exist $\epsilon_1$ and $\epsilon_2$ such that $d_t < \epsilon_1 < \epsilon_2 < 0$. From Assumption B$_2$, since $g$ is strictly positive on $[\epsilon_1, \epsilon_2]$, there exists a $\delta > 0$ such that $g > \delta$ on $[\epsilon_1, \epsilon_2]$. Thus we obtain

(2.3)   $\lim_{n \to \infty} \frac{1}{n}\sum_{t=1}^{n} \int_{d_t}^{0} \bigl(\tau - G(s)\bigr)\, ds \ \ge\ \int_{\epsilon_1}^{\epsilon_2} \bigl(\tau - G(s)\bigr)\, ds \ \ge\ \delta \int_{\epsilon_1}^{\epsilon_2} (\epsilon_2 - s)\, ds \ >\ 0.$

Likewise, if $d_t > 0$, we have a similar result. Thus we have $Q(\tau, \theta_1) > Q(\tau, \theta_o) = 0$, because the last term in (2.3) is positive. Finally, from Assumption B$_4$ and the above fact we obtain

(2.4)   $\inf_{\|\theta - \theta_o\| \ge \epsilon} E_x\{Q_n(\tau, \theta)\} \ \ge\ \inf_{\|\theta - \theta_o\| \ge \epsilon} \int_{\omega}\!\int X \, dG \, dH(x),$

where $\omega = \{x \in \mathcal{X} : f(x, \theta) \ne f(x, \theta_o)\}$. In virtue of (2.3) and (2.4), we get, for sufficiently large $n$,

$\inf_{\|\theta - \theta_o\| \ge \epsilon} \frac{1}{n} E_x\{Q_n(\tau, \theta)\} \ \ge\ \delta_2,$

where $\delta_2$ is a positive real number. The proof is completed.

To see that the assumptions of the main theorem are sufficient to cover a class of nonlinear regression functions, we now consider the following example.

Example. Let $p$ be a differentiable function from $R^m$ to $R^+$. Consider the model $y_t = f(x_t, \theta_o) + \varepsilon_t$, where $\theta_o \in \Theta = [0, a_1] \times [0, a_2]$, $a_1, a_2 < \infty$, and $f(x, \theta) = \theta_1 p(x)^{\theta_2}$. Many authors consider the nonlinear model with $p(x) = e^x$ and $p(x) = x$ for $m = 1$. Assume that $\{\varepsilon_t\}$ are i.i.d. random variables with continuous p.d.f. $g(x)$ and distribution function $G(x)$ for which $G(0) = \tau$. Then $V_n = \frac{1}{n}\sum_{t=1}^{n} \nabla f_t \nabla^T f_t$ converges to $V$, where

$V = \begin{bmatrix} \int p(x)^{2\theta_2}\, dH(x) & \int \theta_1 p(x)^{2\theta_2} \ln p(x)\, dH(x) \\ \int \theta_1 p(x)^{2\theta_2} \ln p(x)\, dH(x) & \int \bigl(\theta_1 p(x)^{\theta_2} \ln p(x)\bigr)^2\, dH(x) \end{bmatrix}.$

For a non-zero vector $\lambda = (\lambda_1, \lambda_2)$,

$\lambda V \lambda^T = \int \bigl(\lambda_1 + \lambda_2 \theta_1 \ln p(x)\bigr)^2 p(x)^{2\theta_2}\, dH(x) > 0.$

It is not hard to verify that Assumptions A and B are satisfied. Thus we can guarantee the strong consistency of the regression quantile estimators.
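A small simulation in the spirit of this example (added for illustration; the choices $p(x) = e^x$, $\theta_o = (2, 0.5)$, $\tau = 0.25$ and the centering of the errors are assumptions, not from the paper) fits $f(x, \theta) = \theta_1 p(x)^{\theta_2}$ by minimizing $S_n(\tau, \theta)$ at increasing sample sizes; the estimates should drift toward $\theta_o$, in line with Theorem 2.1.

import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def check(u, tau):
    # check function rho_tau
    return u * (tau - (u < 0))

def f(x, theta):
    # example response f(x, theta) = theta1 * p(x)**theta2 with p(x) = exp(x)
    return theta[0] * np.exp(x) ** theta[1]

tau, theta_o = 0.25, np.array([2.0, 0.5])
rng = np.random.default_rng(3)

for n in (100, 1000, 10000):
    x = rng.uniform(0.0, 2.0, size=n)
    eps = rng.normal(size=n) - norm.ppf(tau)     # center errors so that G(0) = tau (Assumption B_2)
    y = f(x, theta_o) + eps
    obj = lambda th: np.sum(check(y - f(x, th), tau))
    est = minimize(obj, x0=np.array([1.0, 0.1]), method="Nelder-Mead").x
    print(n, est)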

Acknowledgements. We thank the referee for suggestions and remarks improving this paper.

References

[1] Bassett, G. and Koenker, R., An empirical quantile function for linear models with i.i.d. errors, Journal of the American Statistical Association 77 (1982), 407-415.
[2] ______, Strong consistency of regression quantiles and related empirical processes, Econometric Theory 2 (1986), 191-201.
[3] Jennrich, R. I., Asymptotic properties of nonlinear least squares estimators, The Annals of Mathematical Statistics 40 (1969), 633-643.
[4] Kim, H. K. and Choi, S. H., Asymptotic properties of nonlinear Least Absolute Deviation estimators, Journal of the Korean Statistical Society 24 (1995), 27-39.
[5] Oberhofer, W., The consistency of nonlinear regression minimizing the $L_1$-norm, The Annals of Statistics 10 (1982), 316-319.
[6] Portnoy, S., Asymptotic behavior of regression quantiles in non-stationary, dependent cases, Journal of Multivariate Analysis 38 (1991), 100-113.
[7] Wang, J., Asymptotic normality of $L_1$-estimators in nonlinear regression, Journal of Multivariate Analysis 54 (1995), 227-238.
[8] Wu, C. F., Asymptotic theory of nonlinear least squares estimation, The Annals of Statistics 9 (1981), 501-513.

Seung Hoe Choi, Department of General Studies, Hankuk Aviation University, Koyang, Korea
E-mail: shchoi@mail.hangkong.ac.kr

Hae Kyung Kim, Department of Mathematics, Yonsei University, Seoul, Korea
E-mail: abc27@yonsei.ac.kr