On Liu Estimators for the Logit Regression Model


CESIS Electronic Working Paper Series
Paper No. 259

On Liu Estimators for the Logit Regression Model

Kristofer Månsson and B. M. Golam Kibria

October 2011

The Royal Institute of Technology
Centre of Excellence for Science and Innovation Studies (CESIS)
http://www.cesis.se

On Liu Estimators for the Logit Regression Model

By Kristofer Månsson (1), B. M. Golam Kibria (2) and Ghazi Shukur (1,3)

(1) Department of Economics, Finance and Statistics, Jönköping University, P.O. Box 1026, SE-551 11 Jönköping, Sweden
(2) Department of Mathematics and Statistics, Florida International University, Miami, Florida, USA
(3) Department of Economics and Statistics, Linnaeus University, Växjö, Sweden

Abstract

In innovation analysis the logit model is commonly applied to the available data when the dependent variables are dichotomous. Since most economic variables are correlated with each other, practitioners often meet the problem of multicollinearity. This paper introduces a shrinkage estimator for the logit model which is a generalization of the estimator proposed by Liu (1993) for the linear regression model. This new estimation method is suggested since the mean square error (MSE) of the commonly used maximum likelihood (ML) method becomes inflated when the explanatory variables of the regression model are highly correlated. Using the MSE, the optimal value of the shrinkage parameter is derived and some methods of estimating it are proposed. It is shown by means of Monte Carlo simulations that the estimated MSE and mean absolute error (MAE) are lower for the proposed Liu estimator than for ML in the presence of multicollinearity. Finally, the benefit of the Liu estimator is shown in an empirical application where different economic factors are used to explain the probability that municipalities have a net increase of inhabitants.

Key words: Estimation; MAE; MSE; Multicollinearity; Logit; Liu; Innovation analysis.
JEL Classification: C18; C35; C39

1. Introduction

Consider the situation where the dependent variable y_i is Bernoulli distributed, Be(π_i), with

$$\pi_i = \frac{\exp(x_i'\beta)}{1+\exp(x_i'\beta)},$$

where x_i is the ith row of X, which is an n × (p+1) data matrix with p explanatory variables, and β is a (p+1) × 1 vector of coefficients. In this situation the parameters of the model are commonly estimated by the maximum likelihood (ML) method, applying the following iteratively weighted least squares (IWLS) algorithm:

$$\hat{\beta}_{ML} = \left(X'\hat{W}X\right)^{-1} X'\hat{W}\hat{z}, \qquad (1.1)$$

where ẑ is a vector whose ith element equals ẑ_i = log(π̂_i/(1 − π̂_i)) + (y_i − π̂_i)/(π̂_i(1 − π̂_i)) and Ŵ is a diagonal matrix whose ith diagonal element equals π̂_i(1 − π̂_i). This estimator approximately minimizes the weighted sum of squared errors (WSSE). However, several sources of instability for the ML estimator exist. One may have the problem of separation, where a linear combination of the regressors is perfectly predictive of the dependent variable. This problem, discussed by Albert and Anderson (1984), leads to non-existence of the ML estimator. The authors also showed that in the case of almost perfect separation the ML estimates are unstable. Another source of instability, which is the focus of this paper, arises when the regressors are collinear. In that situation the weighted matrix of cross-products, X'ŴX, is ill-conditioned, which leads to instability and high variance of the ML estimator.
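To make the IWLS algorithm in (1.1) concrete, here is a minimal sketch in Python/NumPy; the function name logit_ml_iwls and all variable names are illustrative rather than taken from the paper, and any standard routine (such as R's glm(), which the authors use in Section 4) would return the same ML estimates.

```python
import numpy as np

def logit_ml_iwls(X, y, tol=1e-8, max_iter=100):
    """ML estimation of the logit model via the IWLS algorithm in (1.1)."""
    n, p = X.shape
    beta = np.zeros(p)                          # starting values
    for _ in range(max_iter):
        eta = X @ beta                          # linear predictor x_i' beta
        pi = 1.0 / (1.0 + np.exp(-eta))         # pi_i = exp(eta_i) / (1 + exp(eta_i))
        W = pi * (1.0 - pi)                     # diagonal elements of W-hat
        z = eta + (y - pi) / W                  # working response z-hat
        XtWX = X.T @ (W[:, None] * X)           # X' W X
        beta_new = np.linalg.solve(XtWX, X.T @ (W * z))
        converged = np.max(np.abs(beta_new - beta)) < tol
        beta = beta_new
        if converged:
            break
    return beta, XtWX
```

The returned X'ŴX matrix is reused in the later sketches, since both the Liu estimator and its shrinkage parameters are simple functions of it and of β̂_ML.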

Shrinkage estimation is a commonly applied solution to the general problem caused by multicollinearity. For the linear model a lot of research has been conducted, and Hoerl and Kennard (1970) suggested the well-known ridge regression estimator. This estimator was extended to the logit model by Schaefer et al. (1984), and further developments were made by Månsson and Shukur (2011), where some different new ridge parameters for logit ridge regression were suggested. However, the disadvantage of this method is that the estimated parameters are complicated non-linear functions of the ridge parameter k, which can take on values between zero and infinity. Therefore, Liu (1993) suggested another estimator whose parameters have the benefit of being a linear function of the shrinkage parameter. Due to this advantage over ridge regression, the Liu estimator has been used by various researchers; among them Akdeniz and Kaçiranlar (1995), Kaçiranlar (2003), Alheety and Kibria (2009) and, very recently, Kibria (2011) are notable. This estimator can also be extended to logit models. Now, noting that the IWLS algorithm in equation (1.1) approximately minimizes the weighted sum of squared errors (WSSE), one can apply the following estimator:

$$\hat{\beta}_{d} = \left(X'\hat{W}X + I\right)^{-1}\left(X'\hat{W}X + dI\right)\hat{\beta}_{ML}, \qquad (1.2)$$

where β̂_ML = (X'ŴX)^{-1}X'Ŵẑ is given in (1.1) and d is the shrinkage parameter. The purpose of this paper is to apply the Liu estimator in order to solve the problems caused by multicollinearity. The Liu estimator is assumed to perform better than ML when the regressors are highly inter-correlated, since β̂_ML is, on average, too long in that situation and β̂_d shrinks the length of the vector. This paper will also suggest some methods of estimating the shrinkage parameter. The performance of the ML and Liu estimators will be studied using Monte Carlo simulations where factors such as the number of regressors, the sample size and the degree of correlation are varied. In order to judge the performance of the estimators, the mean square error (MSE) and mean absolute error (MAE) are used. The results show that the Liu estimator always outperforms ML in the presence of multicollinearity. The benefits of the Liu estimator will also be shown in an empirical application where different economic factors are used to explain the probability that municipalities have a net increase of inhabitants.

This paper is organized as follows: In Section 2, the statistical methodology is described. In Section 3, the design of the experiment and a discussion of the results are provided. Then in Section 4 an empirical application is presented. Finally, in Section 5, some concluding remarks are given.

2. Statistical methodology

2.1 The statistical properties of the ML and Liu estimators

The Liu estimator for the logit model is a biased shrinkage estimator and a direct generalization of the one proposed for the linear regression model by Liu (1993).
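A sketch of how (1.2) can be computed from the ML fit is given below, reusing the illustrative logit_ml_iwls() defined earlier; here d is supplied by the user, and methods for estimating it are proposed in Section 2.2.

```python
import numpy as np

def liu_logit(X, y, d):
    """Liu estimator (1.2): a linear function of the shrinkage parameter d."""
    beta_ml, XtWX = logit_ml_iwls(X, y)         # ML fit and X'WX at convergence
    I = np.eye(XtWX.shape[0])
    # beta_d = (X'WX + I)^{-1} (X'WX + d I) beta_ML
    return np.linalg.solve(XtWX + I, (XtWX + d * I) @ beta_ml)
```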

The shrinkage parameter d may take on values between zero and one; when d equals one, β̂_d = β̂_ML, and when d is less than one we have ||β̂_d|| < ||β̂_ML||. Since β̂_ML is, on average, too long in the presence of multicollinearity, β̂_d is assumed to perform better than β̂_ML in such situations. This may also be shown by studying the MSE properties of the two estimators. The MSE of the ML estimator equals

$$\mathrm{MSE}\left(\hat{\beta}_{ML}\right) = E\left(L_{ML}^{2}\right) = E\left[\left(\hat{\beta}_{ML}-\beta\right)'\left(\hat{\beta}_{ML}-\beta\right)\right] = \mathrm{tr}\left[\left(X'\hat{W}X\right)^{-1}\right] = \sum_{j=1}^{J}\frac{1}{\lambda_{j}}, \qquad (2.1)$$

where λ_j is the jth eigenvalue of the X'ŴX matrix. Looking at this MSE, it can easily be seen that it becomes inflated in the presence of multicollinearity, since some eigenvalues will be small when X'ŴX is ill-conditioned. On the other hand, the MSE of the Liu estimator is

$$\mathrm{MSE}\left(\hat{\beta}_{d}\right) = E\left(L_{d}^{2}\right) = E\left[\left(\hat{\beta}_{d}-\beta\right)'\left(\hat{\beta}_{d}-\beta\right)\right] = \sum_{j=1}^{J}\frac{(\lambda_{j}+d)^{2}}{\lambda_{j}(\lambda_{j}+1)^{2}} + (d-1)^{2}\sum_{j=1}^{J}\frac{\alpha_{j}^{2}}{(\lambda_{j}+1)^{2}}, \qquad (2.2)$$

where α_j is defined as the jth element of γ'β and γ is the eigenvector matrix defined such that γ'(X'ŴX)γ = Λ, where Λ equals diag(λ_j). For the Liu estimator one wants to find a value of d such that the decrease of the variance (the first term) is greater than the increase caused by adding the squared bias (the second term), so that MSE(β̂_d) < MSE(β̂_ML) for some d < 1. In order to show that such a value of d less than one exists, we start by taking the first derivative of equation (2.2) with respect to d:

$$g'(d) = 2\sum_{j=1}^{J}\frac{\lambda_{j}+d}{\lambda_{j}(\lambda_{j}+1)^{2}} + 2(d-1)\sum_{j=1}^{J}\frac{\alpha_{j}^{2}}{(\lambda_{j}+1)^{2}}, \qquad (2.3)$$

and by inserting the value one in equation (2.3) we get

$$g'(1) = 2\sum_{j=1}^{J}\frac{1}{\lambda_{j}(\lambda_{j}+1)}, \qquad (2.4)$$

which is greater than zero since λ_j > 0. Hence, there exists a value of d between zero and one such that MSE(β̂_d) < MSE(β̂_ML).
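As a small numerical illustration of (2.1) and (2.2), the snippet below uses made-up eigenvalues and α values (not taken from the paper) to evaluate both MSE functions: at d = 1 the Liu MSE reduces to the ML value, and the near-zero eigenvalue makes a value of d below one pay off.

```python
import numpy as np

lam = np.array([4.0, 1.5, 0.05])       # hypothetical eigenvalues of X'WX (one near zero)
alpha = np.array([0.8, 1.2, 0.5])      # hypothetical elements of gamma' beta

mse_ml = np.sum(1.0 / lam)             # equation (2.1)

def mse_liu(d):
    """Equation (2.2): variance term plus squared-bias term."""
    var = np.sum((lam + d) ** 2 / (lam * (lam + 1) ** 2))
    bias2 = (d - 1) ** 2 * np.sum(alpha ** 2 / (lam + 1) ** 2)
    return var + bias2

print(mse_ml, mse_liu(1.0), mse_liu(0.5))   # mse_liu(1.0) equals mse_ml
```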

Furthermore, the optimal value of d for any individual parameter can be found by setting equation (2.3) to zero and solving for d. It may then be shown that

$$d_{j} = \frac{\alpha_{j}^{2}-1}{1/\lambda_{j}+\alpha_{j}^{2}} \qquad (2.5)$$

corresponds to the optimal value of the shrinkage parameter for the jth parameter. Hence, the optimal value of d_j is negative when α_j² is less than one and positive when it is greater than one. However, just as in Liu (1993), the shrinkage parameter will be limited to values between zero and one.

2.2 Estimating the shrinkage parameter

In order to estimate the optimal value of d in equation (2.5), several methods will be proposed. The ideas behind these proposed estimators are obtained from the work of Hoerl and Kennard (1970), Kibria (2003) and Khalaf and Shukur (2005), where several different methods of estimating the shrinkage parameter for linear ridge regression have been proposed. As in those papers, the shrinkage parameter d will be estimated by a single value. The first estimator, which is based on the work by Hoerl and Kennard (1970), is the following:

$$D1 = \max\left(0,\ \frac{\hat{\alpha}_{\max}^{2}-1}{1/\lambda_{\max}+\hat{\alpha}_{\max}^{2}}\right),$$

where we define α̂_max and λ_max to be the maximum element of α̂ = γ'β̂_ML and the maximum eigenvalue of X'ŴX, respectively. Replacing the values of the unknown parameters with the maximum value of the unbiased estimators is an idea taken from Hoerl and Kennard (1970). However, for the Liu estimator another maximum operator is also used, which ensures that the estimated value of the shrinkage parameter is not negative. Furthermore, the following estimators, which are based on the ideas in Kibria (2003), are proposed:

$$D2 = \max\left(0,\ \operatorname{median}_{j}\left[\frac{\hat{\alpha}_{j}^{2}-1}{1/\lambda_{j}+\hat{\alpha}_{j}^{2}}\right]\right), \qquad D3 = \max\left(0,\ \frac{1}{p}\sum_{j=1}^{J}\frac{\hat{\alpha}_{j}^{2}-1}{1/\lambda_{j}+\hat{\alpha}_{j}^{2}}\right).$$

Using the average value and the median is very common when estimating the shrinkage parameter for ridge regression, and the D2 and D3 estimators have direct counterparts in equations (13) and (15) of Kibria (2003). Finally, the following estimators are proposed:

$$D4 = \max\left(0,\ \max_{j}\left[\frac{\hat{\alpha}_{j}^{2}-1}{1/\lambda_{j}+\hat{\alpha}_{j}^{2}}\right]\right), \qquad D5 = \max\left(0,\ \min_{j}\left[\frac{\hat{\alpha}_{j}^{2}-1}{1/\lambda_{j}+\hat{\alpha}_{j}^{2}}\right]\right).$$

For these estimators, other quantiles than the median are used, which was successfully applied by Khalaf and Shukur (2005).
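All five proposed shrinkage parameters are functions of the eigenvalues λ_j of X'ŴX and of α̂ = γ'β̂_ML, so they can be computed directly from the ML fit. Below is a minimal sketch under the same illustrative naming as the earlier snippets (logit_ml_iwls() is the IWLS function defined above).

```python
import numpy as np

def shrinkage_parameters(X, y):
    """Compute the candidate d_j of (2.5) and the D1-D5 estimators."""
    beta_ml, XtWX = logit_ml_iwls(X, y)
    lam, gamma = np.linalg.eigh(XtWX)            # eigenvalues/eigenvectors of X'WX
    alpha = gamma.T @ beta_ml                    # alpha-hat = gamma' beta_ML
    d = (alpha ** 2 - 1) / (1.0 / lam + alpha ** 2)
    a_max, l_max = alpha.max(), lam.max()
    return {
        "D1": max(0.0, (a_max ** 2 - 1) / (1.0 / l_max + a_max ** 2)),
        "D2": max(0.0, float(np.median(d))),
        "D3": max(0.0, float(d.sum() / X.shape[1])),
        "D4": max(0.0, float(d.max())),
        "D5": max(0.0, float(d.min())),
    }
```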

3. The Monte Carlo simulation

3.1 The design of the experiment

The main focus of this paper is to compare the MSE properties of the ML and Liu estimators when the regressors are highly intercorrelated. Hence, the core factor varied in the design of the experiment is the degree of correlation (ρ²) between the regressors. Therefore, the following formula, which enables us to vary the strength of the correlation, is used to generate the explanatory variables:

$$x_{ij} = \left(1-\rho^{2}\right)^{1/2} z_{ij} + \rho\, z_{i(p+1)}, \qquad i = 1,2,\ldots,n, \quad j = 1,2,\ldots,p, \qquad (3.1)$$

where the z_{ij} are pseudo-random numbers from the standard normal distribution. We consider four different values of ρ² corresponding to 0.2, 0.85, 0.95 and 0.99. The n observations for the dependent variable are obtained from the Be(π_i) distribution, where

$$\pi_{i} = \frac{\exp(x_{i}'\beta)}{1+\exp(x_{i}'\beta)}. \qquad (3.2)$$

The parameter values of β are chosen so that β'β = 1. We use several different degrees of freedom (df = n − p) and models consisting of two and four explanatory variables. The experiment is replicated 2000 times by generating new pseudo-random numbers.
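One replication of this design can be sketched as follows; the code is illustrative, the values of n, ρ², d and the true β are assumption-level choices rather than the exact settings of the paper, and the MSE and MAE criteria of equations (3.3)-(3.4) below are accumulated over the replications.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_once(n, p, rho2, beta, d):
    """Generate data by (3.1)-(3.2), then return the ML and Liu estimates."""
    z = rng.standard_normal((n, p + 1))
    X = np.sqrt(1 - rho2) * z[:, :p] + np.sqrt(rho2) * z[:, [p]]    # equation (3.1)
    pi = 1.0 / (1.0 + np.exp(-(X @ beta)))                          # equation (3.2)
    y = rng.binomial(1, pi)
    beta_ml, XtWX = logit_ml_iwls(X, y)
    beta_liu = np.linalg.solve(XtWX + np.eye(p), (XtWX + d * np.eye(p)) @ beta_ml)
    return beta_ml, beta_liu

p, n, rho2, d, R = 2, 100, 0.95, 0.5, 2000
beta = np.ones(p) / np.sqrt(p)                  # chosen so that beta' beta = 1
ml, liu = map(np.array, zip(*(simulate_once(n, p, rho2, beta, d) for _ in range(R))))
mse = lambda b: np.mean(np.sum((b - beta) ** 2, axis=1))    # equation (3.3)
mae = lambda b: np.mean(np.sum(np.abs(b - beta), axis=1))   # equation (3.4), absolute deviations
print(mse(ml), mse(liu), mae(ml), mae(liu))
```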

The MSE is then calculated as

$$\mathrm{MSE} = \frac{1}{2000}\sum_{r=1}^{2000}\left(\hat{\beta}_{r}-\beta\right)'\left(\hat{\beta}_{r}-\beta\right), \qquad (3.3)$$

and the MAE as

$$\mathrm{MAE} = \frac{1}{2000}\sum_{r=1}^{2000}\left|\hat{\beta}_{r}-\beta\right|, \qquad (3.4)$$

where β̂_r denotes the estimator obtained in the rth replication.

3.2 Results discussion

The simulated MSE and MAE for all of the estimators, for different n and ρ², are presented in Tables 1 and 2 for p = 2 and p = 4, respectively. From the tables, we can see at a glance that the degree of correlation inflates the MSE and MAE. This increase is particularly large for ML, and it is more severe when applying the MSE as performance criterion instead of the MAE. For the Liu estimators, the inflation of the MSE and MAE is less severe than for ML. However, there is a big difference between the performance of the Liu estimators depending on which shrinkage parameter is applied. The least robust option among the different proposed methods of estimating the shrinkage parameter is D4. The performance of the D1 to D3 estimators is almost equivalent. However, the most robust option is the D5 estimator. This shrinkage parameter either has the lowest value of both measures of performance or is close to the estimator that minimizes the MSE and MAE. Moreover, one can see that as the number of explanatory variables increases, the MSE and MAE increase. This increase is more severe for ML than for the Liu estimator, and it is also larger if the MSE is used to judge the performance of the estimators instead of the MAE. Finally, when considering all of the results, it is clear that increasing the sample size has a positive effect, especially for ML. This is expected since ML is a consistent estimator.

4. Empirical application

The different estimation methods will be illustrated using a dataset taken from Statistics Sweden.[1] A logit regression model is estimated where the dependent variable is defined as follows:

$$y_{i} = \begin{cases}1 & \text{if the net population change is positive in municipality } i,\\ 0 & \text{otherwise.}\end{cases}$$

This dependent variable is explained by the following regressors: the number of unemployed people (x1), the number of newly built apartments (x2), the number of bankrupt firms (x3) and the population (x4). We estimate a logit model for the full sample and for the urban regions in Sweden.[2] The full sample consists of 290 observations and the subsample consists of 84 municipalities. The bivariate correlations (for the full population) between the regressors can be found in Table 3:

Table 3: Correlation matrix

        x1       x2       x3       x4
x1      1
x2      0.8854   1
x3      0.946    0.9430   1
x4      0.9663   0.9010   0.9367   1

From Table 3 one can see that the bivariate correlations are high (all are greater than 0.88) and that we therefore might have a severe multicollinearity problem. The logit regression model is estimated in R using the IWLS algorithm,[3] and the Liu estimator is applied with the shrinkage parameter D5, since this is the one that minimizes the estimated MSE and MAE. In order to estimate the standard errors of the different parameters, the bootstrap technique is applied. The results can be found in Table 4. We can see that the number of unemployed people and the number of bankrupt firms have a negative impact, while the other two variables have a positive impact on the probability that a municipality has a net increase of inhabitants. This is expected, since higher numbers of unemployed people and bankrupt firms indicate poor economic performance.

[1] The homepage is www.scb.se.
[2] The urban regions are defined as the municipalities belonging to the Functional Analysis (FA) regions of Stockholm, Göteborg and Malmö.
[3] We use the function glm() in order to estimate the logit model, which is part of the standard routines in R. However, any software will work fine, since the Liu method does not require any changes to the existing routines for estimating the logit regression model. The Liu estimator only requires that one is able to extract the maximum likelihood estimates of the coefficients and the variance-covariance matrix, which is defined as (X'ŴX)^{-1}.
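As footnote 3 notes, the Liu estimator only needs the ML coefficient vector and the estimated covariance matrix (X'ŴX)^{-1} from whatever routine is used. The authors work with R's glm(); an equivalent sketch in Python, using statsmodels as a stand-in and illustrative names, is given below. Bootstrap standard errors, as used in the paper, would then be obtained by repeating this computation on resampled data.

```python
import numpy as np
import statsmodels.api as sm

def liu_from_fit(y, X, d):
    """Build the Liu estimator (1.2) from the output of a standard logit routine."""
    fit = sm.Logit(y, X).fit(disp=0)                       # ML fit
    beta_ml = np.asarray(fit.params)
    XtWX = np.linalg.inv(np.asarray(fit.cov_params()))     # covariance matrix is (X'WX)^{-1}
    I = np.eye(len(beta_ml))
    return np.linalg.solve(XtWX + I, (XtWX + d * I) @ beta_ml)
```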

The positive effect of the population variable indicates that more people are moving to urban regions. The estimated standard errors are decreased for all variables, but the most substantial reduction can be found for x2 (i.e., the number of newly built apartments). For this variable the reduction of the estimated parameter is also substantial. This indicates that the multicollinearity problem leads to an estimated value that is larger than it should be. Hence, the positive impact of building new apartments is most likely exaggerated when applying ML. When looking at the t-statistics, one can see that the values obtained using the Liu method are larger than those for ML, which further shows the superiority of the Liu estimator since the p-values become lower. Once again it is for the variable x2 that the largest increase of the t-statistic can be found. For the subsample, compared with the full sample, the signs of the variables x1 and x2 are changed. The positive impact of increasing the number of unemployed people may be due to the fact that many immigrants choose to settle down in the areas of the large cities with high unemployment rates. The negative impact of variable x2 may be explained by the fact that not enough apartments are constructed where a lot of people are choosing to move. When looking at the standard errors, one can see a much larger reduction of the standard errors for the subsample than for the full sample. The reduction of the bootstrapped standard errors is especially remarkable for the variables x1 and x2. The increase of the t-statistics is also larger for the subsample than for the full sample. In this case the increase for x4 may be noticed, since this variable becomes statistically significant when the Liu estimator is applied.

Table 4: The results from the logit regression analysis

Full sample
               ML                                         Liu
          x1        x2        x3        x4          x1        x2        x3        x4
coef    -0.1990    3.376    -0.4574    0.0149      -0.098    1.1497   -0.3566    0.0
se       0.156     1.3760    0.6174    0.0043       0.1060   0.6       0.49      0.0037
t       -1.5844    .4183    -0.7408    3.4651      -1.979    4.5878   -0.7909    4.0541

Large city regions
               ML                                         Liu
          x1        x2        x3        x4          x1        x2        x3        x4
coef     .983     -1.418    -0.0409    0.0184       0.46    -0.1333   -0.11      0.016
se      15.904    31.149     0.9088    0.1078       0.3393   0.6864    0.1801    0.006
t        0.1445   -0.0399   -0.04      0.1707       0.7     -0.194    -0.654     .619

5. Some concluding remarks

In this paper a new Liu estimator for the logit model has been proposed. The MSE and MAE of this new estimator and of the traditional ML method are calculated using Monte Carlo simulations. In the design of the experiment, factors such as the degree of correlation, the sample size and the number of explanatory variables are varied. The results from the simulation study clearly show that the MSE and MAE of the ML method become inflated in the presence of multicollinearity. This problem is particularly severe when the sample size is small and the correlation between the explanatory variables is high. The results from the Monte Carlo study also provide evidence that the new Liu estimator is much more robust to increases in the correlation and that it has superior MSE and MAE properties over ML in all of the evaluated situations. The best option for estimating the logit model in the presence of multicollinearity is to apply the Liu estimator together with the shrinkage parameter D5. The benefit of this method is clearly shown in an empirical application, where one can see a substantial decrease of the standard errors and an increase of the t-statistics, especially for the subsample. We hope the findings of the paper will be useful for practitioners.

Acknowledgements

This paper was written while Dr. Kibria was visiting Professor Ghazi Shukur at the Department of Economics and Statistics, Linnaeus University, and the Department of Finance, Economics and Statistics, Jönköping International Business School, Sweden, during May-June 2011. He acknowledges the excellent research support provided by Linnaeus University and Jönköping International Business School, Sweden.

References

Albert, A. and Anderson, J. A. (1984). On the existence of maximum likelihood estimates in logistic regression models. Biometrika, 71, 1-10.

Akdeniz, F. and Kaçiranlar, S. (1995). On the almost unbiased generalized Liu estimator and unbiased estimation of the bias and MSE. Communications in Statistics - Theory and Methods, 24, 1789-1797.

Alheety, M. I. and Kibria, B. M. G. (2009). On the Liu and almost unbiased Liu estimators in the presence of multicollinearity with heteroscedastic or correlated errors. Surveys in Mathematics and its Applications, 4, 155-167.

Hoerl, A. E. and Kennard, R. W. (1970). Ridge regression: biased estimation for nonorthogonal problems. Technometrics, 12, 55-67.

Liu, K. (1993). A new class of biased estimate in linear regression. Communications in Statistics - Theory and Methods, 22, 393-402.

Månsson, K. and Shukur, G. (2011). On ridge parameters in logistic regression. Communications in Statistics - Theory and Methods, 40(18), 3366-3381.

Kaçiranlar, S. (2003). Liu estimator in the general linear regression model. Journal of Applied Statistical Science, 13, 229-234.

Khalaf, G. and Shukur, G. (2005). Choosing ridge parameter for regression problems. Communications in Statistics - Theory and Methods, 34(5), 1177-1182.

Kibria, B. M. G. (2003). Performance of some new ridge regression estimators. Communications in Statistics - Simulation and Computation, 32, 419-435.

Kibria, B. M. G. (2011). On some Liu and ridge type estimators and their properties under the ill-conditioned Gaussian linear regression model. To appear in Journal of Statistical Computation and Simulation.

Schaefer, R. L., Roi, L. D. and Wolfe, R. A. (1984). A ridge logistic estimator. Communications in Statistics - Theory and Methods, 13, 99-113.

Muniz, G. and Kibria, B. M. G. (2009). On some ridge regression estimators: An empirical comparison. Communications in Statistics - Simulation and Computation, 38, 621-630.

Table 1: The simulated MSE and MAE for different n and ρ² when p = 2

(Row blocks correspond to ρ² = 0.2, 0.85, 0.95 and 0.99; within each block, each row corresponds to a different sample size n, and the columns give the estimated MSE for ML, D1, D2, D3, D4 and D5, followed by the estimated MAE for ML, D1, D2, D3, D4 and D5. Some digits were lost in the extraction of the original table.)

ρ² = 0.2:
1.041 0.45 0.341 0.341 0.478 0.301 1.087 0.695 0.69 0.69 0.711 0.605 0.639 0.37 0.86 0.86 0.338 0.70 0.865 0.61 0.58 0.58 0.618 0.57 0.44 0.58 0.37 0.37 0.6 0.30 0. 0.551 0.534 0.534 0.554 0.530 0.80 0.189 0.181 0.181 0.190 0.179 0.58 0.477 0.470 0.470 0.478 0.469 0.05 0. 0.147 0.147 0. 0.147 0.498 0.48 0.46 0.46 0.49 0.46

ρ² = 0.85:
1.913 0.748 0.5 0.5 0.88 0.443 1.454 0.84 0.735 0.735 0.881 0.669 1.003 0.40 0.35 0.35 0.453 0.31 1.104 0.685 0.640 0.640 0.706 0.615 0.810 0.383 0.337 0.337 0.400 0.31 0.978 0.654 0.63 0.63 0.664 0.609 0.465 0.56 0.40 0.40 0.61 0.33 0.749 0.551 0.539 0.539 0.554 0.535 0.3 0.1 0.05 0.05 0.13 0.04 0.65 0.9 0.4 0.4 0.510 0.3

ρ² = 0.95:
5.453 1.785 1.433 1.433.803 0.815.55 1.177 1.0 1.0 1.497 0.787 3.057 0.980 0.806 0.806 1.35 0.541 1.940 0.98 0.85 0.85 1.085 0.703.530 0.900 0.71 0.71 1.15 0.9 1.3 0.89 0.819 0.819 0.997 0.709 1.438 0.5 0.456 0.456 0.601 0.379 1.315 0.708 0.67 0.67 0.749 0.68 1.093 0.407 0.363 0.363 0.443 0.36 1.165 0.669 0.643 0.643 0.691 0.60

ρ² = 0.99:
7.40 9.476 8.969 8.969 18.441 4.5 5.636.491.573.573 3.9 1.476 15.38 4.934 4.767 4.767 9.45.5 4.363 1.863 1.885 1.885.815 1.157 1.14 4.014 3.587 3.587 7.19 1.860 3.8 1.685 1.644 1.644.47 1.08 7.660.389.097.097 4.17 1.098 3.114 1.35 1. 1. 1.851 0.853 5.391 1.60 1.414 1.414.616 0.833.608 1.109 1.047 1.047 1.454 0.766

Table 2: The simulated MSE and MAE for different n and ρ² when p = 4

(Row blocks correspond to ρ² = 0.2, 0.85, 0.95 and 0.99; within each block, each row corresponds to a different sample size n, and the columns give the estimated MSE for ML, D1, D2, D3, D4 and D5, followed by the estimated MAE for ML, D1, D2, D3, D4 and D5. Some digits were lost in the extraction of the original table.)

ρ² = 0.2:
4.33 1.854 1.00 0.877.48 0.65 3.06 1.936 1.463 1.391.136 1.59.190 1.098 0.699 0.653 1.04 0.64.87 1.569 1.310 1.83 1.69 1.65 1.54 0.844 0.61 0.601 0.889 0.591 1.914 1.393 1.43 1.3 1.41 1.6 0.936 0.583 0.5 0.3 0.594 0.1 1.55 1.193 1.134 1.133 1.01 1.13 0.659 0.447 0.415 0.415 0.4 0.414 1.79 1.0 1.04 1.05 1.053 1.04

ρ² = 0.85:
7.635 3.164 1.964 1.601 4.615 0.761 4.039.367 1.737 1.610.794 1. 3.9 1.790 1.004 0.871.07 0.687 3.0 1.940 1.487 1.419.16 1.30.615 1.51 0.774 0.719 1.416 0.670.9 1.66 1.364 1.334 1.749 1.306 1.574 0.847 0.631 0.615 0.89 0.607 1.970 1.404 1.57 1.49 1.433 1.45 1.187 0.694 0.570 0.564 0.715 0.561 1.716 1.9 1.04 1.00 1.306 1.199

ρ² = 0.95:
6.48 9.661 7.841 6.70 18.98 1.977 7.37 3.733.864.660 5.339 1.307 1.86 4.886 3..558 7.988 0.87 5.407.913.161 1.971 3.777 1.79 8.701 3.459 1.946 1.535 5.034 0.708 4.511.58 1.831 1.667 3.070 1.86 5.0.10 1.164 0.964.77 0.716 3.560.084 1.558 1.458.359 1.36 3.87 1.693 0.979 0.835.06 0.714 3.05 1.885 1.470 1.401.043 1.338

ρ² = 0.99:
144.7.01 57.38 51.74 11.78 15.5 16.58 7.815 7.549 7.49 13.98.554 68.14 3.18 0.30 17.95 51.79 4.515 1.36 5.853 5.08 4.999 9.851 1.87 47.87 16.10 13.68 11.79 34.46 3.04 10.55 5.078 4.417 4.15 8.197 1.649 8.98 9.606 7.80 6. 19.16 1.91 8.303 4.013 3.45 3.007 6.101 1.44 0.74 6.917 4.908 4.003 1.80 1.55 7.013 3.44.709.447 4.9 1.340