Alternative Biased Estimator Based on Least Trimmed Squares for Handling Collinear Leverage Data Points


International Journal of Contemporary Mathematical Sciences, Vol. 13, 2018, no. 4, HIKARI Ltd

Alternative Biased Estimator Based on Least Trimmed Squares for Handling Collinear Leverage Data Points

Moawad El-Fallah Abd El-Salam
Department of Statistics & Mathematics and Insurance, Faculty of Commerce, Zagazig University, Egypt

Copyright 2018 Moawad El-Fallah Abd El-Salam. This article is distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Multicollinearity in multiple linear regression models and the existence of leverage data points are common problems. These problems exert undesirable effects on the least squares estimators, so it seems important to combine methods of estimation designed to deal with them simultaneously. In this paper, an alternative biased robust regression estimator is defined by mixing the ridge estimation technique into robust least trimmed squares estimation to obtain the Ridge Least Trimmed Squares (RLTS) estimator. The efficiency of the combined estimator (RLTS) is compared with some existing regression estimators, namely the Ordinary Least Squares (OLS), Ridge Regression (RR), and Ridge Least Absolute Deviation (RLAD) estimators. The numerical results of this study show that the RLTS regression estimator is more efficient than the other estimators, based on the bias and mean squared error criteria, for many combinations of leverage data points and degrees of multicollinearity.

Keywords: Leverage Data Points; Multicollinearity; Ridge Regression; Ridge Least Absolute Deviation; Ridge Least Trimmed Squares Estimation; Bias and Mean Squared Error Criteria

1. Introduction

Two important problems are considered in regression analysis: multicollinearity

and the existence of leverage data points. The ordinary least squares (OLS) estimators of the regression coefficients are known to possess certain optimal properties when the explanatory variables are not correlated among themselves and the disturbances of the regression equation are independent, identically distributed normal random variables. The presence of correlation among the explanatory variables may result in imprecise information about the regression coefficients. In addition, the least squares estimator may produce extremely poor estimates in the presence of leverage data points. Thus, various remedial techniques have been suggested for these problems separately: ridge regression deals with multicollinearity, and robust estimation techniques are not as strongly affected by the presence of leverage data points. However, although we usually think of these two problems separately, in practical situations they occur simultaneously. Several robust ridge regression estimators have been suggested for handling the two problems simultaneously; see Lukman et al. (2014) [10] and Nkiruka and Uchenna (2016) [13]. In this paper, we take the initiative to develop a more robust ridge estimator to remedy these two problems. We propose combining ridge regression with a highly efficient, high breakdown point estimator, yielding the Ridge Least Trimmed Squares (RLTS) estimator. We call this modified method the robust ridge regression based on Least Trimmed Squares estimation (RLTS). We expect the modified method to be less sensitive to the presence of leverage data points and multicollinearity. So, this paper is devoted to examining estimators that are resistant to the combined problems of multicollinearity and leverage data points.
Specifically, can the ridge estimator and some robust estimation techniques be combined to produce a robust ridge regression estimator? The remainder of the paper is organized as follows. In section (2), the ridge regression estimator is reviewed. Robust regression estimation is discussed in section (3). In section (4), we discuss the augmented ridge robust estimators as a way of combining biased and robust regression techniques, while section (5) introduces the proposed combined ridge robust estimator (RLTS). Section (6) presents the results of a Monte Carlo simulation study investigating how well these estimators perform, and some concluding remarks are given in section (7).

2. Ridge Regression Estimators

Consider the following linear regression model:

Y = Xβ + ε, (1)

where Y is an (n × 1) vector of observations on the dependent variable, X is an (n × p) matrix of observations on the explanatory variables, β is a (p × 1) vector of regression coefficients to be estimated, and ε is an (n × 1) vector of disturbances. The least squares estimator of β can be written as:

β̂_OLS = (X'X)^(-1) X'Y. (2)

This method gives the unbiased estimator with minimum variance among all linear unbiased estimators, provided that the errors are independent and identically normally distributed. However, in the presence of multicollinearity, the (X'X) matrix is nearly singular, and this ill-conditioned X matrix can result in very poor estimates. The degree of multicollinearity is often indicated by the condition number (CN) of the matrix X (or X'X). The CN is defined as the ratio of the largest singular value of X to the smallest:

CN(X) = λ_max / λ_min ≥ 1, (3)

where λ_max and λ_min are the largest and smallest singular values of X (the square roots of the largest and smallest eigenvalues of the matrix X'X). Belsley et al. (1980) [2] have empirically shown that weak dependencies are associated with CN values around 5 to 10, whereas moderate to strong relations are associated with CN values of 30 to 100. Hoerl and Kennard (1970) [6] pointed out that adding a small constant to the diagonal of the matrix will improve its conditioning, as this dramatically reduces its CN. The ridge regression estimator is defined as follows:

β̂_RR = (X'X + KI)^(-1) X'Y, (4)

where I is the (p × p) identity matrix and K is the biasing constant. Various methods for determining the value of K have been introduced in the literature, such as those described by Hoerl and Kennard (1970) [6] and Gibbons (1981) [5]:

K̂_H = p S² / (β̂'_OLS β̂_OLS), (5)

where

S² = (Y − Xβ̂_OLS)'(Y − Xβ̂_OLS) / (n − p). (6)

When K = 0, β̂_RR = β̂_OLS; when K > 0, β̂_RR is biased but more stable and precise than the OLS estimator; and as K → ∞, β̂_RR → 0. Hoerl and Kennard (1970) [6] have shown that there always exists a value K > 0 such that MSE(β̂_RR) is less than MSE(β̂_OLS).

3. Robust Regression Estimators

Robust regression estimators have been proven to be more reliable and
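For concreteness, the formulas above can be sketched in a few lines of Python with NumPy. This is an illustrative sketch only; the function names are ours, not the paper's.

```python
import numpy as np

def condition_number(X):
    """Condition number of X (eq. 3): ratio of largest to smallest singular value."""
    s = np.linalg.svd(X, compute_uv=False)
    return s.max() / s.min()

def ols(X, y):
    """OLS estimator (eq. 2): beta_hat = (X'X)^(-1) X'y."""
    return np.linalg.solve(X.T @ X, X.T @ y)

def ridge(X, y, k):
    """Ridge estimator (eq. 4): beta_hat = (X'X + kI)^(-1) X'y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

def hoerl_kennard_k(X, y):
    """Hoerl-Kennard biasing constant (eqs. 5-6): K = p*S^2 / (b'b)."""
    n, p = X.shape
    b = ols(X, y)
    resid = y - X @ b
    s2 = resid @ resid / (n - p)
    return p * s2 / (b @ b)
```

With K = 0 the ridge estimator reproduces OLS exactly, and any K > 0 shrinks the coefficient vector toward zero, trading bias for a smaller variance.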

efficient than the least squares estimator, especially when the data are contaminated with leverage observations. Since leverage data points greatly influence the estimated coefficients, standard errors and test statistics, the usual statistical procedures may be highly inefficient, as the precision of the estimator is affected. Several different robust regression estimators exist. Two of the most commonly considered are LAD estimators and LTS estimators.

3.1 The Least Absolute Deviation Estimator (LAD):

The LAD estimator, β̂_LAD, can be defined as the solution to the following minimization problem:

min Σ_{i=1}^{n} |y_i − x_i'β|. (7)

Rather than minimizing the sum of squared residuals, as in least squares estimation, the sum of the absolute values of the residuals is minimized. Thus, the effect of leverage data points on the LAD estimates is less than that on the OLS estimates.

3.2 The Least Trimmed Squares Estimator (LTS):

The Least Trimmed Squares (LTS) estimator was proposed by Rousseeuw (1984) [16] and has the property of being highly resistant to a relatively large proportion of leverage data points; thus LTS has a high breakdown value. For the details of this technique and its properties, see Rousseeuw and Leroy (1987) [17], and also chapter 5 of Zaman (1996) [20]. The estimator β̂_LTS can be defined as the value of β minimizing:

Σ_{i=1}^{h} e²_(i), (8)

where e²_(1) ≤ e²_(2) ≤ ... ≤ e²_(n) are the ordered squared residuals, e_i = y_i − x_i'β, i = 1, 2, ..., n, and the value of h is determined by how much of the data is trimmed. For example, the 20% trimmed LTS estimator is defined to be the value of β̂ minimizing Σ_{i=1}^{[4n/5]} e²_(i). Chen (2002) [3] defined h in the range [(n/2) + 1] ≤ h ≤ [(3n + k + 1)/4], where k is the number of regression coefficients, and noted that the breakdown value of the LTS estimator is (n − h)/n when h = [(3n + k + 1)/4].
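The LTS objective (8) can be illustrated with a small Python sketch. The concentration-step ("C-step") search below is a simplified, didactic version of the FAST-LTS idea of Rousseeuw and Van Driessen (1999), not the full algorithm; the names and default settings are ours.

```python
import numpy as np

def lts_fit(X, y, h, n_starts=50, seed=0):
    """Minimize the sum of the h smallest squared residuals (eq. 8) by
    random elemental starts followed by concentration steps (C-steps)."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    best_beta, best_obj = None, np.inf
    for _ in range(n_starts):
        idx = rng.choice(n, size=p, replace=False)       # random p-point start
        beta = np.linalg.lstsq(X[idx], y[idx], rcond=None)[0]
        for _ in range(20):                              # C-steps until convergence
            keep = np.argsort((y - X @ beta) ** 2)[:h]   # h smallest sq. residuals
            beta_new = np.linalg.lstsq(X[keep], y[keep], rcond=None)[0]
            if np.allclose(beta_new, beta):
                break
            beta = beta_new
        obj = np.sort((y - X @ beta) ** 2)[:h].sum()
        if obj < best_obj:
            best_obj, best_beta = obj, beta
    return best_beta
```

Each C-step refits on the h observations with the smallest squared residuals, which cannot increase the trimmed objective; multiple random starts guard against local minima.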
Although other high breakdown estimators exist, Rousseeuw and Van Driessen (1999) [18] pointed out that LTS has many advantages to recommend it, and they developed a fast algorithm for its computation. The main advantages of the LTS method are as follows. First, LTS is simple to understand and easy to motivate. Also, it is more efficient than the LMS (Least Median of Squares) estimator introduced by Rousseeuw (1984) [16], with which it shares these advantages.

5 Alternative biased estimator based on least trimmed squares 181 Many estimators commonly regarded as robust in the econometrics literature have low breakdown values and cannot deal with any significant number of leverage data. For example, the bounded influence estimator of Krasker and Welsch (198) [9], and the least absolute deviation method, both suffer heavily from the presence of a small subgroup of these data influential; see Yohai (1987) [19]. 4. Robust Ridge Regression Estimators There are many studies that have been related using the robust ridge regression estimators in literature such as: Pfaffenberger and Dieman (1984) [14]; Moawad El-Fallah (013) [1] and Mal and Dul (014) [11]. In this section, we present some combinations of ridge and robust regression estimation discussed in sections () and (3) respectively. In this respect, the RLAD estimator, which is based on the LAD and ridge estimators denoted by, ˆRLAD, can be computed using the following: ˆ (X' X K* I) -1 X' Y RLAD, (9) where the value of K * is determined from data using : PS K* LAD βˆ βˆ LAD LAD (10) and, S (Y - Xβ ˆ )'(Y - Xβ ˆ ) LAD LAD n p, (11) where, ˆLAD is the LAD estimator defined as the solution to equation (7). It be noted that the value of K * is the estimator of K presented by equation (5) with two changes. First, the LAD estimator of is used rather than estimator. Second, the estimator of used in equation (11) is modified by the LAD coefficient estimates rather than the least squares estimates. These changes are aimed to reduce the effect of extreme points on the value chosen for the biasing parameter. 5. Alternative Combined Regression Estimator Instead of ˆ estimator which was aimed to reduce the effect of RLAD leverage data points on the value chosen for the biasing parameter k. Another alternative combined estimator between robust and ridge regression estimation is ˆRLTS. In this respect, it is hoped that, the problems of multicollinearity and leve-

6 18 Moawad El-Fallah Abd El-Salam rage data points can be solved simultaneously. The estimator can be ˆRLTS calculated by the following. ˆ * -1 (X'X K I) X'Y RLTS LTS, (1) where the value of K * is given by: K LTS PS LTS βˆ βˆ LTS LTS (13) and S LTS (Y - Xβ ˆ )'(Y - Xβ ˆ ) LTS LTS n p (14) 6. Simulation Study (6.1) Design of the Experiment: We carry out a Monte Carlo simulation study to compare the performance of some alternative combined estimators under concern. The simulation is designed to allow both multicollinearity and leverage data points simultaneously. Varying degrees of multicollinearity are allowed. Also, the non-normal distributions are used to generate leverage data points. The study contains four estimators which are: 1-The least squares estimator ( ˆ ). -The ridge regression estimator ( ˆ R ). 3-The ridge least absolute deviation estimator ( ˆ ). RLAD 4-The ridge least absolute deviation estimator ( ˆ ). RLTS The Least Squares estimator was defined in equation (). The Ridge estimator was defined in equation (4) using the K value in equation (5). The Ridge Least Absolute Deviation estimator was defined in equation (9) using the K value in equation (10). The Ridge Least Trimmed Squares estimator was defined in equation (1) with K of equation (13). Suppose, we have the following linear regression model: y x x e, where i 1,,..., n (15) i o 1 i1 i i Dempster et al. (1977) [4], pointed out that, the parameter values of o, 1 and are set equal to one. The explanatory variables x i1 and x i are generated as: x ( 1 ) z z, i 1,,..., n, j 1, (16) ij ij ij

where the z_ij are independent standard normal random numbers. The value of ρ represents the correlation between the two explanatory variables; its values were chosen as 0.0, 0.5, and a larger value representing high multicollinearity. Once generated for a given sample size n, the explanatory variable values were held fixed. The sample sizes examined in this study are 20, 40 and 60. One important factor in this study is the disturbance distribution. The following three disturbance distributions are used: the standard normal distribution, the Laplace distribution with mean zero and variance two, and the Cauchy distribution with median zero and scale parameter one. All random numbers are generated using the IMSL subroutines: standard normal random numbers using the GGNPM subroutine, Laplace random numbers using the GGUBFS subroutine, and Cauchy random numbers using the GGCAY subroutine. The simulations were performed on an IBM 4341 Model 1, and the programs were written in double-precision FORTRAN. For each of the treatments in the three-factor experiment (ρ, sample size n, disturbance distribution), 500 Monte Carlo trials are used, and the following statistics are computed:
(1) The average of the estimates.
(2) The mean squared error (MSE) and the 6 pairwise MSE ratios, where:

MSE = (1/500) Σ_{i=1}^{500} (β̂_i − β)². (17)

(3) The 6 pairwise comparisons of "closeness" to the actual parameter values. The pairwise comparisons are: (β̂_OLS with β̂_RR), (β̂_OLS with β̂_RLAD), (β̂_OLS with β̂_RLTS), (β̂_RLAD with β̂_RR), (β̂_RR with β̂_RLTS) and (β̂_RLAD with β̂_RLTS).

(6.2) The Results of Comparisons

We first consider the comparison of the two ridge robust estimators, RLAD and RLTS. Table (1) presents the number of times that the RLTS estimates are closer than the RLAD estimates to the true value of the parameter β₁ only, while Table (2) presents the mean squared estimation error ratios.
These ratios represent the efficiency of RLTS relative to RLAD. It should be noted that values less than one indicate that RLTS is more efficient, while values greater than one indicate that RLAD is more efficient.
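For concreteness, the two combined estimators being compared (sections 4 and 5) can be sketched in Python. Here LAD is fitted by iteratively reweighted least squares, a common approximation to the exact solution of (7), and LTS by a simplified concentration-step search; all names and settings are ours, not the paper's.

```python
import numpy as np

def lad_fit(X, y, n_iter=200, eps=1e-8):
    """LAD estimate (eq. 7) via iteratively reweighted least squares."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(n_iter):
        w = 1.0 / np.maximum(np.abs(y - X @ beta), eps)
        Xw = X.T * w                                   # weighted design, shape (p, n)
        beta_new = np.linalg.solve(Xw @ X, Xw @ y)
        if np.allclose(beta_new, beta, atol=1e-10):
            break
        beta = beta_new
    return beta

def lts_fit(X, y, h, n_starts=50, seed=0):
    """LTS estimate (eq. 8) via random starts and concentration steps."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    best, best_obj = None, np.inf
    for _ in range(n_starts):
        idx = rng.choice(n, size=p, replace=False)
        beta = np.linalg.lstsq(X[idx], y[idx], rcond=None)[0]
        for _ in range(20):
            keep = np.argsort((y - X @ beta) ** 2)[:h]
            beta_new = np.linalg.lstsq(X[keep], y[keep], rcond=None)[0]
            if np.allclose(beta_new, beta):
                break
            beta = beta_new
        obj = np.sort((y - X @ beta) ** 2)[:h].sum()
        if obj < best_obj:
            best_obj, best = obj, beta
    return best

def _ridge_with_robust_k(X, y, b_robust):
    """Ridge estimator with K* = p*S^2/(b'b) computed from a robust fit
    (eqs. 9-11 for RLAD, eqs. 12-14 for RLTS)."""
    n, p = X.shape
    resid = y - X @ b_robust
    s2 = resid @ resid / (n - p)
    k_star = p * s2 / (b_robust @ b_robust)
    return np.linalg.solve(X.T @ X + k_star * np.eye(p), X.T @ y)

def rlad(X, y):
    return _ridge_with_robust_k(X, y, lad_fit(X, y))

def rlts(X, y, h=None):
    n, p = X.shape
    if h is None:
        h = (3 * n + p + 1) // 4      # upper end of Chen's (2002) range for h
    return _ridge_with_robust_k(X, y, lts_fit(X, y, h))
```

Since the robust fit enters only through the biasing constant K*, both estimators remain ridge-type shrinkage estimators; contaminated observations inflate S² and hence K*, producing heavier shrinkage exactly when the data are least trustworthy.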

Table (1): Number of times RLTS is closer than RLAD to the true value of the parameter β₁, by error distribution, value of ρ, and sample size n. [Table entries omitted.]

Table (2): MSE Ratios of RLTS to RLAD for Estimation of β₁*, by error distribution, value of ρ, and sample size n. [Table entries omitted.]

* Values less than one indicate RLTS is more efficient than RLAD; values greater than one indicate RLAD is more efficient than RLTS.
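In outline, the MSE-ratio criterion of equation (17) can be reproduced with a small Monte Carlo sketch. The snippet below compares RR to OLS under the design of equations (15) and (16); it is illustrative only (normal errors are shown, whereas the study also used heavy-tailed errors), and the function names are ours.

```python
import numpy as np

def make_xy(n, rho, rng):
    """Data per eqs. (15)-(16): two regressors sharing a common normal
    component, true coefficients all equal to one, normal errors."""
    z = rng.standard_normal((n, 3))
    x1 = np.sqrt(1 - rho**2) * z[:, 0] + rho * z[:, 2]
    x2 = np.sqrt(1 - rho**2) * z[:, 1] + rho * z[:, 2]
    X = np.column_stack([np.ones(n), x1, x2])
    y = X @ np.ones(3) + rng.standard_normal(n)
    return X, y

def mse_ratio_rr_to_ols(n=20, rho=0.95, trials=500, seed=0):
    """Monte Carlo MSE (eq. 17) of ridge relative to OLS; a value below
    one means ridge is the more efficient estimator."""
    rng = np.random.default_rng(seed)
    beta_true = np.ones(3)
    sse_ols = sse_rr = 0.0
    for _ in range(trials):
        X, y = make_xy(n, rho, rng)
        b_ols = np.linalg.lstsq(X, y, rcond=None)[0]
        resid = y - X @ b_ols
        k = 3 * (resid @ resid / (n - 3)) / (b_ols @ b_ols)   # eq. (5)
        b_rr = np.linalg.solve(X.T @ X + k * np.eye(3), X.T @ y)
        sse_ols += np.sum((b_ols - beta_true) ** 2)
        sse_rr += np.sum((b_rr - beta_true) ** 2)
    return sse_rr / sse_ols
```

The same loop, with the robust fits substituted for OLS in the biasing constant and heavy-tailed error generators substituted for the normal one, yields the remaining ratios tabulated in this section.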

From the results of Table (1), we see that the RLTS estimator performs better than the RLAD estimator over a wide range of combinations of ρ and the error distribution. These results are supported by the mean squared estimation error ratios presented in Table (2). Therefore, as the RLTS estimator is clearly superior to the RLAD estimator, the remaining comparisons will be restricted to RLTS to conserve space. Tables (3) and (5) show the number of times that the RLTS estimates are closer than the RR and OLS estimates, respectively, to the true value of the parameter β₁. Also, the MSE ratios of RLTS to each of the RR and OLS estimators are given in Tables (4) and (6), respectively. From Tables (3) and (4), we see that the RR estimator is marginally superior to RLTS when the disturbances are normal and the correlation is high; otherwise, RLTS is superior. From Tables (5) and (6), OLS is superior when there is no correlation, except under Cauchy disturbances; otherwise, RLTS is superior. To conclude, the results from the comparisons of the RLTS estimator to the RR, RLAD and OLS estimators are not entirely unexpected, given the properties of the various estimators. The most important result from these comparisons is that the RLTS estimator is superior to the other estimators over a wide range of values of ρ for the given disturbance distributions, although ridge regression is, in some cases, expected to perform well.

Table (3): Number of times RLTS is closer than RR to the true value of the parameter β₁, by error distribution, value of ρ, and sample size n. [Table entries omitted.]

Table (4): MSE Ratios of RLTS to RR for Estimation of β₁*, by error distribution, value of ρ, and sample size n. [Table entries omitted.]

* Values less than one indicate RLTS is more efficient than RR; values greater than one indicate RR is more efficient than RLTS.

Table (5): Number of times RLTS is closer than OLS to the true value of the parameter β₁, by error distribution, value of ρ, and sample size n. [Table entries omitted.]

Table (6): MSE Ratios of RLTS to OLS for Estimation of β₁*, by error distribution, value of ρ, and sample size n. [Table entries omitted.]

* Values less than one indicate RLTS is more efficient than OLS; values greater than one indicate OLS is more efficient than RLTS.

7. Concluding Remarks

The presence of leverage data points and multicollinearity are two of the more frequent problems in regression analysis. Although we usually think of these two problems separately, they occur simultaneously in applied situations. A Monte Carlo simulation was designed to compare the performance of some combined ridge and robust regression estimators for dealing with these two problems. The results of the comparisons indicated that the ridge least trimmed squares (RLTS) estimator is better than the other estimators (OLS, RR and RLAD) for many combinations of non-normal error distribution type (which reflects the presence of leverage data points) and degree of multicollinearity (Tables (2), (4) and (6)). This estimator is less efficient than the others only when the disturbances are normal. In addition, RLAD outperforms both the LAD and OLS estimators when the degree of multicollinearity is high. Therefore, the RLTS estimator appears to be a suitable alternative to the other estimators for the different combinations of multicollinearity and leverage data points indicated by the non-normal error distributions. There are limitations to the present study, however. First, since this is a simulation study, its limitations must be recognized, although the data were generated to allow generalization to practical situations. Second, other members of the robust regression family may be used to construct combined biased robust estimators.

References

[1] K. Ayinde, Combined Estimators as Alternative to Ordinary Least Squares Estimator, International Journal of Sciences: Basic and Applied Research, 8 (2013), no. 1.

[2] D. Belsley, E. Kuh and R.E. Welsch, Regression Diagnostics: Identifying Influential Data and Sources of Collinearity, Wiley & Sons, New York, 1980.

[3] C. Chen, Robust regression and outlier detection with the ROBUSTREG procedure, SUGI Paper, SAS Institute, (2002).

[4] A.P. Dempster, M. Schatzoff and N. Wermuth, A simulation study of alternatives to ordinary least squares, Journal of the American Statistical Association, 72 (1977).

[5] D. Gibbons, A simulation study of some ridge estimators, Journal of the American Statistical Association, 76 (1981), no. 3.

[6] A.E. Hoerl and R.W. Kennard, Ridge Regression: Iterative Estimation of the Biasing Parameter, Communications in Statistics - Theory and Methods, 5 (1970), no. 1.

[7] B. Kan, O. Alpu and B. Yazici, Robust Ridge and Robust Liu Estimator for Regression Based on the LTS Estimator, Journal of Applied Statistics, 40 (2013), no. 3.

[8] B. Kan and O. Alpu, Combining Some Biased Estimation Methods with Least Trimmed Squares Regression and its Application, Revista Colombiana de Estadística, 38 (2015).

[9] W.S. Krasker and R.E. Welsch, Efficient bounded influence regression estimation, Journal of the American Statistical Association, 77 (1982), no. 379.

[10] A. Lukman, O. Arowolo and K. Ayinde, Some Robust Ridge Regression for Handling Multicollinearity and Outlier, International Journal of Sciences: Basic and Applied Research, 16 (2014).

[11] C.Z. Mal and Y.L. Dul, Generalized shrunken type-GM estimator and its application, International Conference on Applied Sciences (ICAS2013), IOP Publishing Ltd., Vol. 57, 2014.

[12] Moawad El-Fallah Abd El-Salam, The Efficiency of Some Robust Ridge Regression for Handling Multicollinearity and Non-Normal Errors Problems, Applied Mathematical Sciences, 7 (2013), no. 77.

[13] O.E. Nkiruka and O.J. Uchenna, A Comparative Study of Some Estimation Methods for Multicollinear Data, International Journal of Engineering and Applied Sciences, 3 (2016), no. 1.

[14] R.C. Pfaffenberger and T.E. Dielman, A Modified Ridge Regression Estimator Using the Least Absolute Value Criterion in the Multiple Linear Regression Model, Proceedings of the American Institute for Decision Sciences, Toronto, (1984).

[15] M.Z. Siti, S.Z. Mohammad and I. bin Muhammad, Weighted Ridge MM-Estimator in Robust Ridge Regression with Multicollinearity, Mathematical Models and Methods in Modern Science, Symp. Computational Statistics, 1 (2012), no. 3.

[16] P.J. Rousseeuw, Least median of squares regression, Journal of the American Statistical Association, 79 (1984), no. 388.

[17] P.J. Rousseeuw and A.M. Leroy, Robust Regression and Outlier Detection, Wiley & Sons, New York, 1987.

[18] P.J. Rousseeuw and K. Van Driessen, A fast algorithm for the minimum covariance determinant estimator, Technometrics, 41 (1999), no. 3.

[19] V.J. Yohai, High breakdown-point and high efficiency robust estimates for regression, Annals of Statistics, 15 (1987), no. 2.

[20] A. Zaman, Statistical Foundations for Econometric Techniques, Academic Press, New York, 1996.

Received: June 15, 2018; Published: July 18, 2018


More information

Heteroskedasticity-Consistent Covariance Matrix Estimators in Small Samples with High Leverage Points

Heteroskedasticity-Consistent Covariance Matrix Estimators in Small Samples with High Leverage Points Theoretical Economics Letters, 2016, 6, 658-677 Published Online August 2016 in SciRes. http://www.scirp.org/journal/tel http://dx.doi.org/10.4236/tel.2016.64071 Heteroskedasticity-Consistent Covariance

More information

DIMENSION REDUCTION OF THE EXPLANATORY VARIABLES IN MULTIPLE LINEAR REGRESSION. P. Filzmoser and C. Croux

DIMENSION REDUCTION OF THE EXPLANATORY VARIABLES IN MULTIPLE LINEAR REGRESSION. P. Filzmoser and C. Croux Pliska Stud. Math. Bulgar. 003), 59 70 STUDIA MATHEMATICA BULGARICA DIMENSION REDUCTION OF THE EXPLANATORY VARIABLES IN MULTIPLE LINEAR REGRESSION P. Filzmoser and C. Croux Abstract. In classical multiple

More information

Comparison of Some Improved Estimators for Linear Regression Model under Different Conditions

Comparison of Some Improved Estimators for Linear Regression Model under Different Conditions Florida International University FIU Digital Commons FIU Electronic Theses and Dissertations University Graduate School 3-24-2015 Comparison of Some Improved Estimators for Linear Regression Model under

More information

IDENTIFYING MULTIPLE OUTLIERS IN LINEAR REGRESSION : ROBUST FIT AND CLUSTERING APPROACH

IDENTIFYING MULTIPLE OUTLIERS IN LINEAR REGRESSION : ROBUST FIT AND CLUSTERING APPROACH SESSION X : THEORY OF DEFORMATION ANALYSIS II IDENTIFYING MULTIPLE OUTLIERS IN LINEAR REGRESSION : ROBUST FIT AND CLUSTERING APPROACH Robiah Adnan 2 Halim Setan 3 Mohd Nor Mohamad Faculty of Science, Universiti

More information

Section 2 NABE ASTEF 65

Section 2 NABE ASTEF 65 Section 2 NABE ASTEF 65 Econometric (Structural) Models 66 67 The Multiple Regression Model 68 69 Assumptions 70 Components of Model Endogenous variables -- Dependent variables, values of which are determined

More information

A Comparison between Biased and Unbiased Estimators in Ordinary Least Squares Regression

A Comparison between Biased and Unbiased Estimators in Ordinary Least Squares Regression Journal of Modern Alied Statistical Methods Volume Issue Article 7 --03 A Comarison between Biased and Unbiased Estimators in Ordinary Least Squares Regression Ghadban Khalaf King Khalid University, Saudi

More information

Leverage effects on Robust Regression Estimators

Leverage effects on Robust Regression Estimators Leverage effects on Robust Regression Estimators David Adedia 1 Atinuke Adebanji 2 Simon Kojo Appiah 2 1. Department of Basic Sciences, School of Basic and Biomedical Sciences, University of Health and

More information

Improved Liu Estimators for the Poisson Regression Model

Improved Liu Estimators for the Poisson Regression Model www.ccsenet.org/isp International Journal of Statistics and Probability Vol., No. ; May 202 Improved Liu Estimators for the Poisson Regression Model Kristofer Mansson B. M. Golam Kibria Corresponding author

More information

MULTIVARIATE TECHNIQUES, ROBUSTNESS

MULTIVARIATE TECHNIQUES, ROBUSTNESS MULTIVARIATE TECHNIQUES, ROBUSTNESS Mia Hubert Associate Professor, Department of Mathematics and L-STAT Katholieke Universiteit Leuven, Belgium mia.hubert@wis.kuleuven.be Peter J. Rousseeuw 1 Senior Researcher,

More information

Final Review. Yang Feng. Yang Feng (Columbia University) Final Review 1 / 58

Final Review. Yang Feng.   Yang Feng (Columbia University) Final Review 1 / 58 Final Review Yang Feng http://www.stat.columbia.edu/~yangfeng Yang Feng (Columbia University) Final Review 1 / 58 Outline 1 Multiple Linear Regression (Estimation, Inference) 2 Special Topics for Multiple

More information

A Modified M-estimator for the Detection of Outliers

A Modified M-estimator for the Detection of Outliers A Modified M-estimator for the Detection of Outliers Asad Ali Department of Statistics, University of Peshawar NWFP, Pakistan Email: asad_yousafzay@yahoo.com Muhammad F. Qadir Department of Statistics,

More information

Improved Feasible Solution Algorithms for. High Breakdown Estimation. Douglas M. Hawkins. David J. Olive. Department of Applied Statistics

Improved Feasible Solution Algorithms for. High Breakdown Estimation. Douglas M. Hawkins. David J. Olive. Department of Applied Statistics Improved Feasible Solution Algorithms for High Breakdown Estimation Douglas M. Hawkins David J. Olive Department of Applied Statistics University of Minnesota St Paul, MN 55108 Abstract High breakdown

More information

Generalized Maximum Entropy Estimators: Applications to the Portland Cement Dataset

Generalized Maximum Entropy Estimators: Applications to the Portland Cement Dataset The Open Statistics & Probability Journal 2011 3 13-20 13 Open Access Generalized Maximum Entropy Estimators: Applications to the Portland Cement Dataset Fikri Akdeniz a* Altan Çabuk b and Hüseyin Güler

More information

Small Sample Corrections for LTS and MCD

Small Sample Corrections for LTS and MCD myjournal manuscript No. (will be inserted by the editor) Small Sample Corrections for LTS and MCD G. Pison, S. Van Aelst, and G. Willems Department of Mathematics and Computer Science, Universitaire Instelling

More information

Regression Estimation in the Presence of Outliers: A Comparative Study

Regression Estimation in the Presence of Outliers: A Comparative Study International Journal of Probability and Statistics 2016, 5(3): 65-72 DOI: 10.5923/j.ijps.20160503.01 Regression Estimation in the Presence of Outliers: A Comparative Study Ahmed M. Gad 1,*, Maha E. Qura

More information

On Monitoring Shift in the Mean Processes with. Vector Autoregressive Residual Control Charts of. Individual Observation

On Monitoring Shift in the Mean Processes with. Vector Autoregressive Residual Control Charts of. Individual Observation Applied Mathematical Sciences, Vol. 8, 14, no. 7, 3491-3499 HIKARI Ltd, www.m-hikari.com http://dx.doi.org/.12988/ams.14.44298 On Monitoring Shift in the Mean Processes with Vector Autoregressive Residual

More information

Introduction Robust regression Examples Conclusion. Robust regression. Jiří Franc

Introduction Robust regression Examples Conclusion. Robust regression. Jiří Franc Robust regression Robust estimation of regression coefficients in linear regression model Jiří Franc Czech Technical University Faculty of Nuclear Sciences and Physical Engineering Department of Mathematics

More information

APPLICATION OF RIDGE REGRESSION TO MULTICOLLINEAR DATA

APPLICATION OF RIDGE REGRESSION TO MULTICOLLINEAR DATA Journal of Research (Science), Bahauddin Zakariya University, Multan, Pakistan. Vol.15, No.1, June 2004, pp. 97-106 ISSN 1021-1012 APPLICATION OF RIDGE REGRESSION TO MULTICOLLINEAR DATA G. R. Pasha 1 and

More information

MIT Spring 2015

MIT Spring 2015 Regression Analysis MIT 18.472 Dr. Kempthorne Spring 2015 1 Outline Regression Analysis 1 Regression Analysis 2 Multiple Linear Regression: Setup Data Set n cases i = 1, 2,..., n 1 Response (dependent)

More information

Linear Models 1. Isfahan University of Technology Fall Semester, 2014

Linear Models 1. Isfahan University of Technology Fall Semester, 2014 Linear Models 1 Isfahan University of Technology Fall Semester, 2014 References: [1] G. A. F., Seber and A. J. Lee (2003). Linear Regression Analysis (2nd ed.). Hoboken, NJ: Wiley. [2] A. C. Rencher and

More information

Robust Methods in Regression Analysis: Comparison and Improvement. Mohammad Abd- Almonem H. Al-Amleh. Supervisor. Professor Faris M.

Robust Methods in Regression Analysis: Comparison and Improvement. Mohammad Abd- Almonem H. Al-Amleh. Supervisor. Professor Faris M. Robust Methods in Regression Analysis: Comparison and Improvement By Mohammad Abd- Almonem H. Al-Amleh Supervisor Professor Faris M. Al-Athari This Thesis was Submitted in Partial Fulfillment of the Requirements

More information

Contents. 1 Review of Residuals. 2 Detecting Outliers. 3 Influential Observations. 4 Multicollinearity and its Effects

Contents. 1 Review of Residuals. 2 Detecting Outliers. 3 Influential Observations. 4 Multicollinearity and its Effects Contents 1 Review of Residuals 2 Detecting Outliers 3 Influential Observations 4 Multicollinearity and its Effects W. Zhou (Colorado State University) STAT 540 July 6th, 2015 1 / 32 Model Diagnostics:

More information

Ill-conditioning and multicollinearity

Ill-conditioning and multicollinearity Linear Algebra and its Applications 2 (2000) 295 05 www.elsevier.com/locate/laa Ill-conditioning and multicollinearity Fikri Öztürk a,, Fikri Akdeniz b a Department of Statistics, Ankara University, Ankara,

More information

Confidence Intervals in Ridge Regression using Jackknife and Bootstrap Methods

Confidence Intervals in Ridge Regression using Jackknife and Bootstrap Methods Chapter 4 Confidence Intervals in Ridge Regression using Jackknife and Bootstrap Methods 4.1 Introduction It is now explicable that ridge regression estimator (here we take ordinary ridge estimator (ORE)

More information

Practical High Breakdown Regression

Practical High Breakdown Regression Practical High Breakdown Regression David J. Olive and Douglas M. Hawkins Southern Illinois University and University of Minnesota February 8, 2011 Abstract This paper shows that practical high breakdown

More information

Singular Value Decomposition Compared to cross Product Matrix in an ill Conditioned Regression Model

Singular Value Decomposition Compared to cross Product Matrix in an ill Conditioned Regression Model International Journal of Statistics and Applications 04, 4(): 4-33 DOI: 0.593/j.statistics.04040.07 Singular Value Decomposition Compared to cross Product Matrix in an ill Conditioned Regression Model

More information

Research Article On the Stochastic Restricted r-k Class Estimator and Stochastic Restricted r-d Class Estimator in Linear Regression Model

Research Article On the Stochastic Restricted r-k Class Estimator and Stochastic Restricted r-d Class Estimator in Linear Regression Model Applied Mathematics, Article ID 173836, 6 pages http://dx.doi.org/10.1155/2014/173836 Research Article On the Stochastic Restricted r-k Class Estimator and Stochastic Restricted r-d Class Estimator in

More information

Stat 5100 Handout #26: Variations on OLS Linear Regression (Ch. 11, 13)

Stat 5100 Handout #26: Variations on OLS Linear Regression (Ch. 11, 13) Stat 5100 Handout #26: Variations on OLS Linear Regression (Ch. 11, 13) 1. Weighted Least Squares (textbook 11.1) Recall regression model Y = β 0 + β 1 X 1 +... + β p 1 X p 1 + ε in matrix form: (Ch. 5,

More information

Two Simple Resistant Regression Estimators

Two Simple Resistant Regression Estimators Two Simple Resistant Regression Estimators David J. Olive Southern Illinois University January 13, 2005 Abstract Two simple resistant regression estimators with O P (n 1/2 ) convergence rate are presented.

More information

Evaluation of a New Estimator

Evaluation of a New Estimator Pertanika J. Sci. & Technol. 16 (2): 107-117 (2008) ISSN: 0128-7680 Universiti Putra Malaysia Press Evaluation of a New Estimator Ng Set Foong 1 *, Low Heng Chin 2 and Quah Soon Hoe 3 1 Department of Information

More information

Small sample corrections for LTS and MCD

Small sample corrections for LTS and MCD Metrika (2002) 55: 111 123 > Springer-Verlag 2002 Small sample corrections for LTS and MCD G. Pison, S. Van Aelst*, and G. Willems Department of Mathematics and Computer Science, Universitaire Instelling

More information

ISyE 691 Data mining and analytics

ISyE 691 Data mining and analytics ISyE 691 Data mining and analytics Regression Instructor: Prof. Kaibo Liu Department of Industrial and Systems Engineering UW-Madison Email: kliu8@wisc.edu Office: Room 3017 (Mechanical Engineering Building)

More information

A Modern Look at Classical Multivariate Techniques

A Modern Look at Classical Multivariate Techniques A Modern Look at Classical Multivariate Techniques Yoonkyung Lee Department of Statistics The Ohio State University March 16-20, 2015 The 13th School of Probability and Statistics CIMAT, Guanajuato, Mexico

More information

Ridge Estimator in Logistic Regression under Stochastic Linear Restrictions

Ridge Estimator in Logistic Regression under Stochastic Linear Restrictions British Journal of Mathematics & Computer Science 15(3): 1-14, 2016, Article no.bjmcs.24585 ISSN: 2231-0851 SCIENCEDOMAIN international www.sciencedomain.org Ridge Estimator in Logistic Regression under

More information

REGRESSION DIAGNOSTICS AND REMEDIAL MEASURES

REGRESSION DIAGNOSTICS AND REMEDIAL MEASURES REGRESSION DIAGNOSTICS AND REMEDIAL MEASURES Lalmohan Bhar I.A.S.R.I., Library Avenue, Pusa, New Delhi 110 01 lmbhar@iasri.res.in 1. Introduction Regression analysis is a statistical methodology that utilizes

More information

Indian Statistical Institute

Indian Statistical Institute Indian Statistical Institute Introductory Computer programming Robust Regression methods with high breakdown point Author: Roll No: MD1701 February 24, 2018 Contents 1 Introduction 2 2 Criteria for evaluating

More information

Solving Homogeneous Systems with Sub-matrices

Solving Homogeneous Systems with Sub-matrices Pure Mathematical Sciences, Vol 7, 218, no 1, 11-18 HIKARI Ltd, wwwm-hikaricom https://doiorg/112988/pms218843 Solving Homogeneous Systems with Sub-matrices Massoud Malek Mathematics, California State

More information

COMPREHENSIVE WRITTEN EXAMINATION, PAPER III FRIDAY AUGUST 26, 2005, 9:00 A.M. 1:00 P.M. STATISTICS 174 QUESTION

COMPREHENSIVE WRITTEN EXAMINATION, PAPER III FRIDAY AUGUST 26, 2005, 9:00 A.M. 1:00 P.M. STATISTICS 174 QUESTION COMPREHENSIVE WRITTEN EXAMINATION, PAPER III FRIDAY AUGUST 26, 2005, 9:00 A.M. 1:00 P.M. STATISTICS 174 QUESTION Answer all parts. Closed book, calculators allowed. It is important to show all working,

More information

A Practical Guide for Creating Monte Carlo Simulation Studies Using R

A Practical Guide for Creating Monte Carlo Simulation Studies Using R International Journal of Mathematics and Computational Science Vol. 4, No. 1, 2018, pp. 18-33 http://www.aiscience.org/journal/ijmcs ISSN: 2381-7011 (Print); ISSN: 2381-702X (Online) A Practical Guide

More information

Diagnostic plot for the identification of high leverage collinearity-influential observations

Diagnostic plot for the identification of high leverage collinearity-influential observations Statistics & Operations Research Transactions SORT 39 (1) January-June 2015, 51-70 ISSN: 1696-2281 eissn: 2013-8830 www.idescat.cat/sort/ Statistics & Operations Research c Institut d Estadstica de Catalunya

More information

A Signed-Rank Test Based on the Score Function

A Signed-Rank Test Based on the Score Function Applied Mathematical Sciences, Vol. 10, 2016, no. 51, 2517-2527 HIKARI Ltd, www.m-hikari.com http://dx.doi.org/10.12988/ams.2016.66189 A Signed-Rank Test Based on the Score Function Hyo-Il Park Department

More information

Acta Universitatis Carolinae. Mathematica et Physica

Acta Universitatis Carolinae. Mathematica et Physica Acta Universitatis Carolinae. Mathematica et Physica TomĂĄĹĄ Jurczyk Ridge least weighted squares Acta Universitatis Carolinae. Mathematica et Physica, Vol. 52 (2011), No. 1, 15--26 Persistent URL: http://dml.cz/dmlcz/143664

More information

Logistic Kernel Estimator and Bandwidth Selection. for Density Function

Logistic Kernel Estimator and Bandwidth Selection. for Density Function International Journal of Contemporary Matematical Sciences Vol. 13, 2018, no. 6, 279-286 HIKARI Ltd, www.m-ikari.com ttps://doi.org/10.12988/ijcms.2018.81133 Logistic Kernel Estimator and Bandwidt Selection

More information

The Effect of a Single Point on Correlation and Slope

The Effect of a Single Point on Correlation and Slope Rochester Institute of Technology RIT Scholar Works Articles 1990 The Effect of a Single Point on Correlation and Slope David L. Farnsworth Rochester Institute of Technology This work is licensed under

More information

Financial Development and Economic Growth in Henan Province Based on Spatial Econometric Model

Financial Development and Economic Growth in Henan Province Based on Spatial Econometric Model International Journal of Contemporary Mathematical Sciences Vol. 12, 2017, no. 5, 209-216 HIKARI Ltd, www.m-hikari.com https://doi.org/10.12988/ijcms.2017.7727 Financial Development and Economic Growth

More information

Double Gamma Principal Components Analysis

Double Gamma Principal Components Analysis Applied Mathematical Sciences, Vol. 12, 2018, no. 11, 523-533 HIKARI Ltd, www.m-hikari.com https://doi.org/10.12988/ams.2018.8455 Double Gamma Principal Components Analysis Ameerah O. Bahashwan, Zakiah

More information

A New Asymmetric Interaction Ridge (AIR) Regression Method

A New Asymmetric Interaction Ridge (AIR) Regression Method A New Asymmetric Interaction Ridge (AIR) Regression Method by Kristofer Månsson, Ghazi Shukur, and Pär Sölander The Swedish Retail Institute, HUI Research, Stockholm, Sweden. Deartment of Economics and

More information

A ROBUST METHOD OF ESTIMATING COVARIANCE MATRIX IN MULTIVARIATE DATA ANALYSIS G.M. OYEYEMI *, R.A. IPINYOMI **

A ROBUST METHOD OF ESTIMATING COVARIANCE MATRIX IN MULTIVARIATE DATA ANALYSIS G.M. OYEYEMI *, R.A. IPINYOMI ** ANALELE ŞTIINłIFICE ALE UNIVERSITĂłII ALEXANDRU IOAN CUZA DIN IAŞI Tomul LVI ŞtiinŃe Economice 9 A ROBUST METHOD OF ESTIMATING COVARIANCE MATRIX IN MULTIVARIATE DATA ANALYSIS G.M. OYEYEMI, R.A. IPINYOMI

More information

Quantitative Methods I: Regression diagnostics

Quantitative Methods I: Regression diagnostics Quantitative Methods I: Regression University College Dublin 10 December 2014 1 Assumptions and errors 2 3 4 Outline Assumptions and errors 1 Assumptions and errors 2 3 4 Assumptions: specification Linear

More information

Outlier detection and variable selection via difference based regression model and penalized regression

Outlier detection and variable selection via difference based regression model and penalized regression Journal of the Korean Data & Information Science Society 2018, 29(3), 815 825 http://dx.doi.org/10.7465/jkdi.2018.29.3.815 한국데이터정보과학회지 Outlier detection and variable selection via difference based regression

More information

9. Robust regression

9. Robust regression 9. Robust regression Least squares regression........................................................ 2 Problems with LS regression..................................................... 3 Robust regression............................................................

More information

ROBUST ESTIMATION OF A CORRELATION COEFFICIENT: AN ATTEMPT OF SURVEY

ROBUST ESTIMATION OF A CORRELATION COEFFICIENT: AN ATTEMPT OF SURVEY ROBUST ESTIMATION OF A CORRELATION COEFFICIENT: AN ATTEMPT OF SURVEY G.L. Shevlyakov, P.O. Smirnov St. Petersburg State Polytechnic University St.Petersburg, RUSSIA E-mail: Georgy.Shevlyakov@gmail.com

More information

A Comparison of Robust Estimators Based on Two Types of Trimming

A Comparison of Robust Estimators Based on Two Types of Trimming Submitted to the Bernoulli A Comparison of Robust Estimators Based on Two Types of Trimming SUBHRA SANKAR DHAR 1, and PROBAL CHAUDHURI 1, 1 Theoretical Statistics and Mathematics Unit, Indian Statistical

More information

Fast and robust bootstrap for LTS

Fast and robust bootstrap for LTS Fast and robust bootstrap for LTS Gert Willems a,, Stefan Van Aelst b a Department of Mathematics and Computer Science, University of Antwerp, Middelheimlaan 1, B-2020 Antwerp, Belgium b Department of

More information

A Proposed Nth Order Jackknife Ridge Estimator for Linear Regression Designs

A Proposed Nth Order Jackknife Ridge Estimator for Linear Regression Designs ISSN 2224-584 (Paper) ISSN 2225-522 (Online) Vol.3, No.8, 213 A Proposed Nth Order Jackknife Ridge Estimator for Linear Regression s Mbe Egom Nja Dept. of Mathematics, Federal University Lafia, Nigeria.

More information

Outlier Detection via Feature Selection Algorithms in

Outlier Detection via Feature Selection Algorithms in Int. Statistical Inst.: Proc. 58th World Statistical Congress, 2011, Dublin (Session CPS032) p.4638 Outlier Detection via Feature Selection Algorithms in Covariance Estimation Menjoge, Rajiv S. M.I.T.,

More information

EFFICIENCY of the PRINCIPAL COMPONENT LIU- TYPE ESTIMATOR in LOGISTIC REGRESSION

EFFICIENCY of the PRINCIPAL COMPONENT LIU- TYPE ESTIMATOR in LOGISTIC REGRESSION EFFICIENCY of the PRINCIPAL COMPONEN LIU- YPE ESIMAOR in LOGISIC REGRESSION Authors: Jibo Wu School of Mathematics and Finance, Chongqing University of Arts and Sciences, Chongqing, China, linfen52@126.com

More information

Bayesian and Non Bayesian Estimations for. Birnbaum-Saunders Distribution under Accelerated. Life Testing Based oncensoring sampling

Bayesian and Non Bayesian Estimations for. Birnbaum-Saunders Distribution under Accelerated. Life Testing Based oncensoring sampling Applied Mathematical Sciences, Vol. 7, 2013, no. 66, 3255-3269 HIKARI Ltd, www.m-hikari.com http://dx.doi.org/10.12988/ams.2013.34232 Bayesian and Non Bayesian Estimations for Birnbaum-Saunders Distribution

More information

Regression Analysis By Example

Regression Analysis By Example Regression Analysis By Example Third Edition SAMPRIT CHATTERJEE New York University ALI S. HADI Cornell University BERTRAM PRICE Price Associates, Inc. A Wiley-Interscience Publication JOHN WILEY & SONS,

More information

Approximations to the t Distribution

Approximations to the t Distribution Applied Mathematical Sciences, Vol. 9, 2015, no. 49, 2445-2449 HIKARI Ltd, www.m-hikari.com http://dx.doi.org/10.12988/ams.2015.52148 Approximations to the t Distribution Bashar Zogheib 1 and Ali Elsaheli

More information

Advanced Engineering Statistics - Section 5 - Jay Liu Dept. Chemical Engineering PKNU

Advanced Engineering Statistics - Section 5 - Jay Liu Dept. Chemical Engineering PKNU Advanced Engineering Statistics - Section 5 - Jay Liu Dept. Chemical Engineering PKNU Least squares regression What we will cover Box, G.E.P., Use and abuse of regression, Technometrics, 8 (4), 625-629,

More information

Regularized Common Factor Analysis

Regularized Common Factor Analysis New Trends in Psychometrics 1 Regularized Common Factor Analysis Sunho Jung 1 and Yoshio Takane 1 (1) Department of Psychology, McGill University, 1205 Dr. Penfield Avenue, Montreal, QC, H3A 1B1, Canada

More information

Ensemble Spatial Autoregressive Model on. the Poverty Data in Java

Ensemble Spatial Autoregressive Model on. the Poverty Data in Java Applied Mathematical Sciences, Vol. 9, 2015, no. 43, 2103-2110 HIKARI Ltd, www.m-hikari.com http://dx.doi.org/10.12988/ams.2015.4121034 Ensemble Spatial Autoregressive Model on the Poverty Data in Java

More information

Research Article A Nonparametric Two-Sample Wald Test of Equality of Variances

Research Article A Nonparametric Two-Sample Wald Test of Equality of Variances Advances in Decision Sciences Volume 211, Article ID 74858, 8 pages doi:1.1155/211/74858 Research Article A Nonparametric Two-Sample Wald Test of Equality of Variances David Allingham 1 andj.c.w.rayner

More information

A nonparametric two-sample wald test of equality of variances

A nonparametric two-sample wald test of equality of variances University of Wollongong Research Online Faculty of Informatics - Papers (Archive) Faculty of Engineering and Information Sciences 211 A nonparametric two-sample wald test of equality of variances David

More information

Business Statistics. Tommaso Proietti. Linear Regression. DEF - Università di Roma 'Tor Vergata'

Business Statistics. Tommaso Proietti. Linear Regression. DEF - Università di Roma 'Tor Vergata' Business Statistics Tommaso Proietti DEF - Università di Roma 'Tor Vergata' Linear Regression Specication Let Y be a univariate quantitative response variable. We model Y as follows: Y = f(x) + ε where

More information

Introduction to Linear regression analysis. Part 2. Model comparisons

Introduction to Linear regression analysis. Part 2. Model comparisons Introduction to Linear regression analysis Part Model comparisons 1 ANOVA for regression Total variation in Y SS Total = Variation explained by regression with X SS Regression + Residual variation SS Residual

More information

Empirical Comparison of ML and UMVU Estimators of the Generalized Variance for some Normal Stable Tweedie Models: a Simulation Study

Empirical Comparison of ML and UMVU Estimators of the Generalized Variance for some Normal Stable Tweedie Models: a Simulation Study Applied Mathematical Sciences, Vol. 10, 2016, no. 63, 3107-3118 HIKARI Ltd, www.m-hikari.com https://doi.org/10.12988/ams.2016.69238 Empirical Comparison of and Estimators of the Generalized Variance for

More information

COMPARISON OF THE ESTIMATORS OF THE LOCATION AND SCALE PARAMETERS UNDER THE MIXTURE AND OUTLIER MODELS VIA SIMULATION

COMPARISON OF THE ESTIMATORS OF THE LOCATION AND SCALE PARAMETERS UNDER THE MIXTURE AND OUTLIER MODELS VIA SIMULATION (REFEREED RESEARCH) COMPARISON OF THE ESTIMATORS OF THE LOCATION AND SCALE PARAMETERS UNDER THE MIXTURE AND OUTLIER MODELS VIA SIMULATION Hakan S. Sazak 1, *, Hülya Yılmaz 2 1 Ege University, Department

More information