A Proposed Nth Order Jackknife Ridge Estimator for Linear Regression Designs


ISSN 2224-5804 (Paper) ISSN 2225-0522 (Online) Vol.3, No.8, 2013

A Proposed Nth Order Jackknife Ridge Estimator for Linear Regression Designs

Mbe Egom Nja
Dept. of Mathematics, Federal University Lafia, Nigeria.
e-mail address: mbe_nja@yahoo.com

Abstract
Several remediation measures have been developed to circumvent the problem of collinearity in General Linear Regression Designs. These include the Generalized Ridge, Jackknife Ridge and second-order Jackknife Ridge estimation procedures. In this paper, an nth-order Jackknife Ridge estimator is developed using a canonical parameter transformation. Using the MATLAB version 7 software, parameter estimates, biases and variances of these estimators are computed to show their behavior and strengths. The results show that the parameter estimates are basically the same for all the methods. There is variance reduction at the Generalized Ridge estimator and at the ordered Jackknife Ridge estimators, though the Generalized Ridge estimator is slightly superior in this respect. As the order of the Jackknife Ridge estimator increases, the variance decreases up to a certain nth order and remains constant thereafter. Where the variances of two consecutive estimators are the same or nearly so, the last but one estimator is considered optimal. This establishes a convergence criterion for the sequence of Jackknife Ridge estimators. It is shown from the five illustrative design matrices that higher-order Jackknife Ridge estimators are superior to lower-order Jackknife Ridge estimators in terms of bias, thus further solving the problem of bias introduced by the Ridge estimator.

Keywords: Canonical transformation; collinearity; mean square error; positive definite matrix; squared bias; variance inflation.

1. INTRODUCTION
One of the major consequences of multicollinearity on the Ordinary Least Squares (OLS) method of estimation is that it produces large variances for the estimated regression coefficients (Batah et al., 2008; Batah, 2011; Khurana, Chaubey & Chandra, 2012). This leads to poor prediction in certain regions of the regression space (Lesaffre & Marx, 1993), as one is unable to determine the effect of each predictor on the response. A set of variables is said to be collinear if one or more variables in the set can be expressed exactly or nearly as a linear combination of the others in the set. To remedy this situation, several measures have been offered. These include the Ridge regression method (Hoerl & Kennard, 1970; Lesaffre & Marx, 1993; Carley & Natalia, 2004; Belsley et al., 1980; Hawkins & Yin, 2002; Kleinbaum et al., 1998; Maddala, 1992; Ngo et al., 2003; Mardikyan & Cetin, 2008) and the iterative principal component method (Marx & Smith, 1990b). Ridge regression seeks to find a set of regression coefficients that is more stable in the sense of having a small Mean Square Error (MSE), since multicollinearity (collinearity) usually results in ordinary least squares estimators that may have extremely large variances (Nelder & Wedderburn, 1972; Carley et al., 2004). The ridge technique enlarges the small eigenvalue(s), thus decreasing the total MSE of the OLS estimator, which in the canonical parameterization is

MSE(\hat{\gamma}_{OLS}) = \sigma^2 \sum_{j=1}^{p} \frac{1}{\lambda_j}

where \lambda_j is the jth eigenvalue of X'X. A large MSE suggests that the estimated parameters may be far from the true ones; that is, the variances of the parameter estimates are inflated. Khurana et al. (2012) developed the second-order Jackknife Ridge estimator (J2R) to further reduce the bias associated with the Ordinary Ridge estimator. The J2R estimator is superior to the Jackknife Ridge estimator (JRE) in terms of bias.
They also showed that the Jackknife Ridge estimator (JRE) is superior to the Generalized Ridge estimator (GRE), which in turn has a lower bias than the Modified Jackknife Ridge estimator (MJRE). In this paper a third-order Jackknife estimator is developed and a general nth-order Jackknife estimator is proposed as a result. The following design matrices, each with collinearity among columns, are used for illustration (rows listed in MATLAB notation):

X1 = [1,1,1,1; 2,3,4,5; 4,6,8,10; 15,22,19,40]
X2 = [1,1,1,1; 2,3,2,4; 4,6,4,8; 15,22,19,40]
X3 = [1,1,1,1; 1,2,3,4; 2,4,6,8; 15,22,19,40]
X4 = [1,1,1,1; 3,4,3,5; 6,8,6,10; 15,22,19,40]
X5 = [1,1,1,1; 4,5,4,6; 8,10,8,12; 15,22,19,40]

The result of the analysis shows that the bias of the Jackknife Ridge estimators, from the first order up to the fifth order, reduces at a rate of about 10^-4 per order. For instance, using matrix 1, the second eigenvalue \lambda_2 and the 4th category, the sequence of biases for the indicated estimators is as follows: 0.397 x 10^-3, 4.943 x 10^-7, 0.614 x 10^-11, 0.763 x 10^-15, 0.949 x 10^-19. The variances of the Jackknife estimators are basically constant but smaller than the variance of the Ordinary Least Squares estimator. Thus the advantage of the Ridge regression estimator over the Ordinary Least Squares estimator is further improved by the higher-order Jackknife Ridge estimators.

The rest of the paper is organized as follows. In section 2, the model and Ridge estimators are introduced. Section 3 contains the proposed estimators. The comparison of the bias of the estimators is made in section 4, while section 5 contains the comparison of the MSE of the estimators. Section 6 contains the illustrative examples. The conclusion is made in section 7.

2. THE MODEL AND RIDGE ESTIMATORS
Consider a multiple linear regression model

Y = X\beta + \varepsilon    (2.1)

where Y is an n x 1 vector of observations, \beta is a p x 1 vector of unknown regression coefficients, X is an n x p matrix of explanatory variables and \varepsilon is an n x 1 vector of errors, the elements of which are assumed to be independently and identically normally distributed with E(\varepsilon) = 0 and Var(\varepsilon) = \sigma^2 I.

The ordinary least squares estimator, the best linear unbiased estimator corresponding to (2.1), is given as

\hat{\beta}_{OLS} = (X'X)^{-1} X'Y    (2.2)

Although the OLS estimator in (2.2) is unbiased, it has the problem of inflated variance when collinearity is present, which may result in estimates that are not in tune with the researcher's prior belief. As a remediation measure, Hoerl & Kennard (1970) proposed the Ridge estimator, which solves the problem of inflated variance but is biased. It is given as

\hat{\beta}_{RR} = (X'X + kI)^{-1} X'Y    (2.3)

for some biasing constant k > 0. To alleviate the problem of bias in Ridge regression, Singh et al. (1986) proposed an Almost Unbiased Generalized Ridge estimator (AUGRR) using the Jackknife technique, which was originally introduced by Quenouille (1956). The Ridge regression estimator has undergone several modifications over the years. Batah et al. (2008) introduced the Modified Jackknife Ridge estimator (MJRE) by combining the ideas of the GRE and Jackknife Ridge estimators. Batah (2011) introduced another variant known as the Generalized Jackknife Ridge estimator (GJR) together with its associated Generalized Jackknife Ordinary Ridge estimator (GOJR). The Generalized Ridge estimator (GRE) leads to a reduction in the sampling variance, whereas the Jackknife Ridge estimator (JRE) leads to a reduction of bias. Khurana, Chaubey & Chandra (2012) suggested a new Ridge estimator called the second-order Jackknife Ridge estimator (J2R) to further reduce bias. In this paper, a generalization to an nth-order Jackknife Ridge estimator is proposed for further reduction of bias. A convergence criterion is designed to select the optimal n, so that the estimator with minimum n is selected for the most reduced bias and minimum variance.

In canonical form, model (2.1) can be written as

Y = Z\gamma + \varepsilon    (2.4)

where Z = XG, \gamma = G'\beta and G is the orthogonal matrix of eigenvectors of X'X, so that

Z'Z = G'X'XG = \Lambda = diag(\lambda_1, \ldots, \lambda_p)

where \lambda_i is the ith eigenvalue of X'X. The OLS estimator of \gamma is

\hat{\gamma}_{OLS} = (Z'Z)^{-1} Z'Y = \Lambda^{-1} Z'Y

and thus the OLS estimator of \beta is given as

\hat{\beta}_{OLS} = G \hat{\gamma}_{OLS}    (2.5)

The ordinary Ridge regression estimator (ORE) of \gamma is

\hat{\gamma}_{ORE} = (I + r\Lambda^{-1})^{-1} \hat{\gamma}_{OLS} = (\Lambda + rI)^{-1} Z'Y    (2.6)

where r_1 = r_2 = \cdots = r_p = r \ge 0 (r = biasing constant), R = diag(r_i) and \Lambda = diag(\lambda_i). This is generalized, by allowing a separate biasing constant for each component, to obtain the Generalized Ridge estimator of \gamma:

\hat{\gamma}_{GRE} = (\Lambda + R)^{-1} Z'Y = (\Lambda + R)^{-1} \Lambda \hat{\gamma}_{OLS} = [I - A] \hat{\gamma}_{OLS}, where A = R(\Lambda + R)^{-1}    (2.7)

where \Lambda is a diagonal matrix whose non-negative entries are the eigenvalues of X'X and R is a diagonal matrix whose non-negative entries are the biasing constants. The corresponding estimator of \beta is

\hat{\beta}_{GRE} = G \hat{\gamma}_{GRE}    (2.8)
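To make the canonical transformation and the Generalized Ridge estimator concrete, here is a minimal numerical sketch in Python/NumPy (not part of the original paper): it uses the first design matrix X1 as listed in Section 1, while the response vector y and the common biasing constant r are purely illustrative assumptions.

    import numpy as np

    # First illustrative design matrix (rows as listed in Section 1); its third row is
    # twice the second, so X'X is (near-)singular and OLS variances blow up.
    X = np.array([[1.0, 1, 1, 1],
                  [2, 3, 4, 5],
                  [4, 6, 8, 10],
                  [15, 22, 19, 40]])
    y = np.array([1.0, 2.0, 3.0, 4.0])   # assumed response vector, not from the paper
    r = 0.1                              # assumed common biasing constant

    # Canonical transformation: Z = XG with G the eigenvectors of X'X, so Z'Z = Lambda.
    lam, G = np.linalg.eigh(X.T @ X)
    Z = X @ G

    # OLS estimate of gamma (a pseudo-inverse guards the essentially zero eigenvalue), eq. (2.5).
    gamma_ols = np.linalg.pinv(np.diag(lam)) @ Z.T @ y

    # Generalized Ridge estimate, eq. (2.7): gamma_GRE = (Lambda + R)^{-1} Z'y,
    # which equals (I - A) gamma_OLS with A = R(Lambda + R)^{-1} when Lambda is nonsingular.
    R = r * np.eye(X.shape[1])
    gamma_gre = np.linalg.solve(np.diag(lam) + R, Z.T @ y)

    print("eigenvalues of X'X:", lam)    # the near-zero eigenvalue signals collinearity
    print("beta_OLS =", G @ gamma_ols)   # back-transform beta = G gamma
    print("beta_GRE =", G @ gamma_gre)

The near-zero eigenvalue printed first is exactly the kind of small \lambda_j that inflates \sigma^2 \sum_j 1/\lambda_j for OLS and that the ridge family of estimators is designed to handle.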

The Jackknife Ridge estimator (JRE) of \gamma is

\hat{\gamma}_{JRE} = [I - A][I + A] \hat{\gamma}_{OLS} = [I - A^2] \hat{\gamma}_{OLS}    (2.9)

The second-order Jackknife Ridge estimator is

\hat{\gamma}_{J2R} = [I - A_1^2] \hat{\gamma}_{OLS}, where A_1 = A^2    (2.10)

3. THE PROPOSED ESTIMATOR
By extending the second-order Jackknife Ridge estimator to a proposed third-order Jackknife Ridge estimator, a general nth-order Jackknife Ridge estimator is proposed. The proposed third-order Jackknife Ridge estimator (J3R), denoted by \hat{\gamma}_{J3R}, is defined as

\hat{\gamma}_{J3R} = [I - A_1^3] \hat{\gamma}_{OLS}    (3.1)

The proposed nth-order Jackknife Ridge regression estimator (JnR) is defined as

\hat{\gamma}_{JnR} = [I - A_1^n] \hat{\gamma}_{OLS}, n = 1, 2, \ldots    (3.2)

It is observed that an optimal n is achieved when the sequence of the variances of the estimators converges to the variance of the ordinary least squares estimator, that is, Var(\hat{\gamma}_{JnR}) \to Var(\hat{\gamma}_{OLS}) as n increases.

4. COMPARISON OF THE BIAS OF J2R, J3R AND JnR ESTIMATORS
The following theorems compare the performance of J2R and J3R, and that of J(n-1)R and JnR, in terms of bias. The comparison is similar to that of Khurana et al. (2012).

Theorem 1. Let R be a diagonal matrix with non-negative entries. Then the third-order Jackknife Ridge estimator \hat{\gamma}_{J3R} reduces the bias of the second-order Jackknife Ridge estimator \hat{\gamma}_{J2R} component-wise, assuming r_i > 0.

Proof. Recall that the Jackknife Ridge estimator of \gamma is \hat{\gamma}_{JRE} = [I - A^2]\hat{\gamma}_{OLS} and the second-order Jackknife Ridge estimator is \hat{\gamma}_{J2R} = [I - A_1^2]\hat{\gamma}_{OLS}, where A_1 = A^2. Then

Bias(\hat{\gamma}_{JRE}) = E(\hat{\gamma}_{JRE}) - \gamma = -A^2 \gamma    (4.1)

Similarly,

Bias(\hat{\gamma}_{J2R}) = E(\hat{\gamma}_{J2R}) - \gamma = -A_1^2 \gamma    (4.2)

and

Bias(\hat{\gamma}_{J3R}) = E(\hat{\gamma}_{J3R}) - \gamma = -A_1^3 \gamma    (4.3)

Let |.|_i denote the absolute value of the ith component. Then

|Bias(\hat{\gamma}_{J2R})|_i = \left( \frac{r_i^2}{(\lambda_i + r_i)^2} \right)^2 |\gamma_i| = \frac{r_i^4}{(\lambda_i + r_i)^4} |\gamma_i|    (4.4)

and

|Bias(\hat{\gamma}_{J3R})|_i = \left( \frac{r_i^2}{(\lambda_i + r_i)^2} \right)^3 |\gamma_i| = \frac{r_i^6}{(\lambda_i + r_i)^6} |\gamma_i|    (4.5)

Thus

|Bias(\hat{\gamma}_{J2R})|_i - |Bias(\hat{\gamma}_{J3R})|_i = \left[ \frac{r_i^4}{(\lambda_i + r_i)^4} - \frac{r_i^6}{(\lambda_i + r_i)^6} \right] |\gamma_i| = \frac{\lambda_i^2 r_i^4 + 2\lambda_i r_i^5}{(\lambda_i + r_i)^6} |\gamma_i| > 0, as r_i > 0    (4.6)

This proves that the third-order Jackknife reduces the bias of the second-order Jackknife Ridge estimator. It has earlier been proven (Khurana et al., 2012) that the second-order Jackknife reduces the bias of the Jackknife Ridge estimator, which in turn reduces the bias of the ordinary Ridge estimator. This paper proposes a general extension of the third-order Jackknife Ridge estimator to an nth-order Jackknife Ridge estimator, since it is shown that higher-order Jackknife Ridge estimators have lower biases.

Theorem 2. Let R be a p x p diagonal matrix with non-negative entries. Then the nth-order Jackknife Ridge estimator \hat{\gamma}_{JnR} reduces the bias of the (n-1)th-order Jackknife Ridge estimator \hat{\gamma}_{J(n-1)R}, assuming r_i in (0,1).

Proof. The Jackknife, second-order Jackknife and third-order Jackknife Ridge estimators of \gamma are given respectively as

\hat{\gamma}_{JRE} = [I - A_1] \hat{\gamma}_{OLS}, \hat{\gamma}_{J2R} = [I - A_1^2] \hat{\gamma}_{OLS} and \hat{\gamma}_{J3R} = [I - A_1^3] \hat{\gamma}_{OLS}, where A_1 = A^2.

Thus the (n-1)th-order Jackknife Ridge estimator of \gamma can be defined as

\hat{\gamma}_{J(n-1)R} = [I - A_1^{n-1}] \hat{\gamma}_{OLS}    (4.7)

and the nth-order Jackknife Ridge estimator is \hat{\gamma}_{JnR} = [I - A_1^{n}] \hat{\gamma}_{OLS}, so that

Bias(\hat{\gamma}_{JnR}) = E(\hat{\gamma}_{JnR}) - \gamma = -A_1^{n} \gamma    (4.8)

Let |.|_i denote the absolute value of the ith component. Then

|Bias(\hat{\gamma}_{J(n-1)R})|_i = \left( \frac{r_i^2}{(\lambda_i + r_i)^2} \right)^{n-1} |\gamma_i| = \frac{r_i^{2(n-1)}}{(\lambda_i + r_i)^{2(n-1)}} |\gamma_i|    (4.9)

|Bias(\hat{\gamma}_{JnR})|_i = \frac{r_i^{2n}}{(\lambda_i + r_i)^{2n}} |\gamma_i|    (4.10)

Hence

|Bias(\hat{\gamma}_{J(n-1)R})|_i - |Bias(\hat{\gamma}_{JnR})|_i = \left[ \frac{r_i^{2n-2}}{(\lambda_i + r_i)^{2n-2}} - \frac{r_i^{2n}}{(\lambda_i + r_i)^{2n}} \right] |\gamma_i| = \frac{r_i^{2n-2} [(\lambda_i + r_i)^2 - r_i^2]}{(\lambda_i + r_i)^{2n}} |\gamma_i| > 0    (4.11)

since r_i in (0,1). This proves that the nth-order Jackknife estimator reduces the bias of the (n-1)th-order Jackknife Ridge estimator.

The biases of the ordered Jackknife Ridge estimators can therefore be written as follows:

Bias(\hat{\gamma}_{J2R}) = -A_1^2 \gamma, Bias(\hat{\gamma}_{J3R}) = -A_1^3 \gamma, \ldots, Bias(\hat{\gamma}_{JnR}) = -A_1^{n} \gamma.

Khurana, Chaubey & Chandra (2012) proved that the difference of the total squared biases of the Jackknife Ridge estimator (JRE) and the second-order Jackknife Ridge estimator (J2R) of \beta, given by

\Delta_1 = \sum_{i=1}^{p} \left[ Bias(\hat{\beta}_{JRE})_i^2 - Bias(\hat{\beta}_{J2R})_i^2 \right]    (4.12)

is non-negative, and that it is strictly positive if at least one r_i, i = 1, \ldots, p, is positive. In the theorem that follows, they further proved the corresponding result for the difference between the Modified Jackknife Ridge (MJR) and the Generalized Ridge (GRE) estimators.

Theorem 3. Let R be a (p x p) diagonal matrix with non-negative entries. Then the difference of the total squared biases of the Modified Jackknife Ridge (MJR) and Generalized Ridge (GRE) estimators of \beta, given by

\Delta_2 = \sum_{i=1}^{p} \left[ |Bias(\hat{\beta}_{MJR})|_i^2 - |Bias(\hat{\beta}_{GRE})|_i^2 \right],

is positive.

Proof. The Modified Jackknife Ridge estimator can be written as

\hat{\gamma}_{MJR} = [I - A^2][I - A] \hat{\gamma}_{OLS} = [I - \Phi] \hat{\gamma}_{OLS}, where \Phi = A(I + A - A^2)    (4.13)

so that Bias(\hat{\gamma}_{MJR}) = -\Phi \gamma, while \hat{\gamma}_{GRE} = [I - A] \hat{\gamma}_{OLS} gives Bias(\hat{\gamma}_{GRE}) = -A \gamma.    (4.14)

Component-wise, writing a_i = r_i/(\lambda_i + r_i) for the ith diagonal entry of A,

|Bias(\hat{\gamma}_{MJR})|_i^2 - |Bias(\hat{\gamma}_{GRE})|_i^2 = [a_i^2 (1 + a_i - a_i^2)^2 - a_i^2] \gamma_i^2 = a_i^3 (1 - a_i)(2 + a_i - a_i^2) \gamma_i^2    (4.15)

which is positive, since 0 < a_i < 1 whenever r_i > 0 and \lambda_i > 0. Hence

\Delta_2 = \sum_{i=1}^{p} a_i^3 (1 - a_i)(2 + a_i - a_i^2) \gamma_i^2 > 0.

It has also been established that the total squared bias of the Jackknife Ridge estimator is less than that of the Generalized Ridge estimator (Singh et al., 1986). Putting this together with the foregoing theorem, we have the following ordering:

B(\hat{\beta}_{J2R}) < B(\hat{\beta}_{JRE}) < B(\hat{\beta}_{GRE}) < B(\hat{\beta}_{MJR})

where B denotes total squared bias. Following the theorems, stated and proven in this study, that compare the bias of the second-order and third-order Jackknife Ridge estimators and that of the (n-1)th-order and nth-order Jackknife Ridge estimators, a general ordering of the total squared biases of the Modified Jackknife and ordered Jackknife Ridge estimators can be written as

B(\hat{\beta}_{JnR}) < B(\hat{\beta}_{J(n-1)R}) < \cdots < B(\hat{\beta}_{J2R}) < B(\hat{\beta}_{JRE}) < B(\hat{\beta}_{GRE}) < B(\hat{\beta}_{MJR})

5. COMPARISON OF THE MSE OF J2R, J3R AND JnR ESTIMATORS
Batah et al. (2008) proved that the Mean Square Error (MSE) of the Modified Jackknife Ridge (MJR) estimator is smaller than that of the Jackknife Ridge estimator. Khurana, Chaubey & Chandra (2012) proved, by the theorem that follows, that the JRE dominates the J2R estimator by MSE evaluation.

Theorem 4. Let R be a (p x p) diagonal matrix with non-negative entries. Then the difference of the MSE matrices of the second-order Jackknife Ridge estimator and the Jackknife Ridge estimator,

\Delta = MSE(\hat{\gamma}_{J2R}) - MSE(\hat{\gamma}_{JRE}),    (4.16)

is a positive definite matrix if and only if the following inequality is satisfied:

b_1' [Q + b_2 b_2']^{-1} b_1 \le 1    (4.17)

where b_1 = Bias(\hat{\gamma}_{JRE}) = -A^2 \gamma, b_2 = Bias(\hat{\gamma}_{J2R}) = -A_1^2 \gamma, A_1 = A^2, and

Q = Var(\hat{\gamma}_{J2R}) - Var(\hat{\gamma}_{JRE}) = \sigma^2 [ (I - A_1^2) \Lambda^{-1} (I - A_1^2) - (I - A^2) \Lambda^{-1} (I - A^2) ]    (4.18)

Proof. Using the expression for the variance-covariance matrix of the least squares estimator,

Var(\hat{\gamma}_{OLS}) = \sigma^2 (Z'Z)^{-1} = \sigma^2 \Lambda^{-1},    (4.19)

and the expression for the JRE stated earlier, it can be written that

Var(\hat{\gamma}_{JRE}) = E[(\hat{\gamma}_{JRE} - E(\hat{\gamma}_{JRE}))(\hat{\gamma}_{JRE} - E(\hat{\gamma}_{JRE}))'] = (I - A^2) Var(\hat{\gamma}_{OLS}) (I - A^2) = \sigma^2 (I - A^2) \Lambda^{-1} (I - A^2)    (4.20)

Further, using the expression for Bias(\hat{\gamma}_{JRE}) from (4.1),

MSE(\hat{\gamma}_{JRE}) = \sigma^2 (I - A^2) \Lambda^{-1} (I - A^2) + A^2 \gamma \gamma' A^2    (4.21)

Similarly, using Var(\hat{\gamma}_{J2R}) = \sigma^2 (I - A_1^2) \Lambda^{-1} (I - A_1^2) and the expression for Bias(\hat{\gamma}_{J2R}) from (4.2),

MSE(\hat{\gamma}_{J2R}) = \sigma^2 (I - A_1^2) \Lambda^{-1} (I - A_1^2) + A_1^2 \gamma \gamma' A_1^2    (4.22)

From (4.21) and (4.22),

\Delta = MSE(\hat{\gamma}_{J2R}) - MSE(\hat{\gamma}_{JRE}) = Q + b_2 b_2' - b_1 b_1'    (4.23)

The matrix Q is diagonal with positive entries, so Q + b_2 b_2' is symmetric positive definite. It follows that \Delta is positive definite if and only if the inequality b_1'[Q + b_2 b_2']^{-1} b_1 \le 1 is satisfied.
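The whole family of ordered Jackknife Ridge estimators depends only on the canonical quantities \Lambda, R, \gamma and \sigma^2, so its bias and variance behavior is easy to tabulate numerically. The short Python/NumPy sketch below is illustrative only: the eigenvalues, the "true" \gamma, the error variance and the biasing constants are assumed values, not quantities from the paper's design matrices. It implements \hat{\gamma}_{JnR} = [I - A_1^n]\hat{\gamma}_{OLS} with A_1 = A^2 and A = R(\Lambda + R)^{-1}, and prints the component-wise bias -A_1^n \gamma of eq. (4.8) together with the corresponding variance \sigma^2 (I - A_1^n)\Lambda^{-1}(I - A_1^n).

    import numpy as np

    # Assumed canonical quantities (purely illustrative, not taken from the paper):
    lam    = np.array([250.0, 30.0, 1.5, 0.002])   # eigenvalues of X'X; one tiny value mimics collinearity
    gamma  = np.array([1.0, -2.0, 0.5, 3.0])       # "true" canonical coefficients
    sigma2 = 1.0                                   # error variance
    r      = np.array([0.1, 0.1, 0.1, 0.1])        # per-component biasing constants

    p = lam.size
    Lam_inv = np.diag(1.0 / lam)
    A  = np.diag(r / (lam + r))                    # A  = R (Lambda + R)^{-1}
    A1 = A @ A                                     # A1 = A^2

    def jnr_bias_var(n):
        """Bias and variance of the nth-order Jackknife Ridge estimator (cf. eqs. (4.8) and (4.40))."""
        A1n = np.linalg.matrix_power(A1, n)
        bias = -A1n @ gamma
        var  = sigma2 * (np.eye(p) - A1n) @ Lam_inv @ (np.eye(p) - A1n)
        return bias, np.diag(var)

    for n in range(1, 6):                          # n = 1 is the ordinary Jackknife Ridge estimator (JRE)
        bias, var = jnr_bias_var(n)
        print(f"J{n}R  |bias| = {np.abs(bias)}  var = {var}")
    print("OLS    var =", sigma2 / lam)

With any such inputs the printed |bias| values shrink monotonically in n, as Theorems 1 and 2 assert, while the variances approach the OLS variances from below as n grows, which is the behavior exploited by the convergence criterion of Section 3.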

Theorem 5. Let R be a (p x p) diagonal matrix with non-negative entries. Then the difference of the MSE matrices of the third-order Jackknife Ridge estimator and the second-order Jackknife Ridge estimator,

\Delta = MSE(\hat{\gamma}_{J3R}) - MSE(\hat{\gamma}_{J2R}),    (4.25)

is a positive definite matrix if and only if the following inequality is satisfied:

b_2' [Q_1 + b_3 b_3']^{-1} b_2 \le 1    (4.26)

where b_2 = Bias(\hat{\gamma}_{J2R}) = -A_1^2 \gamma and b_3 = Bias(\hat{\gamma}_{J3R}) = -A_1^3 \gamma, with A_1 = A^2,    (4.27)

and

Q_1 = Var(\hat{\gamma}_{J3R}) - Var(\hat{\gamma}_{J2R}) = \sigma^2 [ (I - A_1^3) \Lambda^{-1} (I - A_1^3) - (I - A_1^2) \Lambda^{-1} (I - A_1^2) ]    (4.28)

Proof. Using the expression for the variance-covariance matrix of the least squares estimator, Var(\hat{\gamma}_{OLS}) = \sigma^2 \Lambda^{-1}, and the expression for the J2R estimator given in (2.10), we have

Var(\hat{\gamma}_{J2R}) = \sigma^2 (I - A_1^2) \Lambda^{-1} (I - A_1^2)    (4.29)

and, using the expression for Bias(\hat{\gamma}_{J2R}) from (4.2),

MSE(\hat{\gamma}_{J2R}) = \sigma^2 (I - A_1^2) \Lambda^{-1} (I - A_1^2) + A_1^2 \gamma \gamma' A_1^2    (4.30)

Similarly, using the expression for Bias(\hat{\gamma}_{J3R}) from (4.3),

Var(\hat{\gamma}_{J3R}) = \sigma^2 (I - A_1^3) \Lambda^{-1} (I - A_1^3)    (4.31)

MSE(\hat{\gamma}_{J3R}) = \sigma^2 (I - A_1^3) \Lambda^{-1} (I - A_1^3) + A_1^3 \gamma \gamma' A_1^3    (4.32)

Thus

\Delta = MSE(\hat{\gamma}_{J3R}) - MSE(\hat{\gamma}_{J2R}) = Q_1 + b_3 b_3' - b_2 b_2'    (4.33)

The matrix Q_1 + b_3 b_3' in (4.33) is symmetric positive definite, and hence \Delta is positive definite if and only if the inequality (4.26) is satisfied.    (4.34)

These proofs (Theorems 4 and 5) show that the J2R does not improve on the JRE, and the J3R does not improve on the J2R, in terms of MSE; by extension we conclude that the JnR does not improve on the J(n-1)R in terms of MSE.

Variance
The variance of the OLS estimator of \gamma is

Var(\hat{\gamma}_{OLS}) = \sigma^2 (Z'Z)^{-1} = \sigma^2 \Lambda^{-1}    (4.35)

where \sigma^2 is estimated by \hat{\sigma}^2 = (Y - Z\hat{\gamma})'(Y - Z\hat{\gamma})/(n - p) with \hat{\gamma} = (Z'Z)^{-1} Z'Y. The variances of the ordered Jackknife Ridge estimators are

Var(\hat{\gamma}_{JRE}) = \sigma^2 (I - A^2) \Lambda^{-1} (I - A^2)    (4.36)

where

I - A^2 = diag(1 - r_1^2/(\lambda_1 + r_1)^2, \ldots, 1 - r_p^2/(\lambda_p + r_p)^2)    (4.37)

Var(\hat{\gamma}_{J2R}) = \sigma^2 (I - A_1^2) \Lambda^{-1} (I - A_1^2)    (4.38)

Var(\hat{\gamma}_{J3R}) = \sigma^2 (I - A_1^3) \Lambda^{-1} (I - A_1^3)    (4.39)

Var(\hat{\gamma}_{JnR}) = \sigma^2 (I - A_1^n) \Lambda^{-1} (I - A_1^n)    (4.40)

These variances are now computed and compared using illustrative examples.

6. ILLUSTRATIVE EXAMPLES
The biases and variances of the OLS, GR, JR, J2R and J3R estimators were computed for the five design matrices, each having two collinear explanatory variables. Results are reported for the parameter estimates in each of the five design cases. Comparison is therefore made for these estimators in terms of bias and variance. The results are tabulated in the tables shown in the appendix.

7. DISCUSSION
The proposed third-order Jackknife estimator is superior to the second-order Jackknife estimator, which in turn is superior to the ordinary Jackknife estimator, in terms of bias (Table 3). It is also proven that the nth-order Jackknife Ridge estimator is superior to the (n-1)th-order Jackknife estimator by bias evaluation. It can then be concluded that higher-order Jackknife Ridge estimators have lower biases, which approach zero as the order n increases. In terms of variance, the higher-order Jackknife estimators are superior to the ordinary least squares estimator, but they converge to the ordinary least squares estimator with increasing n (Table 2). The Jackknife Ridge and the ordered Jackknife Ridge estimators are basically the same in value but differ from the Ordinary Least Squares estimator in all five illustrative examples (Table 1). This shows that the proposed higher-order Jackknife estimators are superior to the existing Ordinary Ridge, Generalized Ridge and second-order Jackknife Ridge estimators.

REFERENCES
1. Batah, F.S.M., Ramanathan, T.V., Gore, S.D. (2008). The Efficiency of Modified Jackknife and Ridge Type Regression Estimators: A Comparison. Surveys in Mathematics and its Applications, 3, 111-122.
2. Batah, F.S.M. (2011). A New Estimator by Generalized Modified Jackknife Regression Estimator. Journal of Basrah Researches (Sciences), 37(4), 138-149.
3. Belsley, D.A., Kuh, E., Welsch, R.E. (1980). Regression Diagnostics: Identifying Influential Data and Sources of Collinearity. Wiley, New York.
4. Carley, Kathleen M., Natalia Y.K. (2004). A Network Optimization Approach for Improving Organizational Design. Carnegie Mellon University, School of Computer Science, Institute for Software Research International, Technical Report CMU-ISRI-4-12.
5. Hawkins, D.M., Yin, X. (2002). A faster algorithm for ridge regression. Computational Statistics and Data Analysis, 40, 253-262.
6. Hoerl, A.E. and Kennard, R.W. (1970). Ridge regression: Biased estimation for non-orthogonal problems. Technometrics, 12, 55-67.
7. Khurana, M., Chaubey, Y.P. and Chandra, S. (2012). Jackknifing the Ridge Regression Estimator: A Revisit. Technical Report No. 1/12, 1-19.
8. Kleinbaum, D.G., Kupper, L.L., Muller, K.E., Nizam, A. (1998). Applied Regression Analysis and Other Multivariable Methods. Duxbury Press, Washington.
9. Lesaffre, E., Marx, B.D. (1993). Collinearity in Generalized Linear Regression. Communications in Statistics - Theory and Methods, 22(7), 1933-1952.
10. Maddala, G.S. (1992). Introduction to Econometrics. Macmillan, New York.
11. Mardikyan, S., Cetin, E. (2008). Efficient Choice of Biasing Constant for Ridge Regression. International Journal of Mathematical Sciences, 3(11), 527-536.
12. Nelder, J. and Wedderburn, R.W.M. (1972). Generalized Linear Models. Journal of the Royal Statistical Society, Series A, 135, 370-384.
13. Ngo, S.H., Kemeny, S., Deak, A. (2003). Performance of the ridge regression method as applied to complex linear and nonlinear models. Chemometrics and Intelligent Laboratory Systems, 67, 69-78.
14. Quenouille, M.H. (1956). Notes on bias in estimation. Biometrika, 43, 353-360.
15. Singh, B., Chaubey, Y.P. and Dwivedi, T.D. (1986). An almost unbiased ridge estimator. Sankhya, B48, 342-360.

Appendix

Table 1: Parameter estimates (OLS, GR, JR, J2R and J3R estimators; design matrices 1-5)
Matrix 1 Matrix 2 Matrix 3 Matrix 4 Matrix 5 / OLS GR JR J2R J3R
.827 1.2156 .8177 .8178 .135 .9229 .1349 .1349 3.2 2.9316 3.1997 3.1997 -14.4799 -14.4799 5.7926 31.5286 -88.7767 -8.751 -14.2486 -14.2486 5.9836 -151.2359 199.851 -2.1993 -14.2539 -14.2539 5.75 31.2979 -71.8928 -9.8686 -12.4487 -12.4487 5.6449 -138.548 153.9237 -15.4917 -14.2539 -14.2539 5.751 31.523 -86.5238 -8.92 -1.877 -1.877 5.3486 -149.1688 19.851 -19.2119 -14.4764 -14.4764 5.7919 31.5276 -88.715 -8.7112 -8.2765 -8.2765 4.8613 -151.1533 199.4192 -2.1524 .8178 .1349 3.1997 -14.4798 -14.4798 5.7925 31.5282 -88.7712 -8.764 -6.2844 -6.2844 4.487 -151.247 199.8561 -2.1961

Table 2: Variances of the OLS, GR, JR, J2R, J3R, J4R and J5R estimators
OLS GR JR J2R J3R J4R J5R
Matrix 1: 1.959 .583 .175 1.6 .585 .174 1.957 .583 .175 1.942 .582 .174 1.953 .5872 .177 1.953 .587 .177
Matrix 2: 9.652 .189
Matrix 3: .186
Matrix 4: 9.575 .189
Matrix 5: 5.9621 .4823 .44 6.9612 .182 3.319 .185 6.9553 .1876 3.7134 .4196 .436 8.79 .189 3.3466 .186 8.7825 1.884 5.4438 .4814 .44 9.611 .189 .186 9.575 1.884 5.9386 .4823 .44 9.653 .189 .186 9.575 1.884 5.9611 .4823 .44 9.653 .189 .186 9.652 .189 5.9621 .4823 .44

Table 3: Biases for the OLS, GR, JR, J2R, J3R, J4R and J5R estimators
Eigenvalues GR JR J2R J3R J4R J5R
1.953 .587 .177 9.653 .189 .186 9.652 .189 5.9621 .4823 .44
Matrix 1: .92 .15 .357
Matrix 2: -2.762
Matrix 3:
Matrix 4: .2125 12.387
Matrix 5: 47.9291 -22.98 .696 .12 x 10^-3 .16 x 10^-3 .397 x 10^-3 -2.7626 2.7626 1.611 -4.8442 .1469 1.263 x 10^-7 .28 x 10^-7 4.943 x 10^-7 -2.7626 2.7626 .4469 -.2125 .65 .158 x 10^-11 .25 x 10^-11 .614 x 10^-11 -2.7626 2.7626 .198 -.96 .3 .197 x 10^-15 .32 x 10^-15 .763 x 10^-15 -2.7626 2.7626 .8815 -.424 .129 .245 x 10^-19 .4 x 10^-19 .949 x 10^-19 -2.7626 2.7626 .3915 .1889 .57
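The convergence criterion described in the abstract and Section 3, and reflected in the variance values of Table 2 (when the variances of two consecutive ordered Jackknife Ridge estimators are the same or nearly so, the last but one order is taken as optimal), can be written as a small search over n. The following Python/NumPy sketch is illustrative only: the eigenvalues, biasing constant, error variance and tolerance are assumed values, and the function names are hypothetical, not taken from the paper.

    import numpy as np

    def total_variance(n, lam, r, sigma2):
        """sigma^2 * trace[(I - A1^n) Lambda^{-1} (I - A1^n)], cf. eq. (4.40), with diagonal A1 = A^2."""
        a1 = (r / (lam + r)) ** 2           # diagonal entries of A1 = [R(Lambda + R)^{-1}]^2
        return float(np.sum(sigma2 * (1.0 - a1 ** n) ** 2 / lam))

    def optimal_order(lam, r, sigma2, tol=1e-4, n_max=100):
        """Smallest order whose successor changes the total variance by less than tol."""
        prev = total_variance(1, lam, r, sigma2)    # n = 1 is the ordinary Jackknife Ridge estimator
        for n in range(2, n_max + 1):
            cur = total_variance(n, lam, r, sigma2)
            if abs(cur - prev) < tol:               # variances "the same or nearly so"
                return n - 1                        # keep the last-but-one estimator
            prev = cur
        return n_max

    # Assumed inputs: eigenvalues with one small value (collinearity), common r, unit variance.
    lam = np.array([250.0, 30.0, 1.5, 0.01])
    print("optimal order n =", optimal_order(lam, r=0.01, sigma2=1.0))

For any given design, the order returned this way marks the point beyond which further jackknifing changes the variance only negligibly, which is the behavior the tabulated variances are meant to illustrate.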
