Improved estimation of the linear regression model with autocorrelated errors


University of Wollongong Research Online
Faculty of Business - Economics Working Papers, Faculty of Business, 1990

Improved estimation of the linear regression model with autocorrelated errors

A. Chaturvedi, University of Allahabad; Tran Van Hoa, University of Wollongong; Ram Lal, University of Allahabad

Recommended Citation: Chaturvedi, A., Hoa, Tran Van and Lal, Ram, "Improved estimation of the linear regression model with autocorrelated errors", Department of Economics, University of Wollongong, Working Paper 90-14, 1990, 11. http://ro.uow.edu.au/commwkpapers/327

Research Online is the open access institutional repository for the University of Wollongong. For further information contact the UOW Library: research-pubs@uow.edu.au

THE UNIVERSITY OF WOLLONGONG
DEPARTMENT OF ECONOMICS

IMPROVED ESTIMATION OF THE LINEAR REGRESSION MODEL WITH AUTOCORRELATED ERRORS

A. Chaturvedi, Department of Mathematics and Statistics, University of Allahabad, Allahabad, India
Tran Van Hoa, Department of Economics, The University of Wollongong, Wollongong NSW 2500, Australia
and Ram Lal, Department of Statistics, University of Allahabad, Allahabad, India

Working Paper 90-14

Co-ordinated by Dr Charles Harvie & Di Kelly
PO Box 1144 [Northfields Avenue], Wollongong NSW 2500 Australia
Phone: [042] 213 66 or 213 555. Telex 29022. Cable: UNIOFWOL. Fax [042] 213 725

ISSN 1035-4581
ISBN 0-86418-161-2

ABSTRACT

For estimating the coefficient vector of a linear regression model with first-order autoregressive disturbances, a family of Stein-rule estimators based on the two-step Prais-Winsten estimating procedure is considered and an Edgeworth-type asymptotic expansion for its distribution is derived. The performance of this family of estimators relative to the two-step Prais-Winsten estimator is then evaluated under a squared error loss function.

Keywords: Linear regression model, autocorrelated errors, squared error loss function, improved estimators, Stein estimators, Prais-Winsten estimators, Edgeworth expansions, generalised least squares estimators.

Acknowledgement: The first author gratefully acknowledges the research grant from the Department of Economics, University of Wollongong, Wollongong, Australia, during his visit there in 1989.

1. INTRODUCTION

For estimating the coefficient vector of a linear regression model with disturbances following a first-order autoregressive scheme, several estimators have been analysed with the help of empirical methods [see, for example, Kramer (1980), Maeshiro (1976), Park and Mitchell (1980) and Spitzer (1979)]. Rothenberg (1984) considered the case where the error covariance matrix depends on a finite number of parameters and derived an approximate expression for the distribution of the two-step generalised least squares (GLS) estimator. Magee et al. (1985) considered several classes of two-step Cochrane-Orcutt and Prais-Winsten estimators, corresponding to different choices of the estimated autocorrelation coefficient, obtained large-sample expressions for their dispersion matrices, and analysed the efficiency properties of the estimators in a numerical experiment. Magee (1985) also developed a general method for deriving Nagar expansions for iterative estimators and applied it to obtain the approximate dispersion matrices of the iterated Prais-Winsten and maximum likelihood estimators. While several families of improved or shrinkage estimators with superior properties in terms of a quadratic loss function have been developed for the linear regression model with i.i.d. disturbances [see Judge and Bock (1978) and Vinod and Ullah (1981)], no study dealing with improved estimation in the case of autocorrelated disturbances has been reported. In this paper, we present an attempt in this direction and consider a general family of improved or shrinkage estimators based on the two-step Prais-Winsten estimator for the coefficient vector of a linear regression model with disturbances generated by a first-order autoregressive scheme. Edgeworth-type asymptotic expansions for the distribution of the proposed family of estimators are derived and the risk under a quadratic loss function of the

resulting estimators is compared with that of the two-step Prais-Winsten estimator.[1] The results of a numerical experiment are also reported.

[1] The results presented in this paper apply equally well to Stein-rule estimators based on any feasible GLS estimator involving a first-order efficient estimator of ρ, e.g., the two-step or iterated Prais-Winsten estimators, the maximum likelihood estimator, or the estimators derived from the Durbin-Watson statistic.

2. THE MODEL AND THE FAMILY OF ESTIMATORS

Consider the linear regression model:

    y = Xβ + u                                                        (1)

where y is a T x 1 vector of observations on the dependent variable, X is a T x p matrix of observations on p independent variables, β is a p x 1 vector of unknown regression coefficients and u is a T x 1 disturbance vector. The elements of u are assumed to follow an AR(1) process:

    u_t = ρ u_{t-1} + ε_t    (t = 1, 2, ..., T)

where ρ (|ρ| < 1) is the unknown autocorrelation coefficient. Further,

    E(ε_t) = 0,    E(ε_t²) = σ_ε²,    E(ε_t ε_s) = 0 for t ≠ s.

Thus E(u) = 0 and E(uu') = σ² Σ, where σ² = σ_ε² / (1 − ρ²) and Σ is the T x T matrix with (i, j)-th element ρ^|i−j|. The ordinary least squares (OLS) estimator of β is given by

    b = (X'X)⁻¹ X'y

while the GLS estimator is

    β* = (X'P'PX)⁻¹ X'P'Py,

where P is the T x T triangular matrix with (i, j)-th element (1 − ρ²)^{1/2} if i = j = 1; 1 if i = j = 2, 3, ..., T; −ρ if i = j + 1 = 2, 3, ..., T; and 0 otherwise. Since ρ is usually unknown, Prais and Winsten (1954) suggested replacing ρ by its consistent estimator[2]

    ρ̂ = ( Σ_{t=1}^{T−1} û_t û_{t+1} ) / ( Σ_{t=1}^{T} û_t² )          (2)

where û_t is the t-th element of the vector û = y − Xb. They therefore obtained the following estimator of β:

    β̂ = (X'P̂'P̂X)⁻¹ X'P̂'P̂y                                          (3)

where P̂ is obtained by replacing ρ by ρ̂ in P.

[2] The original estimator of ρ proposed by Prais and Winsten involved in the denominator the sum of squares of the û_t's from t = 2 to T − 1. However, the adjustment in the denominator made here does not affect the results in the paper, which depend on ρ̂ only to order O_p(T^{−1/2}), whereas the adjustment affects ρ̂ to order O_p(T^{−1}).

Now we define the following family of improved or shrinkage estimators:

    β̂(k) = [ 1 − k σ̂² / ( β̂' X' P̂' P̂ X β̂ ) ] β̂                     (4)

where σ̂² is a consistent estimator of σ². Obviously, for k = 0, β̂(k) reduces to β̂. In the following section, we study the asymptotic distribution of β̂(k).

3. ASYMPTOTIC DISTRIBUTION OF THE FAMILY OF ESTIMATORS

Let us introduce the following notation:

    Ω = T (X'Σ⁻¹X)⁻¹,    Q̄ = I_T − (1/T) X Ω X',    θ = β'Ω⁻¹β,
    M = I_T − X(X'X)⁻¹X',
    V = I_p + ((1 − ρ²)/(ρ²T²)) Ω^{1/2} X'Q̄X Ω^{1/2} + (4kσ²/(θ²T)) Ω^{−1/2} ββ' Ω^{−1/2} − (2kσ²/(θT)) I_p,
    μ = − (kσ/(θ√T)) Ω^{−1/2} β,    π(k) = (√T/σ) Ω^{−1/2} [β̂(k) − β].

THEOREM: Let the matrix X'X/T tend to a finite nonsingular matrix as T tends to infinity, let |ρ| < 1, and let the coefficient vector β be non-null. Then the asymptotic distribution of π(k), to order O(T⁻¹), is normal with mean vector μ and dispersion matrix V.
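The two-step Prais-Winsten estimator (2)-(3) and the shrinkage family (4) are straightforward to compute. The following sketch is ours, not the authors': the function name and the residual-based choice of σ̂² are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def prais_winsten_stein(y, X, k):
    """Two-step Prais-Winsten estimate and its Stein-rule shrinkage version."""
    T, p = X.shape
    # Step 1: OLS residuals and the consistent estimator of rho, eq. (2)
    b = np.linalg.solve(X.T @ X, X.T @ y)
    u = y - X @ b
    rho = (u[:-1] @ u[1:]) / (u @ u)
    # Step 2: Prais-Winsten transformation matrix P with rho replaced by its estimate
    P = np.eye(T)
    P[0, 0] = np.sqrt(1.0 - rho**2)
    for t in range(1, T):
        P[t, t - 1] = -rho
    Xs, ys = P @ X, P @ y
    beta_pw = np.linalg.solve(Xs.T @ Xs, Xs.T @ ys)      # two-step estimator, eq. (3)
    # Shrinkage family, eq. (4); sigma2 is a residual-based estimate (our choice)
    resid = ys - Xs @ beta_pw
    sigma2 = (resid @ resid) / (T - p)
    shrink = 1.0 - k * sigma2 / (beta_pw @ (Xs.T @ Xs) @ beta_pw)
    return beta_pw, shrink * beta_pw
```

With k = 0 the second return value coincides with the two-step estimator, as noted below (4); for k > 0 the estimate is pulled toward the origin.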

Using the above theorem, we can easily obtain the following approximate expressions for the bias vector, to order O(T⁻¹), and the mean squared error matrix, to order O(T⁻²), of β̂(k):

    E[β̂(k) − β] = − (kσ²/(θT)) β                                     (5)

    E[(β̂(k) − β)(β̂(k) − β)'] = (σ²/T) [ Ω + ((1 − ρ²)/(ρ²T²)) Ω X'Q̄X Ω + (kσ²/(θT)) { ((4 + k)/θ) ββ' − 2Ω } ]    (6)

Obviously, if we take k = 0 in (6), we get the asymptotic expression for the mean squared error matrix of β̂ [see Ullah et al. (1983)]. Consider the quadratic loss function

    L(β̃) = (β̃ − β)' C (β̃ − β)

where β̃ is an arbitrary estimator of β and C is a p x p symmetric, positive definite matrix, and denote the risk associated with β̃ by R[β̃]. Then, to order O(T⁻²), the approximate expression for R[β̂(k)] is given by:

    R[β̂(k)] = (σ²/T) [ tr(ΩC) + ((1 − ρ²)/(ρ²T²)) tr(Ω X'Q̄X Ω C) + (kσ²/(θT)) { ((4 + k)/θ) β'Cβ − 2 tr(ΩC) } ]    (7)

Let λ(ΩC) be the maximum characteristic root of the matrix ΩC and

    d = tr(ΩC) / λ(ΩC).

Since (β'Cβ/θ) ≤ λ(ΩC), a sufficient condition for the difference between the risks of β̂ and β̂(k), to order O(T⁻²), to be non-negative is given by

    0 < k < 2(d − 2).                                                 (8)

In particular, if we take C = Ω⁻¹, the expression (7) reduces to

    R[β̂(k)] = (σ²/T) [ p + ((1 − ρ²)/(ρ²T²)) tr(X'Q̄X Ω) + (kσ²/(θT)) { (4 + k) − 2p } ]    (9)

and the dominance condition (8) becomes

    0 < k < 2(p − 2);    p > 2.                                       (10)

It can easily be verified that the optimum value of k obtained by minimizing expression (9) for R[β̂(k)] is p − 2 = k₀ (say). For this optimum value k₀ of k, the relative gain in efficiency of β̂(k₀) over β̂, to order O(T⁻¹), is given by

    Δ = ( R[β̂] − R[β̂(k₀)] ) / R[β̂] = (p − 2)² σ² (1 − ρ²) / ( p T δ (1 + ρ² − 2γρ) )    (11)

where δ = β'X'Xβ/T, γ = β'X'DXβ / (2 β'X'Xβ) and D is a T x T matrix with (i, j)-th element 1 if i = j ± 1 and 0 otherwise (i, j = 1, 2, ..., T).

4. A SIMULATION STUDY

In the theorem above, the expansion for the distribution of π(k) is derived for the case of β being a non-null vector. Thus (7) may not be a reasonable approximation for R[β̂(k)] for any given T, no matter how large, when β is in the neighbourhood of 0. As a result, the derivation of the relative gain in efficiency of β̂(k₀) over β̂ in this special case is analytically difficult, if not intractable. To circumvent the problem, we have instead used a Monte Carlo simulation to investigate the relative gain of our estimator.

In the simulation, we formulate a total of 110 linear regression models with autocorrelated disturbances. Each model differs from the others in the true values of β and the magnitude of the autocorrelation coefficient ρ. The values of β are significant only at the fourth or higher digit after the decimal point, and are based on those used by Chi and Judge (1985) in their simulation study on improved estimators. The values of ρ range from −1.00 to 1.00 with an increment of 0.20. The data on the independent variables are obtained from an early Monte Carlo study by Kmenta and Gilbert (1968). The disturbance variance σ² is arbitrarily set equal to 100 in all cases.

The simulation results on the relative gain in efficiency of β̂(k₀) over β̂ in all 110 linear regression models are given in Table 1. The relative gain is denoted by R(ML/S) and defined as 100[MSE(β̂)/MSE(β̂(k₀)) − 1], where MSE(β̂) is the mean squared error of the estimator β̂, and similarly for β̂(k₀). In all linear regression models, the relative gain is computed for illustration from 100 statistical trials.

The general conclusion that emerges from the simulation results reported in Table 1 appears to indicate a uniform gain in efficiency of the

estimator β̂(k₀) over the estimator β̂ in all 110 linear regression models in which β is in the neighbourhood of zero. Thus, the expansion for the distribution of π(k) as given in the theorem appears to be valid quite generally. As expected, however, the relative gain, while dependent on the value of ρ, does not appear to be a monotonically increasing or decreasing function of ρ. The exact relationship between the gain and k, σ² and T is given in (11).

5. DERIVATION OF THE RESULTS

Suppose J is a diagonal matrix with first and last diagonal elements equal to one and the remaining diagonal elements equal to (1 − ρ²), and B is a T x T symmetric matrix with (i, j)-th element −ρ if i = j, 1 if i = j ± 1 and 0 otherwise. Following Ullah et al. (1983), to order O_p(T⁻¹), π(k) can be written as

    π(k) = (√T/σ) Ω^{−1/2} (β̂ − β) − (kσ/(θ√T)) Ω^{−1/2} β̂,

where the feasible GLS error β̂ − β is expanded in powers of T^{−1/2} using the quadratic forms

    θ_{−1/2} = (1/(√T σ²)) u'MBMu    and    θ_{−1} = (1/(T σ²)) (u'Mu − Tσ²),

which arise from the expansions of ρ̂ − ρ and σ̂² − σ², respectively. Now, for any p x 1 vector h, the cumulant generating function of π(k), to order O(T⁻¹), is given by

    K(h) = log E[exp{i h'π(k)}] = i h'μ − (1/2) h'Vh + O(T^{−3/2}).

Now using the inversion formula

    f(ξ) = (2π)^{−p} ∫_{−∞}^{∞} ⋯ ∫_{−∞}^{∞} exp[ K(h) − i h'ξ ] dh,

where f(ξ) denotes the probability density function of π(k), we obtain the results of the theorem.
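The risk comparison derived above can be checked numerically along the lines of the Section 4 experiment. The sketch below is our illustrative reconstruction: the design values (regressors, β, trial count, seed) are ours, not those of Chi and Judge (1985) or Kmenta and Gilbert (1968).

```python
import numpy as np

def relative_gain(beta, rho, sigma_e, T=20, trials=100, seed=42):
    """Empirical gain 100*[MSE(beta_pw)/MSE(beta_k0) - 1] with k0 = p - 2."""
    rng = np.random.default_rng(seed)
    p = len(beta)
    k0 = p - 2
    X = rng.normal(size=(T, p))          # illustrative regressors, fixed across trials
    sse_pw = sse_k0 = 0.0
    for _ in range(trials):
        # AR(1) disturbances, started from the stationary distribution
        u = np.empty(T)
        u[0] = rng.normal(0.0, sigma_e / np.sqrt(1.0 - rho**2))
        for t in range(1, T):
            u[t] = rho * u[t - 1] + rng.normal(0.0, sigma_e)
        y = X @ beta + u
        # two-step Prais-Winsten, eqs. (2)-(3)
        b = np.linalg.solve(X.T @ X, X.T @ y)
        r = y - X @ b
        rh = (r[:-1] @ r[1:]) / (r @ r)
        P = np.eye(T)
        P[0, 0] = np.sqrt(1.0 - rh**2)
        for t in range(1, T):
            P[t, t - 1] = -rh
        Xs, ys = P @ X, P @ y
        b_pw = np.linalg.solve(Xs.T @ Xs, Xs.T @ ys)
        # Stein-rule version, eq. (4), with a residual-based sigma^2 estimate
        res = ys - Xs @ b_pw
        s2 = (res @ res) / (T - p)
        b_k0 = (1.0 - k0 * s2 / (b_pw @ (Xs.T @ Xs) @ b_pw)) * b_pw
        sse_pw += np.sum((b_pw - beta) ** 2)
        sse_k0 += np.sum((b_k0 - beta) ** 2)
    return 100.0 * (sse_pw / sse_k0 - 1.0)
```

For p > 2 and β near the null vector this quantity is typically positive, in line with the pattern reported in Table 1; the exact numbers depend on the design.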

REFERENCES

Chi, X.W. and Judge, G.G. (1985), "On Assessing the Precision of Stein's Estimator", Economics Letters, 18, pp. 143-148.

Judge, G.G. and Bock, M.E. (1978), The Statistical Implications of Pre-Test and Stein-Rule Estimators in Econometrics, North-Holland, Amsterdam.

Judge, G.G., Griffiths, W.E., Hill, R.C. and Lee, T.C. (1980), The Theory and Practice of Econometrics, John Wiley and Sons, New York.

Kmenta, J. and Gilbert, R.F. (1968), "Small Sample Properties of Alternative Estimators in Seemingly Unrelated Regressions", Journal of the American Statistical Association, 63, pp. 1180-1200.

Kramer, W. (1980), "Finite Sample Efficiency of Ordinary Least Squares in the Linear Regression Model with Autocorrelated Errors", Journal of the American Statistical Association, 75, pp. 1005-1009.

Maeshiro, A. (1976), "Autoregressive Transformation, Trended Independent Variables and Autocorrelated Disturbance Terms", Review of Economics and Statistics, 58, pp. 497-500.

Magee, L. (1985), "Efficiency of Iterative Estimators in the Regression Model with AR(1) Disturbances", Journal of Econometrics, 29, pp. 275-287.

Magee, L., Ullah, A. and Srivastava, V.K. (1985), "Efficiency of Estimators in the Regression Model with First Order Autoregressive Errors", in M.L. King and D.E.A. Giles (eds.), Specification Analysis in the Linear Model: Essays in Honour of Donald Cochrane, Routledge and Kegan Paul, London.

Park, R.E. and Mitchell, B.M. (1980), "Estimating the Autocorrelated Error Model with Trended Data", Journal of Econometrics, 13, pp. 185-201.

Prais, S.J. and Winsten, C.B. (1954), "Trend Estimators and Serial Correlation", unpublished Cowles Commission Discussion Paper, Chicago.

Rothenberg, T.J. (1984), "Approximate Normality of Generalized Least Squares Estimates", Econometrica, 52, No. 4, pp. 811-825.

Spitzer, J.J. (1979), "Small Sample Properties of Nonlinear Least Squares and Maximum Likelihood Estimators in the Context of Autocorrelated Errors", Journal of the American Statistical Association, 74, pp. 41-47.

Ullah, A., Srivastava, V.K., Magee, L. and Srivastava, A. (1983), "Estimation of Linear Regression Model with Autocorrelated Disturbances", Journal of Time Series Analysis, 4, No. 2, pp. 127-135.

Vinod, H.D. and Ullah, A. (1981), Recent Advances in Regression Methods, Marcel Dekker, New York.