ECONOMETRIC THEORY. MODULE VI Lecture 19 Regression Analysis Under Linear Restrictions


Dr. Shalabh, Department of Mathematics and Statistics, Indian Institute of Technology Kanpur

One of the basic objectives in any statistical modeling is to find good estimators of the parameters. In the context of the multiple linear regression model

y = Xβ + ε,

the ordinary least squares estimator (OLSE)

b = (X'X)^{-1} X'y

is the best linear unbiased estimator of β. Several approaches have been attempted in the literature to improve the OLSE further. One approach is the use of extraneous information or prior information. In applied work, such prior information may be available about the regression coefficients. For example, in economics, constant returns to scale imply that the exponents in a Cobb-Douglas production function should sum to unity. In another example, the absence of money illusion on the part of consumers implies that the sum of the money income and price elasticities in a demand function should be zero.

These types of constraints or prior information may be available from
(i) some theoretical considerations,
(ii) past experience of the experimenter,
(iii) empirical investigations,
(iv) some extraneous sources, etc.

To utilize such information in improving the estimation of the regression coefficients, it can be expressed in the form of
(i) exact linear restrictions,
(ii) stochastic linear restrictions,
(iii) inequality restrictions.

We consider the use of prior information in the form of exact and stochastic linear restrictions in the model y = Xβ + ε, where y is an (n × 1) vector of observations on the study variable, X is an (n × k) matrix of observations on the explanatory variables X₁, X₂, ..., X_k, β is a (k × 1) vector of regression coefficients and ε is an (n × 1) vector of disturbance terms.
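As an illustrative aside (not part of the lecture), the OLSE is straightforward to compute numerically. A minimal Python/NumPy sketch on simulated data; the sample size, design and coefficient values are arbitrary assumptions:

import numpy as np

rng = np.random.default_rng(0)
n, k = 100, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])  # intercept + 2 regressors
beta_true = np.array([2.0, 1.0, -0.5])
y = X @ beta_true + rng.normal(scale=0.5, size=n)

# OLSE b = (X'X)^{-1} X'y, computed via a linear solve for numerical stability
b = np.linalg.solve(X.T @ X, X.T @ y)
print(b)  # close to beta_true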

Exact linear restrictions

Suppose the prior information binding the regression coefficients is available from some extraneous sources and can be expressed in the form of exact linear restrictions

r = Rβ

where r is a (q × 1) vector and R is a (q × k) matrix with rank(R) = q (q < k). The elements in r and R are known.

Some examples of exact linear restrictions r = Rβ are as follows:

(i) If there are two restrictions with k = 6, such as

β₂ = β₄,  β₃ + β₄ + β₅ = 3,

then

r = [0]    R = [0  1  0  -1  0  0]
    [3],       [0  0  1   1  1  0].

(ii) If k = 3 and suppose β₂ = 3, then

r = [3],  R = [0  1  0].

(iii) If k = 3 and suppose β₁ : β₂ : β₃ :: ab : b : 1, then

r = [0]    R = [1  -a    0]
    [0],       [0   1   -b]
    [0]        [1   0  -ab].

The ordinary least squares estimator b = (X'X)^{-1} X'y does not use this prior information, and in general it does not obey the restrictions, in the sense that r ≠ Rb. So the issue is how to use the sample information and the prior information together in finding an improved estimator of β.
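To see in code that the OLSE ignores such information, here is a small sketch of example (ii) above (k = 3, restriction β₂ = 3); the simulated data are illustrative assumptions:

import numpy as np

rng = np.random.default_rng(1)
n, k = 50, 3
X = rng.normal(size=(n, k))
beta = np.array([1.0, 3.0, -2.0])      # true coefficients satisfy beta_2 = 3
y = X @ beta + rng.normal(scale=0.3, size=n)

# Restriction r = R beta expressing beta_2 = 3
R = np.array([[0.0, 1.0, 0.0]])        # (q x k) with q = 1
r = np.array([3.0])                    # q-vector

b = np.linalg.solve(X.T @ X, X.T @ y)  # unrestricted OLSE
print(R @ b - r)                       # nonzero in general: r != Rb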

Restricted least squares estimation

The restricted least squares estimation method enables the use of the sample information and the prior information simultaneously. In this method, β is chosen so that the error sum of squares is minimized subject to the linear restrictions r = Rβ. This can be achieved using the Lagrangian multiplier technique. Define the Lagrangian function

S(β, λ) = (y − Xβ)'(y − Xβ) + 2λ'(Rβ − r)

where λ is a (q × 1) vector of Lagrangian multipliers (the factor 2 is only for algebraic convenience). Using the results that if a and b are vectors and A is a suitably defined matrix, then

∂(a'Aa)/∂a = (A + A')a,   ∂(a'b)/∂a = b,

we have

∂S(β, λ)/∂β = 2X'Xβ − 2X'y + 2R'λ = 0   (*)
∂S(β, λ)/∂λ = 2(Rβ − r) = 0.

Pre-multiplying equation (*) by R(X'X)^{-1}, we have

Rβ − R(X'X)^{-1}X'y + R(X'X)^{-1}R'λ = 0

or, using Rβ = r and b = (X'X)^{-1}X'y,

r − Rb + R(X'X)^{-1}R'λ = 0,

so that

λ = [R(X'X)^{-1}R']^{-1}(Rb − r),

using the fact that R(X'X)^{-1}R' > 0 (positive definite) and hence invertible.

Substituting λ in equation (*), we get

X'Xβ − X'y + R'[R(X'X)^{-1}R']^{-1}(Rb − r) = 0

or

X'Xβ = X'y − R'[R(X'X)^{-1}R']^{-1}(Rb − r).

Pre-multiplying by (X'X)^{-1} yields

β̂_R = (X'X)^{-1}X'y + (X'X)^{-1}R'[R(X'X)^{-1}R']^{-1}(r − Rb)
    = b + (X'X)^{-1}R'[R(X'X)^{-1}R']^{-1}(r − Rb).

This estimator is termed the restricted regression estimator of β.

Properties of the restricted regression estimator

1. Restrictions satisfied: The restricted regression estimator obeys the exact restrictions, i.e., r = Rβ̂_R. To verify this, consider

Rβ̂_R = R[b + (X'X)^{-1}R'{R(X'X)^{-1}R'}^{-1}(r − Rb)]
      = Rb + (r − Rb)
      = r.
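The estimator is easy to implement. A minimal sketch under the running example; the helper name rls and the simulated data are assumptions for illustration, but the formula is exactly the one derived above:

import numpy as np

def rls(X, y, R, r):
    """Restricted LS: beta_R = b + (X'X)^{-1} R' [R (X'X)^{-1} R']^{-1} (r - R b)."""
    XtX_inv = np.linalg.inv(X.T @ X)
    b = XtX_inv @ X.T @ y                  # unrestricted OLSE
    A = R @ XtX_inv @ R.T                  # R (X'X)^{-1} R', positive definite
    return b + XtX_inv @ R.T @ np.linalg.solve(A, r - R @ b)

rng = np.random.default_rng(2)
X = rng.normal(size=(50, 3))
y = X @ np.array([1.0, 3.0, -2.0]) + rng.normal(scale=0.3, size=50)
R, r = np.array([[0.0, 1.0, 0.0]]), np.array([3.0])

beta_R = rls(X, y, R, r)
print(R @ beta_R - r)   # ~0 up to floating point: property 1 (r = R beta_R) holds exactly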

2. Unbiasedness: The estimation error of β̂_R is

β̂_R − β = (b − β) + (X'X)^{-1}R'[R(X'X)^{-1}R']^{-1}(Rβ − Rb)
        = [I − (X'X)^{-1}R'{R(X'X)^{-1}R'}^{-1}R](b − β)
        = D(b − β)

where r = Rβ has been used and

D = I − (X'X)^{-1}R'[R(X'X)^{-1}R']^{-1}R.

Thus

E(β̂_R − β) = D E(b − β) = 0,

implying that β̂_R is an unbiased estimator of β.
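Unbiasedness can also be checked by simulation. A hedged Monte Carlo sketch (the design, coefficients and replication count are arbitrary choices; the true β satisfies the restriction, as the result requires):

import numpy as np

rng = np.random.default_rng(3)
n = 40
X = rng.normal(size=(n, 3))                       # fixed design across replications
beta = np.array([1.0, 3.0, -2.0])                 # satisfies beta_2 = 3
R, r = np.array([[0.0, 1.0, 0.0]]), np.array([3.0])

XtX_inv = np.linalg.inv(X.T @ X)
A_inv = np.linalg.inv(R @ XtX_inv @ R.T)

est = []
for _ in range(5000):
    y = X @ beta + rng.normal(size=n)
    b = XtX_inv @ X.T @ y
    est.append(b + XtX_inv @ R.T @ A_inv @ (r - R @ b))

print(np.mean(est, axis=0) - beta)   # ~0: E(beta_R) = beta when r = R beta holds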

3. Covariance matrix: The covariance matrix of β̂_R is

V(β̂_R) = E[(β̂_R − β)(β̂_R − β)']
        = D E[(b − β)(b − β)'] D'
        = D V(b) D'
        = σ² D(X'X)^{-1}D'
        = σ²(X'X)^{-1} − σ²(X'X)^{-1}R'[R(X'X)^{-1}R']^{-1}R(X'X)^{-1},

which can be obtained as follows. Consider

D(X'X)^{-1} = (X'X)^{-1} − (X'X)^{-1}R'[R(X'X)^{-1}R']^{-1}R(X'X)^{-1},

so that

D(X'X)^{-1}D' = [(X'X)^{-1} − (X'X)^{-1}R'{R(X'X)^{-1}R'}^{-1}R(X'X)^{-1}][I − R'{R(X'X)^{-1}R'}^{-1}R(X'X)^{-1}]
 = (X'X)^{-1} − (X'X)^{-1}R'[R(X'X)^{-1}R']^{-1}R(X'X)^{-1}
   − (X'X)^{-1}R'[R(X'X)^{-1}R']^{-1}R(X'X)^{-1}
   + (X'X)^{-1}R'[R(X'X)^{-1}R']^{-1}R(X'X)^{-1}R'[R(X'X)^{-1}R']^{-1}R(X'X)^{-1}
 = (X'X)^{-1} − (X'X)^{-1}R'[R(X'X)^{-1}R']^{-1}R(X'X)^{-1},

since the last term cancels one of the two middle terms. Note that the subtracted matrix σ²(X'X)^{-1}R'[R(X'X)^{-1}R']^{-1}R(X'X)^{-1} is nonnegative definite, so imposing valid restrictions never increases the variability of the estimator relative to the OLSE.
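The closed form can be verified numerically. A sketch comparing the Monte Carlo covariance of β̂_R with σ²[(X'X)^{-1} − (X'X)^{-1}R'{R(X'X)^{-1}R'}^{-1}R(X'X)^{-1}]; all simulation settings are illustrative assumptions:

import numpy as np

rng = np.random.default_rng(4)
n, sigma = 40, 1.0
X = rng.normal(size=(n, 3))
beta = np.array([1.0, 3.0, -2.0])
R, r = np.array([[0.0, 1.0, 0.0]]), np.array([3.0])

XtX_inv = np.linalg.inv(X.T @ X)
M = XtX_inv @ R.T @ np.linalg.inv(R @ XtX_inv @ R.T) @ R @ XtX_inv
V_theory = sigma**2 * (XtX_inv - M)          # V(beta_R) from the formula above

est = []
for _ in range(10000):
    y = X @ beta + rng.normal(scale=sigma, size=n)
    b = XtX_inv @ X.T @ y
    est.append(b + XtX_inv @ R.T @ np.linalg.solve(R @ XtX_inv @ R.T, r - R @ b))

V_mc = np.cov(np.array(est).T)
print(np.max(np.abs(V_mc - V_theory)))       # small, shrinking with more replications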

Maximum likelihood estimation under exact restrictions

Assuming ε ~ N(0, σ²I), the maximum likelihood estimators of β and σ² can also be derived so that they obey the restrictions r = Rβ. The Lagrangian function as per the maximum likelihood procedure can be written as

L(β, σ², λ) = (1/(2πσ²))^{n/2} exp[−(y − Xβ)'(y − Xβ)/(2σ²)] − λ'(Rβ − r)

where λ is a (q × 1) vector of Lagrangian multipliers. The normal equations are obtained by partially differentiating the log-likelihood function with respect to β, σ² and λ and equating the derivatives to zero:

∂ln L(β, σ², λ)/∂β = (1/σ²)(X'y − X'Xβ) + R'λ = 0   (1)
∂ln L(β, σ², λ)/∂λ = Rβ − r = 0   (2)
∂ln L(β, σ², λ)/∂σ² = −n/(2σ²) + (y − Xβ)'(y − Xβ)/(2σ⁴) = 0   (3)

Let β̃_R, σ̃²_R and λ̃ denote the maximum likelihood estimators of β, σ² and λ, respectively, which are obtained by solving equations (1), (2) and (3) as follows. Pre-multiplying equation (1) by R(X'X)^{-1} and using equation (2), we get the optimal λ as

λ̃ = (1/σ̃²_R)[R(X'X)^{-1}R']^{-1}(r − Rβ̃).

Substituting λ̃ in equation (1) gives

β̃_R = β̃ + (X'X)^{-1}R'[R(X'X)^{-1}R']^{-1}(r − Rβ̃)

where β̃ = (X'X)^{-1}X'y is the maximum likelihood estimator of β without restrictions. From equation (3), we get

σ̃²_R = (y − Xβ̃_R)'(y − Xβ̃_R)/n.

The Hessian matrix of second-order partial derivatives of ln L with respect to β and σ² is negative definite at β = β̃_R and σ² = σ̃²_R, confirming that these solutions maximize the likelihood. The restricted least squares and restricted maximum likelihood estimators of β are the same, whereas the corresponding estimators of σ² are different.
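Since the restricted ML and restricted LS estimators of β coincide, σ̃²_R is the only new quantity to compute. A short sketch reusing the restricted LS formula (simulated data; names are illustrative):

import numpy as np

rng = np.random.default_rng(5)
n = 50
X = rng.normal(size=(n, 3))
y = X @ np.array([1.0, 3.0, -2.0]) + rng.normal(size=n)
R, r = np.array([[0.0, 1.0, 0.0]]), np.array([3.0])

XtX_inv = np.linalg.inv(X.T @ X)
beta_tilde = XtX_inv @ X.T @ y                 # unrestricted MLE (= OLSE)
beta_R = beta_tilde + XtX_inv @ R.T @ np.linalg.solve(
    R @ XtX_inv @ R.T, r - R @ beta_tilde)     # restricted MLE (= restricted LSE)

resid = y - X @ beta_R
sigma2_R = resid @ resid / n                   # restricted ML estimator of sigma^2
print(beta_R, sigma2_R)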

Test of hypothesis

It is important to test the hypothesis

H₀: r = Rβ   against   H₁: r ≠ Rβ

before using the restrictions in the estimation procedure. The construction of the test statistic for this hypothesis is detailed in Module 3 on the multiple linear regression model. The resulting test statistic is

F = [(r − Rb)'{R(X'X)^{-1}R'}^{-1}(r − Rb)/q] / [(y − Xb)'(y − Xb)/(n − k)]

which follows an F-distribution with q and (n − k) degrees of freedom under H₀. The decision rule is to reject H₀ at the α level of significance whenever F ≥ F_α(q, n − k).
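The statistic is direct to compute. A hedged sketch using scipy.stats.f for the critical value; the data are simulated with H₀ true, and α = 0.05 is an arbitrary choice:

import numpy as np
from scipy.stats import f

rng = np.random.default_rng(6)
n, k, q, alpha = 60, 3, 1, 0.05
X = rng.normal(size=(n, k))
y = X @ np.array([1.0, 3.0, -2.0]) + rng.normal(size=n)   # H0 true: beta_2 = 3
R, r = np.array([[0.0, 1.0, 0.0]]), np.array([3.0])

XtX_inv = np.linalg.inv(X.T @ X)
b = XtX_inv @ X.T @ y
resid = y - X @ b

num = (r - R @ b) @ np.linalg.solve(R @ XtX_inv @ R.T, r - R @ b) / q
den = resid @ resid / (n - k)
F = num / den

print(F, f.ppf(1 - alpha, q, n - k))   # reject H0 when F exceeds the upper-alpha point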