Multiple Regression Analysis

1 Multiple Regression Analysis
y = β0 + β1x1 + β2x2 + ... + βkxk + u
6. Heteroskedasticity

2 What is Heteroskedasticity?
- Recall that the assumption of homoskedasticity implied that, conditional on the explanatory variables, the variance of the unobserved error u was constant
- If this is not true, that is, if the variance of u differs for different values of the x's, then the errors are heteroskedastic
- Example: we estimate returns to education, but ability is unobservable, and we think the variance in ability differs by educational attainment

3 Example of Heteroskedasticity
[Figure: conditional densities f(y|x) drawn at x1, x2, x3 around the regression line E(y|x) = β0 + β1x, with the spread of y increasing in x]

4 Why Worry About Heteroskedasticity?
- We didn't need homoskedasticity to show that our OLS estimates are unbiased and consistent
- However, the estimated standard errors of the β̂'s are biased with heteroskedasticity
- i.e., we cannot use the usual t, F, or LM statistics for drawing inferences
- And this problem doesn't go away as n increases
- Our estimates are also no longer BLUE: we could find other estimators with smaller variance

5 Variance with Heteroskedasticity
- Suppose that in a simple (univariate) regression model the first 4 Gauss-Markov assumptions hold, but Var(ui|xi) = σi² (i.e. the variance depends on x)
- Then the OLS estimator is just: β̂1 = β1 + Σ(xi − x̄)ui / Σ(xi − x̄)²
- We can show that: Var(β̂1) = Σ(xi − x̄)²σi² / SSTx², where SSTx = Σ(xi − x̄)²
- Without the subscript i (i.e. with σi² = σ²), this collapses to the usual σ²/SSTx

6 Heteroskedasticity-Robust SE
- So, we need some way to estimate this
- A valid estimator of Var(β̂1) when σi² ≠ σ² is: Σ(xi − x̄)²ûi² / SSTx², where the ûi are the OLS residuals
- For the general multiple regression model, a valid estimator of Var(β̂j) with heteroskedasticity is: Var̂(β̂j) = Σ r̂ij²ûi² / SSRj²
- where r̂ij is the i-th residual from regressing xj on all the other x's, and SSRj is the sum of squared residuals from this regression
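
As an illustration (not part of the original slides), here is a minimal numpy sketch of the simple-regression formula Σ(xi − x̄)²ûi² / SSTx² on made-up data; the data-generating process is purely hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
x = rng.uniform(0, 10, n)
u = rng.normal(0, 1 + 0.3 * x)            # error variance grows with x (heteroskedastic)
y = 1.0 + 2.0 * x + u

# OLS slope and residuals for the simple regression y = b0 + b1*x + u
xbar, ybar = x.mean(), y.mean()
sst_x = np.sum((x - xbar) ** 2)
b1 = np.sum((x - xbar) * (y - ybar)) / sst_x
b0 = ybar - b1 * xbar
uhat = y - b0 - b1 * x

# Robust estimate of Var(b1): sum((x_i - xbar)^2 * uhat_i^2) / SST_x^2
se_b1_robust = np.sqrt(np.sum((x - xbar) ** 2 * uhat ** 2) / sst_x ** 2)

# Usual (homoskedasticity-based) estimate, sigma^2 / SST_x, for comparison
sigma2 = np.sum(uhat ** 2) / (n - 2)
se_b1_usual = np.sqrt(sigma2 / sst_x)
print(se_b1_robust, se_b1_usual)
```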

7 Heteroskedasticity-Robust SE
- To see this, recall that under homoskedasticity: Var(β̂j) = σ²Σr̂ij² / (Σr̂ij²)² = σ²/Σr̂ij² = σ²/SSRj = σ²/[SSTj(1 − Rj²)]
- With heteroskedasticity we can't pull σ² out front!

8 Robust Standard Errors
- Now that we have a consistent estimate of the variance, its square root can be used as a standard error for inference
- We typically call these robust standard errors
- Sometimes the estimated variance is corrected for degrees of freedom by multiplying by n/(n − k − 1)
- As n tends to infinity it's all the same, though
- Note: the robust standard errors have only an asymptotic justification
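
A sketch of the multiple-regression version, implementing the Σ r̂ij²ûi²/SSRj² formula from slide 6 together with the n/(n − k − 1) correction mentioned above. This is an illustrative numpy implementation, not the slides' EViews workflow; X is assumed to contain a constant column.

```python
import numpy as np

def robust_se(X, y):
    """Heteroskedasticity-robust SEs via the partialling-out formula:
    Var-hat(b_j) = sum_i r_ij^2 * uhat_i^2 / SSR_j^2, scaled by n/(n-k-1)."""
    n, kp1 = X.shape                        # X includes the constant, so kp1 = k + 1
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    uhat = y - X @ beta                     # residuals from the full regression
    se = np.empty(kp1)
    for j in range(kp1):
        others = np.delete(X, j, axis=1)
        gamma = np.linalg.lstsq(others, X[:, j], rcond=None)[0]
        r_j = X[:, j] - others @ gamma      # residuals from regressing x_j on the other x's
        ssr_j = np.sum(r_j ** 2)
        var_j = np.sum(r_j ** 2 * uhat ** 2) / ssr_j ** 2
        se[j] = np.sqrt(var_j * n / (n - kp1))   # degrees-of-freedom correction
    return beta, se
```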

9 Robust Standard Errors (cont)
- Can use the robust SEs to construct robust t-statistics if heteroskedasticity is a problem (just substitute the robust SE when forming the t-stat)
- With small sample sizes, t statistics formed with robust standard errors will not have a distribution close to the t, and inferences will not be correct
- As the sample size gets large, this problem disappears (i.e. het-robust t-stats are asymptotically t-distributed)
- Thus, if homoskedasticity holds, it is better to use the usual standard errors in small samples
- In EViews, robust standard errors are easily obtained.
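
The slides use EViews; purely as an alternative illustration, the same robust standard errors and robust t-statistics can be requested from Python's statsmodels (an assumption about tooling, not part of the course; the data are made up).

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 400
x1, x2 = rng.normal(size=n), rng.normal(size=n)
u = rng.normal(scale=np.exp(0.5 * x1), size=n)   # heteroskedastic error
y = 1 + 0.5 * x1 - 0.3 * x2 + u

X = sm.add_constant(np.column_stack([x1, x2]))
usual = sm.OLS(y, X).fit()                       # conventional standard errors
robust = sm.OLS(y, X).fit(cov_type="HC1")        # robust SEs with a small-sample correction
print(usual.bse)        # usual standard errors
print(robust.bse)       # heteroskedasticity-robust standard errors
print(robust.tvalues)   # robust t-statistics
```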

10 A Robust LM Statistic
- We can compute LM statistics that are robust to heteroskedasticity to test multiple exclusion restrictions
- Suppose y = β0 + β1x1 + β2x2 + ... + βkxk + u and we want to test H0: β1 = 0, β2 = 0
1. Run OLS on the restricted model and save the residuals û
2. Regress each of the excluded variables on all of the included variables (q different regressions) and save each set of residuals r̃1, r̃2 (i.e. x1 on x3, ..., xk and x2 on x3, ..., xk)

11 A Robust LM Statistic (cont)
3. Create a variable equal to 1 for all observations and regress it on r̃1·û, r̃2·û with no intercept
- The heteroskedasticity-robust LM statistic turns out to be n − SSR1, where SSR1 is the sum of squared residuals from this final regression
- LM still has a chi-squared distribution with q degrees of freedom
- I offer no explanation as to why or how this works, as the derivation is tedious
- Just know it can be done.
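
A sketch of this three-step procedure in numpy/scipy (illustrative only; the split of the design matrix into restricted and excluded columns is an assumption about how the data are arranged).

```python
import numpy as np
from scipy.stats import chi2

def robust_lm(y, X_restricted, X_excluded):
    """Heteroskedasticity-robust LM test for excluding the columns of X_excluded.
    X_restricted must contain the constant and all included regressors."""
    n = len(y)
    # 1. OLS on the restricted model; save the residuals uhat
    b_r = np.linalg.lstsq(X_restricted, y, rcond=None)[0]
    uhat = y - X_restricted @ b_r
    # 2. Regress each excluded variable on the included ones; save the residuals r_tilde
    g = np.linalg.lstsq(X_restricted, X_excluded, rcond=None)[0]
    r_tilde = X_excluded - X_restricted @ g            # n x q matrix of residuals
    # 3. Regress a vector of ones on r_tilde * uhat (no intercept); LM = n - SSR1
    Z = r_tilde * uhat[:, None]
    ones = np.ones(n)
    d = np.linalg.lstsq(Z, ones, rcond=None)[0]
    ssr1 = np.sum((ones - Z @ d) ** 2)
    lm = n - ssr1
    q = X_excluded.shape[1]
    return lm, chi2.sf(lm, q)                          # statistic and p-value
```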

12 Testing for Heteroskedasticity
- We may still want to know whether heteroskedasticity is present, for example if we have a small sample or if we want to know the form of the heteroskedasticity
- Essentially we want to test H0: Var(u|x1, x2, ..., xk) = σ²
- This is equivalent to H0: E(u²|x1, x2, ..., xk) = E(u²) = σ² {since E(u|x) = 0}
- So, to test for heteroskedasticity we want to test whether u² is related to any of the x's

13 The Breusch-Pagan Test
- If we assume the relationship between u² and the xj is linear, we can test it as a linear restriction
- So, for u² = δ0 + δ1x1 + ... + δkxk + v, this means testing H0: δ1 = δ2 = ... = δk = 0
- We don't observe the error, but we can estimate it with the residuals from the OLS regression
- Thus, regress the squared residuals on all the x's
- The R² from this regression can be used to form an F or LM statistic to test for overall significance

14 The Breusch-Pagan Test (cont)
F-Statistic:
- The F statistic is just the reported F statistic for overall significance of the regression
- F = [R²/k] / [(1 − R²)/(n − k − 1)], which is distributed F(k, n − k − 1)
LM-Statistic (Breusch-Pagan):
- The LM statistic is LM = nR², which is distributed χ²(k)
- Note that we did not have to run an additional auxiliary regression in this case
- We could carry out this test for any of the x's individually (instead of for all x's together)
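
A minimal sketch of both statistics, computed directly from the auxiliary regression of û² on the x's (illustrative; X is assumed to include the constant column). statsmodels also ships a canned version, het_breuschpagan, which returns the same LM and F tests.

```python
import numpy as np
from scipy.stats import f as f_dist, chi2

def breusch_pagan(uhat, X):
    """Breusch-Pagan test: regress uhat^2 on X and form
    F = [R^2/k] / [(1 - R^2)/(n - k - 1)] and LM = n * R^2."""
    n, kp1 = X.shape
    k = kp1 - 1                                  # number of regressors excluding the constant
    u2 = uhat ** 2
    d = np.linalg.lstsq(X, u2, rcond=None)[0]
    fitted = X @ d
    r2 = 1 - np.sum((u2 - fitted) ** 2) / np.sum((u2 - u2.mean()) ** 2)
    F = (r2 / k) / ((1 - r2) / (n - k - 1))
    LM = n * r2
    return F, f_dist.sf(F, k, n - k - 1), LM, chi2.sf(LM, k)
```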

15 The White Test
- The Breusch-Pagan test will detect any linear form of heteroskedasticity
- The White test allows for nonlinearities by using squares and cross products of all the x's
- e.g. with two x's, estimate: û² = δ0 + δ1x1 + δ2x2 + δ3x1² + δ4x2² + δ5x1x2 + error
- Still just using an F or LM statistic to test whether all the xj, xj², and xjxh are jointly significant
- This can become unwieldy pretty quickly as the number of x variables grows.
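
As an illustration only, statsmodels provides a canned White test that builds the squares and cross products internally; the data below are made up and the tooling is an assumption, not part of the slides.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_white

rng = np.random.default_rng(2)
n = 300
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 1 + x1 + x2 + rng.normal(scale=1 + x1 ** 2, size=n)   # nonlinear heteroskedasticity

X = sm.add_constant(np.column_stack([x1, x2]))
res = sm.OLS(y, X).fit()
lm_stat, lm_pval, f_stat, f_pval = het_white(res.resid, X)  # squares and cross products built internally
print(lm_pval, f_pval)
```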

16 Alternate Form of the White Test
- We can get all the squares and cross products on the RHS in an easier way
- Recognize that the fitted values from OLS, the ŷ's, are a function of all the x's
- Thus, ŷ² will be a function of the squares and cross products of the xj's
- So another way to do the White test is to regress the squared residuals on ŷ and ŷ²: û² = δ0 + δ1ŷ + δ2ŷ² + error
- Then, as before, use the R² to form an F or LM statistic
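
A sketch of this special form (regress û² on ŷ and ŷ², then LM = nR² with 2 degrees of freedom). Illustrative numpy/scipy only; X is assumed to include the constant.

```python
import numpy as np
from scipy.stats import chi2

def white_special(y, X):
    """Alternate White test: regress uhat^2 on yhat and yhat^2; LM = n * R^2 ~ chi2(2)."""
    n = len(y)
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    yhat = X @ beta
    uhat2 = (y - yhat) ** 2
    Z = np.column_stack([np.ones(n), yhat, yhat ** 2])
    d = np.linalg.lstsq(Z, uhat2, rcond=None)[0]
    r2 = 1 - np.sum((uhat2 - Z @ d) ** 2) / np.sum((uhat2 - uhat2.mean()) ** 2)
    lm = n * r2
    return lm, chi2.sf(lm, 2)        # 2 restrictions regardless of how many x's
```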

17 Alternate Form (cont)
- Here the null hypothesis is H0: δ1 = 0, δ2 = 0
- So, there are only 2 restrictions regardless of the number of independent variables in the original model
- It is important to test for functional form first; otherwise we could reject H0 even when it is true
- Make sure your functional form is right before doing heteroskedasticity tests
- e.g. don't estimate a model with experience only when you should have experience and experience²: your heteroskedasticity test (using the misspecified model) could lead to rejection of the null when the null is true.

18 Weighted Least Squares
- While it's always possible to estimate robust standard errors for OLS estimates (see above), the OLS estimates themselves may not be efficient
- If we know something about the specific form of the heteroskedasticity, we can obtain more efficient estimates than OLS
- The basic idea is to transform the model into one that has homoskedastic errors
- We call this the weighted least squares approach

19 Heteroskedasticity Is Known up to a Multiplicative Constant
- Suppose the heteroskedasticity can be modeled as Var(u|x) = σ²h(x)
- h(x) is some function of the x's, greater than zero
- Here h(x) is assumed to be known (Case 1)
- The trick is to figure out what h(x) looks like
- Thus, for a random sample from the population: σi² = Var(ui|xi) = σ²hi
- How can we use this information to get homoskedastic errors and therefore estimate the betas more efficiently?

20 Heteroskedasticity Is Known
- First note that E(ui/√hi | x) = 0, since hi is a function of the x's
- And therefore Var(ui/√hi | x) = σ², since Var(ui|x) = σ²hi
- So, if we divide our whole equation by the square root of hi, we have a model where the error is homoskedastic:
yi/√hi = β0/√hi + β1x1i/√hi + ... + βkxki/√hi + ui/√hi

21 Heteroskedasticity Is Known
- yi* = β0xi0* + β1xi1* + ... + βkxik* + ui*
- So, estimate the transformed equation to get more efficient estimates of the parameters
- Notice that the β's have the same interpretation as in the original model
- Example: Cigs = β0 + β1Cigprice + β2Rest + ... + βkIncome + u
- Might think that Var(ui|xi) = σ²·Income, i.e. that h(x) = Income
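
A sketch of the transformed (divide-through-by-√hi) regression for a stripped-down version of the cigarette example, with hypothetical simulated data and h(x) = income assumed known; the variable names and data-generating process are made up for illustration.

```python
import numpy as np

# Hypothetical data: suppose Var(u | x) = sigma^2 * income, so h(x) = income
rng = np.random.default_rng(3)
n = 200
income = rng.uniform(10, 100, n)
cigprice = rng.uniform(2, 6, n)
u = rng.normal(scale=np.sqrt(income))          # error variance proportional to income
cigs = 3 - 0.5 * cigprice + 0.02 * income + u

# Divide every term in the equation (including the constant) by sqrt(h_i) = sqrt(income_i)
w = np.sqrt(income)
y_star = cigs / w
X_star = np.column_stack([1 / w, cigprice / w, income / w])
beta_wls = np.linalg.lstsq(X_star, y_star, rcond=None)[0]
print(beta_wls)    # coefficients keep the interpretation of the original (untransformed) model
```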

22 Generalized Least Squares
- Estimating the transformed equation by OLS is an example of generalized least squares (GLS)
- GLS will be BLUE as long as the original model satisfies the Gauss-Markov assumptions other than homoskedasticity
- GLS is a weighted least squares (WLS) procedure where each squared residual is weighted by the inverse of Var(ui|xi)
- So, less weight is given to observations with a higher error variance

23 Weighted Least Squares
- While it is intuitive to see why performing OLS on a transformed equation is appropriate, it can be tedious to do the transformation
- Weighted least squares is a way of getting the same thing without the transformation
- The idea is to minimize the sum of squared residuals, each weighted by 1/hi: minimize Σ(yi − b0 − b1xi1 − ... − bkxik)²/hi
- This is the same as the transformed regression
- EViews will do this without transforming variables
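
Using the same hypothetical cigarette/income data as the sketch above, here is a short statsmodels illustration that WLS with weights 1/hi reproduces the transformed-regression estimates; statsmodels is an illustrative tool choice, not the course's EViews.

```python
import numpy as np
import statsmodels.api as sm

# Same hypothetical data as before: Var(u | x) = sigma^2 * income, i.e. h_i = income_i
rng = np.random.default_rng(3)
n = 200
income = rng.uniform(10, 100, n)
cigprice = rng.uniform(2, 6, n)
u = rng.normal(scale=np.sqrt(income))
cigs = 3 - 0.5 * cigprice + 0.02 * income + u

X = sm.add_constant(np.column_stack([cigprice, income]))
# WLS with weights 1/h_i minimizes sum (y_i - x_i b)^2 / h_i,
# which is numerically identical to OLS on the equation divided through by sqrt(h_i)
wls_res = sm.WLS(cigs, X, weights=1.0 / income).fit()
print(wls_res.params)                 # matches the transformed-regression estimates above
print(sm.OLS(cigs, X).fit().params)   # plain OLS, for comparison
```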

24 More on WLS
- WLS is great if we know what Var(ui|xi) looks like
- In most cases, we won't know the form of heteroskedasticity (Case 2)

25 Feasible GLS
- In the more typical case where you don't know the form of heteroskedasticity, you can estimate h(xi)
- Typically, we start with a fairly flexible model, such as: Var(u|x) = σ²exp(δ0 + δ1x1 + ... + δkxk), so that h(xi) = exp(δ0 + δ1x1 + ... + δkxk)
- We use exp(·) to ensure that h(xi) > 0
- Since we don't know the δ's, we must estimate them

26 Feasible GLS (continued)
- Our assumption implies that u² = σ²exp(δ0 + δ1x1 + ... + δkxk)v
- where E(v|x) = 1; then, if v is independent of x: ln(u²) = α0 + δ1x1 + ... + δkxk + e
- where E(e) = 0 and e is independent of x
- The intercept has a different interpretation (α0 ≠ δ0)
- Now, we know that û² is an estimate of u², so we can estimate this by OLS: regress log(û²) on the x's

27 Feasible GLS (continued)
- We then obtain the fitted values from this regression, the ĝi
- The estimate of hi is then ĥi = exp(ĝi)
- Now all we need to do is run a WLS regression using 1/ĥi = 1/exp(ĝi) as the weights
Notes:
- FGLS is biased (due to the use of the ĥi) but consistent and asymptotically more efficient than OLS
- Remember, the estimates β̂ give the marginal effect of xj on y, not of xj* on y*
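
A compact sketch of the whole FGLS recipe (illustrative only; X is assumed to include a constant column, and statsmodels is used for convenience rather than the slides' EViews).

```python
import numpy as np
import statsmodels.api as sm

def feasible_gls(y, X):
    """Feasible GLS sketch: estimate h(x) = exp(delta_0 + delta_1*x_1 + ...) from the
    OLS residuals, then run WLS with weights 1/h-hat."""
    ols_res = sm.OLS(y, X).fit()
    log_u2 = np.log(ols_res.resid ** 2)        # regress log(uhat^2) on the x's
    aux = sm.OLS(log_u2, X).fit()
    h_hat = np.exp(aux.fittedvalues)           # h_i-hat = exp(g_i-hat)
    return sm.WLS(y, X, weights=1.0 / h_hat).fit()
```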

28 WLS Wrapup
- F statistics after WLS (whether in R² or SSR form) are a bit tricky
- Need to form the weights from the unrestricted model and use those to estimate WLS for the restricted model as well (i.e. use the same weighted x's)
- The OLS and WLS estimates will differ due to sampling error
- However, if they are very different, it's likely that some other Gauss-Markov assumption is false, e.g. the zero conditional mean assumption
