Introductory Econometrics
1 Introductory Econometrics
Violation of basic assumptions: Heteroskedasticity
Barbara Pertold-Gebicka, CERGE-EI, 16 November 2010
2 OLS assumptions
1. Disturbances are random variables drawn from a normal distribution.
2. The mean of this distribution is zero: E[ε_i] = 0, or E[ε] = 0.
3. The variance of this distribution is constant: Var[ε_i] = σ² (homoskedasticity).
4. Disturbances are not autocorrelated: Cov[ε_i, ε_j] = 0 for i ≠ j.
In matrix notation, assumptions 3 and 4 read Var[ε] = σ²I_n, and assumptions 1-4 can be summarized as ε ~ NID(0, σ²I_n).
5. Disturbances are not correlated with the explanatory variables: cov(x_ik, ε_i) = 0, or Cov[X, ε] = 0 (consistency assumption).
6. Explanatory variables are not linearly dependent (no multicollinearity).
3 When all the assumptions are satisfied
The OLS estimator β̂ is a normally distributed random variable with
E[β̂_k] = β_k, i.e. E[β̂] = β, so the OLS estimator is unbiased;
Var[β̂_k] = σ²/(SST_k (1 − R_k²)), i.e. Var[β̂] = (XᵀX)⁻¹σ², so the OLS estimator is efficient (has the lowest possible variance).
Thus the OLS estimator is BLUE (best linear unbiased estimator), and it is consistent (it remains efficient and unbiased as n → ∞).
4 Homoskedasticity vs. Heteroskedasticity
5 Variance-covariance matrix of the disturbance term
With ε = (ε₁, ε₂, ..., ε_n)ᵀ, the variance-covariance matrix is
Var[ε] = [ Var[ε₁]       Cov[ε₁, ε₂]   ...  Cov[ε₁, ε_n]
           Cov[ε₂, ε₁]   Var[ε₂]       ...  Cov[ε₂, ε_n]
           ...
           Cov[ε_n, ε₁]  Cov[ε_n, ε₂]  ...  Var[ε_n] ]
Homoskedasticity: Var[ε_i] = E[ε_i²] − (E[ε_i])² = E[ε_i²] = σ²
No autocorrelation: Cov[ε_i, ε_j] = E[ε_i ε_j] − E[ε_i]E[ε_j] = E[ε_i ε_j] = 0
Hence Var[ε] = diag(σ², σ², ..., σ²) = σ²I_n
6 Homoskedasticity vs. heteroskedasticity
Homoskedastic disturbance: Var[ε] = diag(σ², σ², ..., σ²) = σ²I_n
Heteroskedastic disturbance: Var[ε] = diag(σ₁², σ₂², ..., σ_n²) = diag(σ²ω₁, σ²ω₂, ..., σ²ω_n) = σ²Ω, or Var[ε_i] = σ²ω_i
7 Picturing heteroskedasticity in the 2-variable case (scatter plot of y against x)
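The picture can be reproduced with a short simulation. The data-generating process below (variable names, coefficients, and the error specification with standard deviation proportional to x) is an illustrative assumption, not the lecture's data; it is a minimal sketch of what the scatter plot shows.

```python
import numpy as np

# Hypothetical simulation: errors whose spread grows with x,
# producing the "fan" shape of a heteroskedastic scatter plot.
rng = np.random.default_rng(0)
n = 200
x = rng.uniform(1, 10, n)
eps = rng.normal(0, 0.5 * x)        # sd of eps_i proportional to x_i
y = 1.0 + 2.0 * x + eps

# The spread of y around the regression line widens with x:
lo = y[x < 3].std()                  # variability in the low-x group
hi = y[x > 8].std()                  # variability in the high-x group
print(lo < hi)                       # the high-x group is noisier
```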
8 What happens to OLS estimates under heteroskedasticity?
y_i = β₀ + β₁x_i1 + β₂x_i2 + ... + β_k x_ik + ε_i, or y = Xβ + ε,
where E[x_ik ε_i] = 0 and E[ε_i ε_j] = 0, but Var[ε_i] = σ²ω_i ≠ const; in matrix form E[Xᵀε] = 0, but Var[ε] = σ²Ω.
OLS estimate: β̂ = (XᵀX)⁻¹Xᵀy
E[β̂] = β (by the assumption E[ε|X] = 0), so β̂ is unbiased.
Var[β̂] = (XᵀX)⁻¹Xᵀ Var[ε] X(XᵀX)⁻¹ = (XᵀX)⁻¹Xᵀσ²ΩX(XᵀX)⁻¹,
so Var[β̂] is different than (XᵀX)⁻¹σ².
9 What happens to OLS estimates under heteroskedasticity?
Var[β̂] = σ²(XᵀX)⁻¹XᵀΩX(XᵀX)⁻¹
We used to estimate the OLS estimator variance by
Var̂[β̂] = σ̂²(XᵀX)⁻¹ = s²(XᵀX)⁻¹, with s² = eᵀe/(n − k − 1).
Is this a good estimator of Var[β̂]? Consider
E[Var̂[β̂]] − Var[β̂] = E[s²(XᵀX)⁻¹] − σ²(XᵀX)⁻¹XᵀΩX(XᵀX)⁻¹.
Even in large samples, where s² approaches σ², this difference is σ²(XᵀX)⁻¹ − σ²(XᵀX)⁻¹XᵀΩX(XᵀX)⁻¹ ≠ 0 whenever cov(ω, X) ≠ 0.
10 What happens to OLS estimates under heteroskedasticity?
The standard estimates of the OLS standard errors are biased in small samples.
They are biased even in large samples if the heteroskedasticity is correlated with some explanatory variables.
We cannot perform reliable hypothesis testing.
The OLS estimator is no longer BLUE (best linear unbiased estimator).
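A small Monte Carlo makes the bias concrete. This is a sketch under an assumed data-generating process (error sd proportional to x, so ω is correlated with the regressor): it compares the conventional formula s²(XᵀX)⁻¹ with the actual sampling variance of the slope across repeated samples.

```python
import numpy as np

# Monte Carlo sketch (hypothetical setup): under heteroskedasticity the
# conventional OLS variance formula misstates the true sampling variance
# of the slope, because Var(eps_i) rises with x_i here.
rng = np.random.default_rng(1)
n, reps = 100, 2000
x = rng.uniform(1, 10, n)                 # fixed design across replications
X = np.column_stack([np.ones(n), x])
XtX_inv = np.linalg.inv(X.T @ X)

betas, conv_var = [], []
for _ in range(reps):
    eps = rng.normal(0, x)                # error sd proportional to x
    y = 1.0 + 2.0 * x + eps
    b = XtX_inv @ X.T @ y                 # OLS coefficients
    e = y - X @ b
    s2 = e @ e / (n - 2)
    betas.append(b[1])
    conv_var.append(s2 * XtX_inv[1, 1])   # conventional Var-hat of slope

true_var = np.var(betas)                  # empirical sampling variance
avg_conv = np.mean(conv_var)              # what the usual formula reports
print(true_var, avg_conv)                 # the two differ noticeably
```

In this design the conventional formula understates the slope variance, because the error variance is largest where the regressor is most informative.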
11 Example: marginal propensity to save
Model 1: OLS estimates using the 100 observations
Dependent variable: sav
Regressors: const, inc, size, educ (the coefficient, std. error, t-ratio and p-value columns were not preserved in the transcription)
Unadjusted R-squared, Adjusted R-squared (values not preserved)
F-statistic (3, 96) = 2.8957 (p-value = 0.045)
12 Heteroskedasticity-robust standard errors
The OLS estimator β̂ is unbiased even under heteroskedasticity; the only thing we need to be careful about are the standard errors of the coefficients.
Knowing that Var[β̂] = (XᵀX)⁻¹Xᵀσ²ΩX(XᵀX)⁻¹ rather than Var[β̂] = σ²(XᵀX)⁻¹, let us find a consistent estimator of Var[β̂].
White (1980) showed that (1/n)Xᵀσ²ΩX can be consistently estimated by (1/n)Σ_i e_i² x_i x_iᵀ.
The heteroskedasticity-robust variance of the OLS estimator is therefore estimated as
Var̂[β̂] = (XᵀX)⁻¹ (Σ_i e_i² x_i x_iᵀ) (XᵀX)⁻¹.
13 Heteroskedasticity-robust standard errors
Var[β̂] = (XᵀX)⁻¹Xᵀσ²ΩX(XᵀX)⁻¹
Writing x_iᵀ = (x_i1, ..., x_ik) for the i-th row of X,
Xᵀσ²ΩX = Σ_i σ²ω_i x_i x_iᵀ = Σ_i σ_i² x_i x_iᵀ,
and Σ_i e_i² x_i x_iᵀ is a good estimate of this sum, giving
Var̂[β̂] = (XᵀX)⁻¹ (Σ_i e_i² x_i x_iᵀ) (XᵀX)⁻¹.
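The sandwich formula above translates almost line by line into NumPy. The function and the simulated data below are an illustrative sketch (the design matrix, names, and data-generating process are assumptions, not the lecture's example).

```python
import numpy as np

# A minimal sketch of White's heteroskedasticity-robust ("sandwich")
# variance estimator: bread = (X'X)^{-1}, meat = sum_i e_i^2 x_i x_i'.
def hc0_robust_cov(X, y):
    XtX_inv = np.linalg.inv(X.T @ X)
    b = XtX_inv @ X.T @ y                 # OLS coefficients
    e = y - X @ b                         # OLS residuals
    meat = X.T @ (e[:, None] ** 2 * X)    # sum_i e_i^2 x_i x_i'
    return b, XtX_inv @ meat @ XtX_inv    # bread * meat * bread

# Illustrative heteroskedastic data (assumed, not from the slides):
rng = np.random.default_rng(2)
n = 500
x = rng.uniform(1, 10, n)
X = np.column_stack([np.ones(n), x])
y = 1.0 + 2.0 * x + rng.normal(0, x)      # error sd grows with x
b, V = hc0_robust_cov(X, y)
print(b, np.sqrt(np.diag(V)))             # coefficients and robust SEs
```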
14 Example: marginal propensity to save
Model 2: OLS estimates using the 100 observations, heteroskedasticity-robust standard errors
Dependent variable: sav
Regressors: const, inc, size, educ (the coefficient, std. error, t-ratio and p-value columns were not preserved in the transcription)
F-statistic (3, 96) = 2.3654 (p-value not preserved)
15 Why don't we always apply heteroskedasticity-robust standard errors?
Robust standard errors can be used for valid hypothesis testing only in large samples:
t = (β̂ − β)/se(β̂) → t(n − k − 1) as n → ∞.
In small samples robust t-statistics might be distributed differently.
We prefer to use robust standard errors only where the presence of heteroskedasticity is justified, so we would like to test whether heteroskedasticity is present.
16 Testing for heteroskedasticity
y_i = β₀ + β₁x_i1 + β₂x_i2 + ... + β_k x_ik + ε_i
Under homoskedasticity (H₀): Var[ε] = σ²I, or Var(ε_i) = σ² (constant).
Under heteroskedasticity (H_A): Var[ε] = σ²Ω, or Var(ε_i) ≠ const.
Let us remember that all these assumptions are conditional on the explanatory variables, i.e.:
Under homoskedasticity (H₀): Var[ε|X] = σ²I, or Var(ε_i|x_i) = σ² (constant).
Under heteroskedasticity (H_A): Var[ε|X] = σ²Ω, or Var(ε_i|x_i) ≠ const.
17 Testing for heteroskedasticity
Under homoskedasticity (H₀): Var[ε|X] = σ²I, or Var(ε_i|x_i) = σ² (constant).
Under heteroskedasticity (H_A): Var[ε|X] = σ²Ω, or Var(ε_i|x_i) ≠ const.
Another assumption states that E[ε_i|x_i] = 0, thus
Var(ε_i|x_i) = E[ε_i²|x_i] − (E[ε_i|x_i])² = E[ε_i²|x_i].
H₀: E[ε_i²|x_i] = const; H_A: E[ε_i²|x_i] ≠ const.
Estimate ε_i by the residuals e_i and find out whether E[e_i²|x_i] = const.
18 Testing for heteroskedasticity
H₀: E[ε_i²|x_i] = const; H_A: E[ε_i²|x_i] ≠ const.
Estimate ε_i by the residuals e_i and find out whether E[e_i²|x_i] = const by running the auxiliary regression
e_i² = δ₀ + δ₁x_i1 + δ₂x_i2 + ... + δ_k x_ik + u_i.
H₀: δ₁ = δ₂ = ... = δ_k = 0; H_A: at least one of the deltas is significant.
Use the F-test for the overall significance of this regression:
F = [(SSR_R − SSR_U)/k] / [SSR_U/(n − k − 1)] = [R_u²/k] / [(1 − R_u²)/(n − k − 1)]
(because SSR_R = SST_R, i.e. R_R² = 0).
19 Testing for heteroskedasticity
e_i² = δ₀ + δ₁x_i1 + δ₂x_i2 + ... + δ_k x_ik + u_i
H₀ (homoskedasticity): δ₁ = δ₂ = ... = δ_k = 0.
H_A (heteroskedasticity): at least one of the deltas is significant.
Test statistic: F = [R_u²/k] / [(1 − R_u²)/(n − k − 1)] ~ F(k, n − k − 1).
We reject the null hypothesis (reject homoskedasticity) if the test statistic exceeds the appropriate critical value.
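The auxiliary-regression F test can be sketched in a few lines of NumPy. The data-generating process, variable names, and the hard-coded 3.89 cutoff (an approximate 5% critical value of F(1, 198); in practice one would look up or compute the exact p-value) are illustrative assumptions.

```python
import numpy as np

# Sketch of the F test above: regress squared OLS residuals on the
# regressors and test their joint significance (illustrative data).
rng = np.random.default_rng(3)
n = 200
x = rng.uniform(1, 10, n)
X = np.column_stack([np.ones(n), x])
y = 1.0 + 2.0 * x + rng.normal(0, x)          # heteroskedastic by design

# Step 1: OLS residuals from the original regression, squared
b = np.linalg.lstsq(X, y, rcond=None)[0]
e2 = (y - X @ b) ** 2

# Step 2: auxiliary regression of e^2 on the regressors, get R^2
d = np.linalg.lstsq(X, e2, rcond=None)[0]
u = e2 - X @ d
R2 = 1 - (u @ u) / np.sum((e2 - e2.mean()) ** 2)

# Step 3: F = (R2/k) / ((1-R2)/(n-k-1)), here k = 1 regressor
k = 1
F = (R2 / k) / ((1 - R2) / (n - k - 1))
F_crit = 3.89   # approximate 5% critical value of F(1, 198)
print(F, F > F_crit)                           # large F rejects H0
```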
20 Example: marginal propensity to save
Model 4: OLS estimates using the 100 observations
Dependent variable: e_sq (the squared OLS residuals)
Regressors: const, inc, size, educ (the coefficient, std. error, t-ratio and p-value columns were only partially preserved in the transcription)
F-statistic (3, 96) (p-value = 0.158)
With a p-value of 0.158 we do not reject the null of homoskedasticity at conventional significance levels.
21 Heteroskedasticity: summary
In small samples heteroskedasticity always means that OLS estimates of the coefficients' standard errors are biased.
In large samples the OLS estimates of the coefficients' standard errors are biased only if the heteroskedasticity is correlated with the explanatory variables.
Heteroskedasticity-robust standard errors might not produce a t-distribution in small samples; in large samples they do.
Especially in small samples we would like to test for the presence of heteroskedasticity before applying robust standard errors.
22 Special form of heteroskedasticity
Heteroskedasticity is problematic when correlated with X. We can model this relationship by Var(ε_i|x_i) = σ²h(x_i), with h(x_i) > 0.
Assume for now that we know h(x_i). The model is
y_i = β₀ + β₁x_i1 + β₂x_i2 + ... + β_k x_ik + ε_i,
where E[ε|X] = 0, E[ε_i ε_j|X] = 0 for all i ≠ j, and Var[ε|x] = σ²h(x).
Note that although Var[ε|x] = E[ε²|x] = σ²h(x), we have
Var[ε/√h(x) | x] = (1/h(x)) Var[ε|x] = (1/h(x)) σ²h(x) = σ².
Moreover, E[ε/√h(x) | x] = (1/√h(x)) E[ε|x] = (1/√h(x)) · 0 = 0.
23 Special form of heteroskedasticity
y_i = β₀ + β₁x_i1 + β₂x_i2 + ... + β_k x_ik + ε_i,
where Var[ε_i/√h(x_i) | x] = σ² and E[ε_i/√h(x_i) | x] = 0.
Dividing through by √h(x_i):
y_i/√h(x_i) = β₀·(1/√h(x_i)) + β₁·x_i1/√h(x_i) + ... + β_k·x_ik/√h(x_i) + ε_i/√h(x_i),
or y_i* = β₀x_i0* + β₁x_i1* + ... + β_k x_ik* + ε_i*.
The transformed model satisfies the assumptions E[ε*|X*] = 0, E[ε_i*ε_j*|X*] = 0 for all i ≠ j, and Var[ε*|X*] = σ².
24 Weighted Least Squares (WLS)
y_i/√h(x_i) = β₀·(1/√h(x_i)) + β₁·x_i1/√h(x_i) + ... + β_k·x_ik/√h(x_i) + ε_i/√h(x_i)
y_i* = β₀x_i0* + β₁x_i1* + ... + β_k x_ik* + ε_i*
The OLS estimates of this transformed equation (β̂₀, β̂₁, ..., β̂_k) are called the WLS estimates.
Each observation (including the constant term) is weighted by 1/√h(x_i).
WLS estimators are more efficient than OLS estimators in the presence of heteroskedasticity.
Weighted Least Squares (WLS) estimation is a special case of Generalized Least Squares (GLS) estimation.
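The transformation above is easy to sketch in code. Here the heteroskedasticity form is assumed known, Var(ε_i|x_i) = σ²x_i², i.e. h(x_i) = x_i²; this choice and the simulated data are illustrative assumptions.

```python
import numpy as np

# WLS sketch assuming a known heteroskedasticity function h(x_i) = x_i^2:
# weight every row (including the constant column) by 1/sqrt(h(x_i)).
rng = np.random.default_rng(4)
n = 300
x = rng.uniform(1, 10, n)
X = np.column_stack([np.ones(n), x])
y = 1.0 + 2.0 * x + rng.normal(0, x)      # Var(eps_i) = sigma^2 * x_i^2

w = 1.0 / np.sqrt(x ** 2)                 # 1 / sqrt(h(x_i))
Xs, ys = X * w[:, None], y * w            # transformed (starred) variables
b_wls = np.linalg.lstsq(Xs, ys, rcond=None)[0]
print(b_wls)                              # close to the true (1, 2)
```

Running OLS on the starred variables is exactly the WLS estimator of the original equation; the transformed errors are homoskedastic by construction.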
25 Feasible GLS: estimating the heteroskedasticity function
In the above example we knew the form of heteroskedasticity; usually we do not. We can assume some general functional form and estimate it using the data.
The assumed functional form of heteroskedasticity:
Var(ε_i|x_i) = σ² exp(δ₀ + δ₁x_i1 + δ₂x_i2 + ... + δ_k x_ik),
i.e. we assume h(x_i) = exp(δ₀ + δ₁x_i1 + ... + δ_k x_ik).
We use the exponential function to assure that h(x_i) is positive.
Since Var(ε_i|x_i) = E[ε_i²|x_i], we can write
ε_i² = σ² exp(δ₀ + δ₁x_i1 + ... + δ_k x_ik) · v_i,
where v_i is a random variable with E[v_i|x_i] = 1; replacing ε_i by the residual e_i gives
e_i² = σ² exp(δ₀ + δ₁x_i1 + ... + δ_k x_ik) · v_i.
26 Feasible GLS: estimating the heteroskedasticity function
Original regression: y_i = β₀ + β₁x_i1 + ... + β_k x_ik + ε_i
Heteroskedasticity form: Var(ε_i|x_i) = σ² exp(δ₀ + δ₁x_i1 + ... + δ_k x_ik), so
e_i² = σ² exp(δ₀ + δ₁x_i1 + ... + δ_k x_ik) · v_i.
Assuming v_i is independent of x_i and taking logs:
log(e_i²) = [log(σ²) + δ₀] + δ₁x_i1 + ... + δ_k x_ik + log(v_i)
= α₀ + δ₁x_i1 + ... + δ_k x_ik + u_i,
where E[u_i|x_i] = 0, because E[v_i|x_i] = 1 and log(1) = 0.
27 Feasible GLS: estimating the heteroskedasticity function
Original regression: y_i = β₀ + β₁x_i1 + ... + β_k x_ik + ε_i, with Var(ε_i) = σ²h(x_i),
h(x_i) = exp(δ₀ + δ₁x_i1 + ... + δ_k x_ik), so log[h(x_i)] = δ₀ + δ₁x_i1 + ... + δ_k x_ik.
We can estimate h(x_i) by regressing the log of the squared residuals e_i² from the original regression on all explanatory variables:
log(e_i²) = α₀ + δ₁x_i1 + ... + δ_k x_ik + u_i.
Note that the fitted values, denoted l̂og(e_i²), are estimates of α₀ + δ₁x_i1 + ... + δ_k x_ik, and thus
exp(l̂og(e_i²)) = σ²ĥ(x_i) can then be used to estimate the original equation by WLS.
28 Feasible GLS: the procedure
1. Estimate the original equation y_i = β₀ + β₁x_i1 + ... + β_k x_ik + ε_i by OLS and record the residuals e_i.
2. Calculate log(e_i²).
3. Estimate the regression log(e_i²) = α₀ + δ₁x_i1 + ... + δ_k x_ik + u_i by OLS and record the fitted values l̂og(e_i²).
4. Calculate ĥ(x_i) = exp(l̂og(e_i²)), i.e. estimate h(x_i) by the exponential of the fitted values.
5. Finally, estimate the original equation by WLS, using 1/√ĥ(x_i) as weights.
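The five steps above can be sketched end to end. The data-generating process (an exponential variance function with assumed coefficients) and all names are illustrative; the procedure itself follows the steps just listed.

```python
import numpy as np

# Feasible GLS sketch: estimate h(x) from the regression of log(e^2)
# on the regressors, then reweight and re-estimate by WLS.
rng = np.random.default_rng(5)
n = 400
x = rng.uniform(1, 10, n)
X = np.column_stack([np.ones(n), x])
y = 1.0 + 2.0 * x + rng.normal(0, np.exp(0.2 * x))   # exponential het.

# Steps 1-2: OLS, residuals, log of squared residuals
b_ols = np.linalg.lstsq(X, y, rcond=None)[0]
log_e2 = np.log((y - X @ b_ols) ** 2)

# Step 3: regress log(e^2) on the regressors, record fitted values
g = np.linalg.lstsq(X, log_e2, rcond=None)[0]
fitted = X @ g

# Step 4: h-hat(x_i) = exponential of the fitted values
h_hat = np.exp(fitted)

# Step 5: WLS with weights 1/sqrt(h-hat(x_i))
w = 1.0 / np.sqrt(h_hat)
b_fgls = np.linalg.lstsq(X * w[:, None], y * w, rcond=None)[0]
print(b_ols, b_fgls)     # both consistent; FGLS typically more precise
```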
29 Summary
What is heteroskedasticity?
What happens to OLS estimates under heteroskedasticity?
How to test for the presence of heteroskedasticity?
Three methods to deal with heteroskedasticity: heteroskedasticity-robust standard errors, Weighted Least Squares (Generalized Least Squares), and Feasible Generalized Least Squares.
More informationSingle-Equation GMM: Endogeneity Bias
Single-Equation GMM: Lecture for Economics 241B Douglas G. Steigerwald UC Santa Barbara January 2012 Initial Question Initial Question How valuable is investment in college education? economics - measure
More informationHeteroskedasticity and Autocorrelation
Lesson 7 Heteroskedasticity and Autocorrelation Pilar González and Susan Orbe Dpt. Applied Economics III (Econometrics and Statistics) Pilar González and Susan Orbe OCW 2014 Lesson 7. Heteroskedasticity
More informationMa 3/103: Lecture 24 Linear Regression I: Estimation
Ma 3/103: Lecture 24 Linear Regression I: Estimation March 3, 2017 KC Border Linear Regression I March 3, 2017 1 / 32 Regression analysis Regression analysis Estimate and test E(Y X) = f (X). f is the
More informationECON2228 Notes 7. Christopher F Baum. Boston College Economics. cfb (BC Econ) ECON2228 Notes / 41
ECON2228 Notes 7 Christopher F Baum Boston College Economics 2014 2015 cfb (BC Econ) ECON2228 Notes 6 2014 2015 1 / 41 Chapter 8: Heteroskedasticity In laying out the standard regression model, we made
More informationHeteroskedasticity. We now consider the implications of relaxing the assumption that the conditional
Heteroskedasticity We now consider the implications of relaxing the assumption that the conditional variance V (u i x i ) = σ 2 is common to all observations i = 1,..., In many applications, we may suspect
More informationLECTURE 10: MORE ON RANDOM PROCESSES
LECTURE 10: MORE ON RANDOM PROCESSES AND SERIAL CORRELATION 2 Classification of random processes (cont d) stationary vs. non-stationary processes stationary = distribution does not change over time more
More informationLecture 4: Linear panel models
Lecture 4: Linear panel models Luc Behaghel PSE February 2009 Luc Behaghel (PSE) Lecture 4 February 2009 1 / 47 Introduction Panel = repeated observations of the same individuals (e.g., rms, workers, countries)
More informationThe returns to schooling, ability bias, and regression
The returns to schooling, ability bias, and regression Jörn-Steffen Pischke LSE October 4, 2016 Pischke (LSE) Griliches 1977 October 4, 2016 1 / 44 Counterfactual outcomes Scholing for individual i is
More informationWe begin by thinking about population relationships.
Conditional Expectation Function (CEF) We begin by thinking about population relationships. CEF Decomposition Theorem: Given some outcome Y i and some covariates X i there is always a decomposition where
More information1. The Multivariate Classical Linear Regression Model
Business School, Brunel University MSc. EC550/5509 Modelling Financial Decisions and Markets/Introduction to Quantitative Methods Prof. Menelaos Karanasos (Room SS69, Tel. 08956584) Lecture Notes 5. The
More informationEconomics 326 Methods of Empirical Research in Economics. Lecture 14: Hypothesis testing in the multiple regression model, Part 2
Economics 326 Methods of Empirical Research in Economics Lecture 14: Hypothesis testing in the multiple regression model, Part 2 Vadim Marmer University of British Columbia May 5, 2010 Multiple restrictions
More informationEconometrics. 7) Endogeneity
30C00200 Econometrics 7) Endogeneity Timo Kuosmanen Professor, Ph.D. http://nomepre.net/index.php/timokuosmanen Today s topics Common types of endogeneity Simultaneity Omitted variables Measurement errors
More informationRegression and Statistical Inference
Regression and Statistical Inference Walid Mnif wmnif@uwo.ca Department of Applied Mathematics The University of Western Ontario, London, Canada 1 Elements of Probability 2 Elements of Probability CDF&PDF
More information1 Correlation between an independent variable and the error
Chapter 7 outline, Econometrics Instrumental variables and model estimation 1 Correlation between an independent variable and the error Recall that one of the assumptions that we make when proving the
More informationMultiple Regression Analysis. Part III. Multiple Regression Analysis
Part III Multiple Regression Analysis As of Sep 26, 2017 1 Multiple Regression Analysis Estimation Matrix form Goodness-of-Fit R-square Adjusted R-square Expected values of the OLS estimators Irrelevant
More informationEcon 836 Final Exam. 2 w N 2 u N 2. 2 v N
1) [4 points] Let Econ 836 Final Exam Y Xβ+ ε, X w+ u, w N w~ N(, σi ), u N u~ N(, σi ), ε N ε~ Nu ( γσ, I ), where X is a just one column. Let denote the OLS estimator, and define residuals e as e Y X.
More information1. You have data on years of work experience, EXPER, its square, EXPER2, years of education, EDUC, and the log of hourly wages, LWAGE
1. You have data on years of work experience, EXPER, its square, EXPER, years of education, EDUC, and the log of hourly wages, LWAGE You estimate the following regressions: (1) LWAGE =.00 + 0.05*EDUC +
More information