Model Mis-specification


1 Model Mis-specification
Carlo Favero

2 Model Mis-specification
Each specification can be interpreted as the result of a reduction process. What happens if the reduction process that has generated E(y | X) omits some relevant information? We shall consider two general cases of mis-specification:
- mis-specification related to the choice of variables included in the regression;
- mis-specification related to wrong assumptions on the properties of the error terms.

3 Choice of variables
- under-parameterization (the estimated model omits variables included in the DGP);
- over-parameterization (the estimated model includes more variables than the DGP).

4 Under-parameterization
Given the DGP
y = X_1 β_1 + X_2 β_2 + ε,   (1)
for which the standard hypotheses on the error term hold, the following model is estimated:
y = X_1 β_1 + ν.   (2)
The OLS estimates are given by the following expression:
β̂_1^up = (X_1'X_1)^{-1} X_1'y,   (3)

5 Under-parameterization
while the OLS estimates obtained by estimating the DGP are:
β̂_1 = (X_1'M_2 X_1)^{-1} X_1'M_2 y,   (4)
where M_2 = I - X_2(X_2'X_2)^{-1}X_2'. The estimates in (4) are best linear unbiased estimators (BLUE) by construction, while the estimates in (3) are biased unless X_1 and X_2 are uncorrelated. To show this, consider:
β̂_1 = (X_1'X_1)^{-1}(X_1'y - X_1'X_2 β̂_2)   (5)
     = β̂_1^up - D β̂_2,   (6)
where D = (X_1'X_1)^{-1}X_1'X_2 is the matrix of coefficients in the regression of X_2 on X_1, and β̂_2 is the OLS estimator of β_2 obtained by fitting the DGP.

6 Illustration
Given the DGP
y = 0.5 X_1 + 0.5 X_2 + ε_1,   (7)
X_2 = 0.8 X_1 + ε_2,   (8)
the following model is estimated by OLS:
y = X_1 β_1 + ν.   (9)
(a) The OLS estimate of β_1 will be 0.5
(b) The OLS estimate of β_1 will be 0
(c) The OLS estimate of β_1 will be 0.9
(d) The OLS estimate of β_1 will have a mean of 0.9
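A quick Monte Carlo makes the answer concrete. The sketch below is purely illustrative and assumes the DGP coefficients reconstructed above (0.5 on each regressor, X_2 = 0.8 X_1 + ε_2, standard normal errors); the under-parameterized OLS estimate of β_1 is centred on 0.5 + 0.8 x 0.5 = 0.9, i.e. answer (d).

```python
import numpy as np

# Monte Carlo check of the omitted-variable bias in the illustration above.
# Assumed DGP (reconstructed): y = 0.5*x1 + 0.5*x2 + e1, with x2 = 0.8*x1 + e2.
rng = np.random.default_rng(0)
T, n_rep = 200, 2000
estimates = np.empty(n_rep)

for r in range(n_rep):
    x1 = rng.standard_normal(T)
    x2 = 0.8 * x1 + rng.standard_normal(T)
    y = 0.5 * x1 + 0.5 * x2 + rng.standard_normal(T)
    # Under-parameterized regression of y on x1 only (no intercept, as in the slide)
    estimates[r] = (x1 @ y) / (x1 @ x1)

print(estimates.mean())  # close to 0.5 + 0.8*0.5 = 0.9, i.e. answer (d)
```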

7 Under-parameterization
Note that if E(y | X_1, X_2) = X_1 β_1 + X_2 β_2 and E(X_2 | X_1) = X_1 D, then
E(y | X_1) = X_1 β_1 + X_1 D β_2 = X_1 α.
Therefore the OLS estimator in the under-parameterized model is a biased estimator of β_1, but an unbiased estimator of α. Hence, if the objective of the model is forecasting and X_1 is more easily observed than X_2, the under-parameterized model can be safely used. On the other hand, if the objective of the model is to test specific predictions on the parameters, the use of the under-parameterized model delivers biased results.

8 Over-parameterization
Given the DGP
y = X_1 β_1 + ε,   (10)
for which the standard hypotheses on the error term hold, the following model is estimated:
y = X_1 β_1 + X_2 β_2 + v.   (11)
The OLS estimator of β_1 in the over-parameterized model is
β̂_1^op = (X_1'M_2 X_1)^{-1} X_1'M_2 y,   (12)

9 Over-parameterization
while, by estimating the DGP, we obtain:
β̂_1 = (X_1'X_1)^{-1} X_1'y.   (13)
By substituting y from the DGP, one finds that both estimators are unbiased; the difference now lies in their variances. In fact we have:
var(β̂_1^op | X_1, X_2) = σ² (X_1'M_2 X_1)^{-1},   (14)
var(β̂_1 | X_1, X_2) = σ² (X_1'X_1)^{-1}.   (15)

10 Over-parameterization
Remember that if two matrices A and B are positive definite and A - B is positive semidefinite, then the matrix B^{-1} - A^{-1} is also positive semidefinite. We therefore have to show that X_1'X_1 - X_1'M_2 X_1 is a positive semidefinite matrix. The result is almost immediate:
X_1'X_1 - X_1'M_2 X_1 = X_1'(I - M_2) X_1 = X_1'Q_2 X_1 = X_1'Q_2'Q_2 X_1,
where Q_2 = X_2(X_2'X_2)^{-1}X_2' is symmetric and idempotent, so the difference is positive semidefinite. We conclude that over-parameterization reduces the efficiency of the estimators and the power of tests of hypotheses.
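The efficiency loss can be seen in a minimal simulation sketch (the DGP, sample size, and the 0.8 correlation between the regressors are illustrative assumptions): both estimators of β_1 are centred on the true value, but the over-parameterized one has the larger sampling variance.

```python
import numpy as np

# Compare the sampling variance of the OLS estimator of beta_1 in the true model
# (y on x1 alone) and in an over-parameterized model (y on x1 and an irrelevant x2).
rng = np.random.default_rng(1)
T, n_rep = 100, 5000
b_true, b_over = np.empty(n_rep), np.empty(n_rep)

for r in range(n_rep):
    x1 = rng.standard_normal(T)
    x2 = 0.8 * x1 + rng.standard_normal(T)   # irrelevant but correlated regressor
    y = 1.0 * x1 + rng.standard_normal(T)    # DGP: y = x1 + eps, beta_1 = 1
    X = np.column_stack([x1, x2])
    b_true[r] = (x1 @ y) / (x1 @ x1)
    b_over[r] = np.linalg.solve(X.T @ X, X.T @ y)[0]

print(b_true.mean(), b_over.mean())  # both close to 1: unbiased
print(b_true.var(), b_over.var())    # over-parameterized variance is larger
```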

11 Heteroscedasticity, Autocorrelation, and the GLS estimator
Let us reconsider the single-equation model and generalize it to the case in which the hypotheses of diagonality and constancy of the conditional variance-covariance matrix of the residuals do not hold:
y = Xβ + ε,   (16)
ε ~ n.d.(0, σ² Ω),
where Ω is a (T x T) symmetric and positive definite matrix. When the OLS method is applied to model (16), it delivers estimators which are consistent but not efficient; moreover, the traditional formula for the variance-covariance matrix of the OLS estimator, σ²(X'X)^{-1}, is wrong and leads to incorrect inference.

12 Heteroscedasticity, Autocorrelation, and the GLS estimator
Using standard algebra, it can be shown that the correct formula for the variance-covariance matrix of the OLS estimator is:
σ² (X'X)^{-1} X'ΩX (X'X)^{-1}.
To find a general solution to this problem, remember that the inverse of a symmetric positive definite matrix is also symmetric and positive definite, and that for the symmetric positive definite matrix Ω there always exists a (T x T) non-singular matrix K such that
K'K = Ω^{-1} and KΩK' = I_T.

13 Heteroscedasticity, Autocorrelation, and the GLS estimator
Consider the regression model obtained by pre-multiplying both sides of (16) by K:
Ky = KXβ + Kε,   (17)
Kε ~ n.d.(0, σ² I_T).
The OLS estimator of the parameters of the transformed model (17) satisfies all the conditions for the application of the Gauss-Markov theorem; therefore, the estimator
β̂_GLS = (X'K'KX)^{-1} X'K'Ky = (X'Ω^{-1}X)^{-1} X'Ω^{-1}y,
known as the generalised least squares (GLS) estimator, is BLUE. The variance of the GLS estimator, conditional upon X, becomes
Var(β̂_GLS | X) = Σ_β̂ = σ² (X'Ω^{-1}X)^{-1}.
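A minimal sketch of the GLS estimator via the transformed model, assuming Ω is known; K is taken from a Cholesky factor of Ω^{-1}, so that K'K = Ω^{-1} (function and variable names are illustrative):

```python
import numpy as np

# Minimal sketch of the GLS estimator via the transformed model (17), assuming
# Omega is known. K is built from a Cholesky factor of Omega^{-1}, so K'K = Omega^{-1}.
def gls(y, X, Omega):
    Omega_inv = np.linalg.inv(Omega)
    K = np.linalg.cholesky(Omega_inv).T              # K'K = Omega^{-1}, K Omega K' = I
    y_t, X_t = K @ y, K @ X                          # transformed model: Ky = KX b + K eps
    beta = np.linalg.solve(X_t.T @ X_t, X_t.T @ y_t) # OLS on the transformed model
    resid = y_t - X_t @ beta
    sigma2 = resid @ resid / (len(y) - X.shape[1])
    return beta, sigma2 * np.linalg.inv(X.T @ Omega_inv @ X)  # beta_GLS, Var(beta_GLS | X)
```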

14 Heteroscedasticity, Autocorrelation, and the GLS estimator
Consider, for example, the variances of the OLS and the GLS estimators. Using the fact that if A and B are positive definite and A - B is positive semidefinite, then B^{-1} - A^{-1} is also positive semidefinite, we have:
X'Ω^{-1}X - (X'X)(X'ΩX)^{-1}(X'X)
= X'K'KX - (X'X)(X'K^{-1}(K')^{-1}X)^{-1}(X'X)
= X'K'( I - (K')^{-1}X (X'K^{-1}(K')^{-1}X)^{-1} X'K^{-1} ) KX
= X'K'M_W M_W KX,
where
M_W = I - W(W'W)^{-1}W',   (18)
W = (K')^{-1}X.   (19)
The last expression is positive semidefinite, so the GLS estimator is at least as efficient as the OLS estimator.

15 Heteroscedasticity, Autocorrelation, and the GLS estimator
The applicability of the GLS estimator requires an empirical specification for the matrix K. We consider here three specific applications where the appropriate choice of such a matrix solves the problems in the OLS estimator generated, respectively,
- by the presence of first-order serial correlation in the residuals,
- by the presence of heteroscedasticity in the residuals,
- by the presence of both.

16 Correction for Serial Correlation (Cochrane-Orcutt)
Consider first the case of first-order serial correlation in the residuals. We have the following model:
y_t = x_t'β + u_t,
u_t = ρ u_{t-1} + ε_t,   ε_t ~ n.i.d.(0, σ²_ε),
which, using our general notation, can be rewritten as:
y = Xβ + u,   (20)
u ~ n.d.(0, σ² Ω),   σ² = σ²_ε / (1 - ρ²),   (21)

17 Correction for Serial Correlation (Cochrane-Orcutt)
Ω = [ 1         ρ         ρ²       ...   ρ^{T-1} ]
    [ ρ         1         ρ        ...   ρ^{T-2} ]
    [ ...       ...       ...      ...   ...     ]
    [ ρ^{T-2}   ...       ρ        1     ρ       ]
    [ ρ^{T-1}   ρ^{T-2}   ...      ρ     1       ]
In this case, knowledge of the parameter ρ allows the empirical implementation of the GLS estimator. An intuitive procedure to implement it is the following:
1. estimate the vector β by OLS and save the vector of residuals û_t;
2. regress û_t on û_{t-1} to obtain an estimate ρ̂ of ρ;
3. construct the transformed model and regress (y_t - ρ̂ y_{t-1}) on (x_t - ρ̂ x_{t-1}) to obtain the GLS estimator of the vector of parameters of interest.
Note that the above procedure, known as the Cochrane-Orcutt procedure, can be repeated until convergence.
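A compact sketch of the iterated procedure follows. Variable names, the stopping rule, and the treatment of the first observation are illustrative assumptions (the first observation is simply dropped, rather than rescaled as in the Prais-Winsten variant).

```python
import numpy as np

# Sketch of the iterated Cochrane-Orcutt procedure described above.
# y is a (T,) vector, X a (T, k) matrix.
def cochrane_orcutt(y, X, max_iter=50, tol=1e-8):
    beta = np.linalg.lstsq(X, y, rcond=None)[0]          # step 1: OLS on the original model
    rho = 0.0
    for _ in range(max_iter):
        u = y - X @ beta
        rho_new = (u[1:] @ u[:-1]) / (u[:-1] @ u[:-1])   # step 2: regress u_t on u_{t-1}
        # step 3: quasi-differenced (transformed) regression
        y_star = y[1:] - rho_new * y[:-1]
        X_star = X[1:] - rho_new * X[:-1]
        beta = np.linalg.lstsq(X_star, y_star, rcond=None)[0]
        if abs(rho_new - rho) < tol:
            break
        rho = rho_new
    return beta, rho
```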

18 Correction for Heteroscedasticity (White)
In the case of heteroscedasticity, our general model becomes
y = Xβ + ε,   ε ~ n.d.(0, Ω),
Ω = [ σ²_1   0      ...   0    ]
    [ 0      σ²_2   ...   0    ]
    [ ...    ...    ...   ...  ]
    [ 0      0      ...   σ²_T ]
In this case, to construct the GLS estimator we would need to model the heteroscedasticity by choosing the K matrix appropriately.

19 Correction for Heteroscedasticity (White)
White (1980) proposes a solution based on the observation that, in the case of heteroscedasticity, the variance-covariance matrix of the OLS estimator takes the form:
(X'X)^{-1} X'ΩX (X'X)^{-1},
which can be used for inference once an estimator of Ω is available. The following estimator of Ω is proposed:
Ω̂ = diag(û²_1, û²_2, ..., û²_T),
where the û_t are the OLS residuals.

20 Correction for Heteroscedasticity (White)
This choice of Ω̂ leads to the following degrees-of-freedom-corrected heteroscedasticity-consistent estimator of the parameters' covariance matrix:
Σ̂^W_β̂ = [T/(T - k)] (X'X)^{-1} ( Σ_{t=1}^{T} û²_t x_t x_t' ) (X'X)^{-1}.
This estimator corrects the OLS covariance matrix for the presence of heteroscedasticity in the residuals without modelling it.
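The formula above translates directly into code. The sketch below is illustrative (variable names are assumptions); the result corresponds to what software packages usually label HC1.

```python
import numpy as np

# Degrees-of-freedom-corrected White heteroscedasticity-consistent covariance matrix.
def white_cov(y, X):
    T, k = X.shape
    XtX_inv = np.linalg.inv(X.T @ X)
    beta = XtX_inv @ X.T @ y
    u = y - X @ beta                       # OLS residuals
    meat = (X * u[:, None] ** 2).T @ X     # sum_t u_t^2 x_t x_t'
    return (T / (T - k)) * XtX_inv @ meat @ XtX_inv

# Illustrative usage: robust standard errors
# se = np.sqrt(np.diag(white_cov(y, X)))
```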

21 Correction for heteroscedasticity and serial correlation (Newey-West)
The White covariance matrix assumes serially uncorrelated residuals. Newey and West (1987) have proposed a more general covariance estimator that is robust to heteroscedasticity and autocorrelation of the residuals of unknown form. This HAC (heteroscedasticity and autocorrelation consistent) coefficient covariance estimator is given by:
Σ̂^NW_β̂ = (X'X)^{-1} T Ω̂ (X'X)^{-1},
where Ω̂ is a long-run covariance estimator.

22 Correction for heteroscedasticity and serial correlation (Newey-West)
Ω̂ = Γ̂(0) + Σ_{j=1}^{p} [1 - j/(p + 1)] [Γ̂(j) + Γ̂(-j)],   (22)
Γ̂(j) = (1/T) Σ_{t=j+1}^{T} û_t û_{t-j} x_t x_{t-j}'.
Note that in the absence of serial correlation Ω̂ = Γ̂(0) and we are back to the White estimator. Implementation of the estimator requires a choice of p, the maximum lag at which correlation is still present. The weighting scheme adopted guarantees a positive semidefinite estimated covariance matrix by multiplying the sequence of the Γ̂(j)'s by weights that decrease as j increases.
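A sketch of (22) in code, with Bartlett weights and a user-chosen truncation lag p (names are illustrative; in practice one would typically rely on a library routine with an automatic bandwidth choice).

```python
import numpy as np

# Newey-West HAC covariance estimator for the OLS coefficients, following (22).
def newey_west_cov(y, X, p):
    T, k = X.shape
    XtX_inv = np.linalg.inv(X.T @ X)
    u = y - X @ XtX_inv @ X.T @ y            # OLS residuals
    Xu = X * u[:, None]                      # row t is x_t' * u_t
    Omega = (Xu.T @ Xu) / T                  # Gamma(0)
    for j in range(1, p + 1):
        w = 1.0 - j / (p + 1.0)              # Bartlett weight
        Gamma_j = (Xu[j:].T @ Xu[:-j]) / T   # (1/T) sum_t u_t u_{t-j} x_t x_{t-j}'
        Omega += w * (Gamma_j + Gamma_j.T)
    return XtX_inv @ (T * Omega) @ XtX_inv

# Illustrative usage with a truncation lag of 4:
# se = np.sqrt(np.diag(newey_west_cov(y, X, p=4)))
```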

23 Testing the CAPM
The time-series performance of the CAPM is questioned by the significance of the alpha coefficients (consider, for example, the case of the 25 Fama-French portfolios). We also know that other factors besides the market are significant in explaining excess returns. Fama and MacBeth propose to exploit cross-sectional information to produce a different test of the CAPM.

24 Fama-MacBeth (1973)
Consider the 25 portfolios and run for each of them the CAPM regression over the sample 1962:1-2014:6:
r^i_t - r^f_t = α_i + β_i (r^m_t - r^f_t) + u^i_t.
These regressions deliver 25 betas. Now take a second-step regression in which the cross-section of the average (over the sample 1962:1-2014:6) monthly returns on the 25 portfolios is projected on the 25 estimated betas:
r̄^i = γ_0 + γ_1 β̂_i + u_i.
Under the null of the CAPM, i) the residuals should be randomly distributed around the regression line, and ii) γ_0 = E(r^f), γ_1 = E(r^m - r^f).

25 Fama-MacBeth (1973)

26 Fama-MacBeth (1973)
The cross-sectional regression strongly rejects the CAPM. Note, however, that this regression is affected by an inference problem caused by the cross-sectional correlation of the residuals. Fama and MacBeth (1973) address this problem by estimating month-by-month cross-sectional regressions of monthly returns on the betas estimated over the full sample. The time-series means of the monthly slopes and intercepts, along with the standard errors of those means, are then used to run the appropriate tests.
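The two-pass procedure can be sketched as follows, under simplifying assumptions: full-sample betas from the CAPM time-series regressions, then month-by-month cross-sectional regressions of returns on those betas. Array names and shapes are hypothetical.

```python
import numpy as np

# Fama-MacBeth two-pass procedure.
# rets: (T, N) raw portfolio returns; excess_ret: (T, N) excess returns;
# mkt_excess: (T,) market excess returns.
def fama_macbeth(rets, excess_ret, mkt_excess):
    T, N = excess_ret.shape
    # First pass: full-sample time-series CAPM regressions give one beta per portfolio.
    X = np.column_stack([np.ones(T), mkt_excess])
    betas = np.linalg.lstsq(X, excess_ret, rcond=None)[0][1]        # (N,)
    # Second pass: month-by-month cross-sectional regressions of returns on betas.
    Z = np.column_stack([np.ones(N), betas])
    gammas = np.array([np.linalg.lstsq(Z, rets[t], rcond=None)[0] for t in range(T)])
    means = gammas.mean(axis=0)                    # time-series means of gamma_0, gamma_1
    se = gammas.std(axis=0, ddof=1) / np.sqrt(T)   # standard errors of the means
    return means, se, means / se                   # estimates, standard errors, t-stats
```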

27 Fama-MacBeth (1973)
The application of the Fama-MacBeth procedure to the sample 1962:1-2014:6 delivers results summarized in a table reporting, for γ_0 and γ_1, the mean, the standard deviation, the number of observations, and the t-statistic. An alternative route would be to construct heteroscedasticity and autocorrelation consistent (HAC) estimators.

28 A Multifactor model
The testing procedure can easily be extended to multifactor models, in which the first-stage regression can in general be written as
r^i_t - r^f_t = α_i + β_{1,i} (r^m_t - r^f_t) + Σ_{j=2}^{n} β_{j,i} F_{j,t} + u^i_t,
and the second-stage regression as
r̄^i = γ_0 + γ_1 β̂_{1,i} + Σ_{j=2}^{n} γ_j β̂_{j,i} + u_i.
Under the null of the model, i) the residuals should be randomly distributed around the regression line, and ii) γ_0 = E(r^f), γ_1 = E(r^m - r^f), γ_j = E(F_j) for j = 2, ..., n.
