Heteroskedasticity and Autocorrelation


Lesson 7. Heteroskedasticity and Autocorrelation
Pilar González and Susan Orbe
Dpt. Applied Economics III (Econometrics and Statistics)
OCW 2014

Learning objectives

- To understand the concepts of heteroskedasticity and serial correlation.
- To identify the consequences of the presence of heteroskedasticity and/or serial correlation on the properties of the OLS estimator.
- To identify the consequences of the presence of heteroskedasticity and/or serial correlation on the inference based on the OLS estimator.
- To detect the presence of heteroskedasticity and/or serial correlation.
- To carry out inference robust to the presence of heteroskedasticity and/or serial correlation based on the OLS estimator.

Contents

1. Multiple Regression Model assumptions.
2. Heteroskedasticity. Concept. Consequences. Detection.
3. Autocorrelation. Concept. Consequences. Detection.
4. Inference using the OLS estimator.
5. Task: T7.
6. Exercises: E7.1, E7.2 and E7.3.


Multiple Regression Model assumptions.

A.1. The model in the population can be written as:

   Y_t = β_1 + β_2 X_2t + β_3 X_3t + ... + β_k X_kt + u_t,   t = 1, 2, ..., T

A.2. No perfect collinearity.
A.3. Zero conditional mean: E(u_t | X_2, X_3, ..., X_k) = 0, t = 1, 2, ..., T.
A.4. Homoskedasticity (constant variance): Var(u_t | X_2, X_3, ..., X_k) = σ_u², t = 1, 2, ..., T.
A.5. No autocorrelation: Cov(u_t, u_s | X_2, X_3, ..., X_k) = 0, t ≠ s.
A.6. Normality: the errors u_t are independent of X and identically normally distributed.

Multiple Regression Model assumptions. Matrix form.

A.1. The model in the population can be written as: Y = Xβ + u.
A.2. No perfect collinearity.
A.3. Zero conditional mean: E(u | X) = 0.
A.4. + A.5. Homoskedasticity (constant variance) + no autocorrelation:

   V(u | X) = E(uu' | X) = σ_u² I_T

   (a T×T diagonal matrix with σ_u² on the diagonal and zeros elsewhere).

A.6. Normality.

Multiple Regression Model assumptions.

Under these assumptions, conditional on X:

A. The OLS estimator β̂ is linear, unbiased and efficient in the Gauss-Markov sense.
B. The estimator of the variance of the error term, σ̂_u², is unbiased.
C. β̂ ~ N(β, V(β̂)), where V(β̂) = σ_u² (X'X)⁻¹.
D. Test statistics:

   t = (β̂_j − β_j⁰) / σ̂_β̂j ~ t(T−k) under H0
   F = [(SSR_R − SSR_UR)/q] / [SSR_UR/(T−k)] ~ F(q, T−k) under H0
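As a quick illustration of point D, the sketch below (simulated data with hypothetical coefficient values; statsmodels is assumed to be available) fits a model by OLS and reports the usual t statistics and an F test of two restrictions, which under A.1-A.6 follow t(T−k) and F(q, T−k).

```python
# Minimal sketch of classical OLS inference (hypothetical simulated data).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
T = 100
X = sm.add_constant(rng.normal(size=(T, 2)))            # constant + 2 regressors (k = 3)
y = X @ np.array([1.0, 2.0, 0.0]) + rng.normal(size=T)  # homoskedastic, uncorrelated errors

res = sm.OLS(y, X).fit()
print(res.tvalues)                    # t = (beta_hat_j - 0) / sigma_hat_beta_j
print(res.f_test("x1 = 0, x2 = 0"))   # F(q, T - k) test of q = 2 linear restrictions
```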


Heteroskedasticity. Concept.

Homoskedasticity: the variance of the disturbances u_i is constant for all i, conditional on X.

Heteroskedasticity: the variance of the disturbances u_i is not constant across i, conditional on X, because it depends on one or several variables.

The presence of heteroskedasticity is quite frequent when working with cross-section data. For instance, in a regression model that determines consumption as a function of income, the variance of the error term might be expected to increase as income increases.

Heteroskedasticity. Homoskedastic error term.

[Figure: conditional densities F(Y | X = X_0), ..., F(Y | X = X_4) plotted along the regression line; every density has the same spread, reflecting a constant error variance.]

Heteroskedasticity. Heteroskedastic error term.

[Figure: conditional densities F(Y | X = X_0), ..., F(Y | X = X_4) plotted along the regression line; the spread of the densities changes with X, reflecting a non-constant error variance.]

Heteroskedasticity. Concept.

Heteroskedasticity: V(u_i | X) = σ_i², i = 1, 2, ..., N.

The covariance matrix of the error term conditional on X is the N×N diagonal matrix

   V(u | X) = E(uu' | X) = diag(σ_1², σ_2², ..., σ_N²) = Σ

This covariance matrix will be denoted by V(u).

Heteroskedasticity. Properties of the OLS estimator.

Conditional on X, the OLS estimator is:

- Linear: β̂ = (X'X)⁻¹X'Y = β + (X'X)⁻¹X'u
- Unbiased: E(β̂ | X) = E[β + (X'X)⁻¹X'u | X] = β
- Not efficient: given that the error term is heteroskedastic, assumption A.4 is not satisfied and it is not possible to apply the Gauss-Markov theorem. β̂_OLS does NOT have the smallest variance within the class of linear and unbiased estimators.

It can be proved that there is another estimator with a smaller variance when the structure of the heteroskedasticity is known: the Generalized Least Squares (GLS) estimator, which uses the information about the covariance matrix of u to estimate the coefficients β. The derivation of this estimator is beyond the scope of this book.

Heteroskedasticity. Inference using the OLS estimator, conditional on X.

When the assumptions of the GLRM are satisfied: β̂ ~ N(β, σ_u²(X'X)⁻¹).

When the error term is heteroskedastic, the covariance matrix of the error term is V(u) = Σ. What, then, is the distribution of the OLS estimator?

   β̂ ~ N[β, (X'X)⁻¹X'ΣX(X'X)⁻¹]

Proof:

   V(β̂ | X) = E[(β̂ − E(β̂))(β̂ − E(β̂))' | X] = E[(β̂ − β)(β̂ − β)' | X]
             = E[((X'X)⁻¹X'u)((X'X)⁻¹X'u)' | X] = E[(X'X)⁻¹X'uu'X(X'X)⁻¹ | X]
             = (X'X)⁻¹X'ΣX(X'X)⁻¹
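The following numpy sketch (a hypothetical design with a known, simulated Σ) illustrates the result above: with heteroskedastic errors the true covariance of β̂ is the "sandwich" (X'X)⁻¹X'ΣX(X'X)⁻¹, which differs from what the usual formula σ_u²(X'X)⁻¹ would report.

```python
# Sandwich vs. naive covariance of the OLS estimator under known heteroskedasticity.
import numpy as np

rng = np.random.default_rng(7)
N = 200
X = np.column_stack([np.ones(N), rng.uniform(0, 5, N)])
sigma2_i = (0.5 + X[:, 1]) ** 2          # known error variances, growing with the regressor
Sigma = np.diag(sigma2_i)

XtX_inv = np.linalg.inv(X.T @ X)
V_true = XtX_inv @ X.T @ Sigma @ X @ XtX_inv   # sandwich: the correct covariance
V_naive = sigma2_i.mean() * XtX_inv            # what the usual homoskedastic formula gives

print(np.diag(V_true))    # correct variances of beta_hat
print(np.diag(V_naive))   # biased under heteroskedasticity
```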

Heteroskedasticity. Inference using the OLS estimator, conditional on X.

Hypothesis testing based on the t and F statistics is NOT adequate and can be misleading. The problem is that the usual estimator of V(β̂) is not appropriate because it is BIASED:

   V̂(β̂) = σ̂_u²(X'X)⁻¹

The t and F statistics do not follow the usual Student-t and F-Snedecor distributions:

   t = (β̂_j − β_j⁰) / σ̂_β̂j ~ ??? under H0
   F = [(SSR_R − SSR_UR)/q] / [SSR_UR/(N−k)] ~ ??? under H0

What are the distributions of the t and F statistics?

Heteroskedasticity. Detection.

A. Graphic analysis.

Estimate the model by OLS and compute the residuals. Plot the OLS residuals against the regressors that may be the cause of the heteroskedasticity. The plots below show the behaviour of the error term under the homoskedasticity assumption and the behaviour of a heteroskedastic error term.

[Figure: OLS residuals û_i plotted against X_i. Under homoskedasticity the residuals show a constant spread around zero; under heteroskedasticity the spread changes with X_i.]
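A quick way to produce such a plot is sketched below (hypothetical simulated data; statsmodels and matplotlib are assumed to be available).

```python
# Graphic check: plot OLS residuals against the regressor suspected of
# causing the heteroskedasticity (hypothetical data).
import numpy as np
import statsmodels.api as sm
import matplotlib.pyplot as plt

rng = np.random.default_rng(10)
N = 150
x = rng.uniform(0, 5, N)
y = 1.0 + 2.0 * x + rng.normal(scale=0.5 + x)     # error spread grows with x

resid = sm.OLS(y, sm.add_constant(x)).fit().resid
plt.scatter(x, resid)
plt.axhline(0, color="black", linewidth=1)
plt.xlabel("X_i")
plt.ylabel("OLS residuals")
plt.title("A fan-shaped cloud suggests heteroskedasticity")
plt.show()
```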

Heteroskedasticity. Detection.

B. Heteroskedasticity tests: Goldfeld-Quandt test, Breusch-Pagan test, White test.

Consider the linear regression model:

   Y_i = β_1 + β_2 X_2i + β_3 X_3i + ... + β_k X_ki + u_i,   i = 1, 2, ..., N.

The set-up of the tests to detect heteroskedasticity is always the same:

   H0: Homoskedasticity: V(u_i) = σ_i² = σ² for all i
   Ha: Heteroskedasticity: V(u_i) = σ_i², not constant across i

See Example 7.1 for applications.

Heteroskedasticity. Goldfeld-Quandt test.

This simple test is used when it is believed that only one variable Z, usually one of the regressors, is the cause of the heteroskedasticity. That is, it assumes that the heteroskedasticity takes the form σ_i² = E(u_i²) = h(Z_i), where h is an increasing (or decreasing) function of Z.

Procedure (a code sketch follows below):
1. Identify the variable Z (usually one of the regressors) that is related to σ_i².
2. Sort the observations by Z (increasing/decreasing).
3. Split the sample in three parts: the first N_1 observations, the last N_2 observations, and the central observations, which are deleted. Usually, one third of the total number of observations is deleted.
4. Estimate two regressions by OLS: one using the first N_1 observations and another using the last N_2 observations.
5. Test statistic:

   GQ = [SSR_2/(N_2 − k_2)] / [SSR_1/(N_1 − k_1)] ~ F(N_2 − k_2, N_1 − k_1) under H0

   where SSR_1 and SSR_2 are the sums of squared residuals of the first and second regressions, respectively.
6. Reject the null hypothesis of homoskedasticity at the α% significance level if GQ > F_α(N_2 − k_2, N_1 − k_1).
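A minimal sketch of this procedure with statsmodels is shown below; the data, the sorting variable z and the choice of dropping one third of the observations are hypothetical.

```python
# Goldfeld-Quandt test sketch: sort by z, drop the middle third, compare SSRs.
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_goldfeldquandt

rng = np.random.default_rng(0)
n = 120
z = rng.uniform(1, 10, n)                      # suspected driver of the variance
X = sm.add_constant(np.column_stack([z, rng.normal(size=n)]))
u = rng.normal(scale=0.5 * z)                  # error variance increases with z
y = X @ np.array([1.0, 2.0, -1.0]) + u

order = np.argsort(z)                          # sort observations by z
gq_stat, p_value, _ = het_goldfeldquandt(y[order], X[order], drop=1/3,
                                         alternative="increasing")
print(f"GQ = {gq_stat:.2f}, p-value = {p_value:.4f}")  # small p => reject homoskedasticity
```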

Heteroskedasticity. Breusch-Pagan test.

This test is based on the assumption that the heteroskedasticity takes the form σ_i² = E(u_i²) = h(z_i'α), where z_i = [1, z_1i, ..., z_pi] is a vector of known variables, α = [α_0, α_1, ..., α_p] is a vector of unknown coefficients and h(·) is any function that only takes positive values. The null hypothesis of homoskedasticity is:

   H0: α_1 = α_2 = ... = α_p = 0

Procedure (a code sketch follows below):
1. Estimate the regression model by OLS, compute the residuals û and estimate the variance of the error term as σ̃² = Σ_i û_i²/N.
2. Regress û_i²/σ̃² on z_i by OLS and compute the explained sum of squares (SSE).
3. Test statistic: LM = SSE/2 ~ χ²(p) asymptotically under H0.
4. Reject the null hypothesis of homoskedasticity at the α% significance level if LM = SSE/2 > χ²_α(p).
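The sketch below implements exactly the LM = SSE/2 statistic described above on hypothetical simulated data, taking z_i to be the regressors themselves; note that the ready-made statsmodels routine reports a slightly different (studentized) variant.

```python
# Breusch-Pagan test, LM = SSE/2 version (hypothetical data; z_i = regressors).
import numpy as np
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(1)
n = 200
x = rng.uniform(0, 5, size=(n, 2))
X = sm.add_constant(x)
u = rng.normal(scale=1 + x[:, 0])              # variance depends on the first regressor
y = X @ np.array([1.0, 0.5, -0.3]) + u

resid = sm.OLS(y, X).fit().resid
sigma2 = (resid ** 2).mean()                   # tilde-sigma^2 = sum(u_hat^2)/N
aux = sm.OLS(resid ** 2 / sigma2, X).fit()     # regress scaled squared residuals on z_i
lm = aux.ess / 2                               # LM = SSE/2, asymptotically chi2(p)
p = X.shape[1] - 1
print(f"LM = {lm:.2f}, p-value = {stats.chi2.sf(lm, p):.4f}")

# Note: statsmodels.stats.diagnostic.het_breuschpagan reports the studentized
# (Koenker) variant, N*R^2 of the auxiliary regression, also chi2(p) asymptotically.
```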

Heteroskedasticity. White test.

This is a very flexible test because it is not necessary to make any assumption about the structure of the heteroskedasticity. In this sense, it is said to be a robust test.

Procedure (a code sketch follows below):
1. Estimate the regression model by OLS and compute the residuals û.
2. Run an auxiliary regression: regress the squared residuals from the original regression model on a set of regressors that contains the original regressors, their cross-products and their squares. Compute the coefficient of determination of this auxiliary regression.
3. Test statistic: LM = N·R² ~ χ²(p) asymptotically under H0, where R² is the coefficient of determination of the auxiliary regression, N is the sample size and p is the number of estimated coefficients in the auxiliary regression minus 1.
4. Reject the null hypothesis of homoskedasticity at the α% significance level if N·R² > χ²_α(p).
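A self-contained sketch with statsmodels follows; the simulated data are hypothetical, and het_white builds the auxiliary regression (regressors, squares, cross-products) internally.

```python
# White test sketch via statsmodels (hypothetical data).
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_white

rng = np.random.default_rng(2)
n = 200
x = rng.uniform(0, 5, size=(n, 2))
X = sm.add_constant(x)
y = X @ np.array([1.0, 0.5, -0.3]) + rng.normal(scale=1 + x[:, 0])

resid = sm.OLS(y, X).fit().resid
lm_stat, lm_pval, f_stat, f_pval = het_white(resid, X)   # LM = N*R^2 and its chi2 p-value
print(f"White LM = {lm_stat:.2f}, p-value = {lm_pval:.4f}")
```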


Autocorrelation. Concept.

Autocorrelation: there are linear relationships among the error terms of different observations:

   Cov(u_t, u_s | X) = E(u_t u_s | X) ≠ 0 for some t ≠ s

The presence of autocorrelation in the error term is quite frequent when working with time series data.

The covariance matrix of the error term is the T×T matrix with σ² on the diagonal and the covariances σ_ts off the diagonal:

   V(u | X) = E(uu' | X) =
      | σ²    σ_12  σ_13  ...  σ_1T |
      | σ_21  σ²    σ_23  ...  σ_2T |
      | σ_31  σ_32  σ²    ...  σ_3T |
      | ...   ...   ...   ...  ...  |
      | σ_T1  σ_T2  σ_T3  ...  σ²   |   = Σ

This covariance matrix will be denoted by V(u).

Autocorrelation. Consequences of autocorrelation.

A. Properties of the OLS estimator, conditional on X:

The OLS estimator is linear and unbiased, but it is NOT efficient in the sense of minimum variance, because the Gauss-Markov theorem cannot be applied under autocorrelation. It can be proved that there is another estimator with a smaller variance when the structure of the autocorrelation is known: the Generalized Least Squares (GLS) estimator.

B. Inference using the OLS estimator:

When assumption A.5 is not satisfied and the error term is autocorrelated, the true covariance matrix of the OLS estimator conditional on X is:

   V(β̂) = (X'X)⁻¹X'ΣX(X'X)⁻¹

As a consequence, the true distribution of the t and F statistics used for testing linear restrictions is unknown. Inference based on the usual Student-t and F-Snedecor distributions is not adequate and may lead to wrong conclusions.

Autocorrelation. Detection.

A. Graphic analysis.

Procedure: estimate the regression model by OLS and compute the residuals û. Plot the OLS residuals against time to analyse whether there is any pattern in the evolution of the error term that may suggest the presence of autocorrelation. The plots below show the temporal evolution of the error term when there is:

- Positive autocorrelation: clusters of positive and negative error terms.
- Negative autocorrelation: the error terms alternate in sign (negative, positive, negative, ...).

[Figure: û_t plotted against time t for a positively autocorrelated error term (long runs of same-signed residuals) and a negatively autocorrelated error term (residuals alternating in sign).]

Autocorrelation. Detection.

B. Autocorrelation tests.
1. Durbin-Watson test (only valid for deterministic regressors).
2. Breusch-Godfrey test.

Consider the linear regression model:

   Y_t = β_1 + β_2 X_2t + β_3 X_3t + ... + β_k X_kt + u_t,   t = 1, 2, ..., T.

The set-up of the autocorrelation tests is:

   H0: NO autocorrelation: Cov(u_t, u_s) = E(u_t u_s) = 0, t ≠ s
   Ha: Autocorrelation: Cov(u_t, u_s) = E(u_t u_s) ≠ 0, t ≠ s

It is ALWAYS necessary to specify the autocorrelation structure under the alternative hypothesis. See Example 7.2 for applications.

Autocorrelation. Durbin-Watson test.

Ha: first order autocorrelation in the error term, that is, the error term follows an autoregressive process of order 1:

   AR(1): u_t = ρ u_{t−1} + v_t,   v_t ~ NID(0, σ_v²),   |ρ| < 1

The presence of autocorrelation depends on the value of ρ:

   ρ = 0: no autocorrelation
   ρ > 0: positive autocorrelation
   ρ < 0: negative autocorrelation
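The small sketch below (hypothetical values of ρ and T) simulates AR(1) errors to show the two patterns: clustering of same-signed errors when ρ > 0 and sign alternation when ρ < 0.

```python
# Simulating AR(1) errors u_t = rho*u_{t-1} + v_t (hypothetical parameter values).
import numpy as np

def simulate_ar1(rho, T=200, sigma_v=1.0, seed=0):
    rng = np.random.default_rng(seed)
    u = np.zeros(T)
    v = rng.normal(scale=sigma_v, size=T)
    for t in range(1, T):
        u[t] = rho * u[t - 1] + v[t]
    return u

u_pos = simulate_ar1(rho=0.8)    # long runs of same-signed errors
u_neg = simulate_ar1(rho=-0.8)   # errors tend to alternate in sign
print(np.corrcoef(u_pos[1:], u_pos[:-1])[0, 1])   # close to 0.8
print(np.corrcoef(u_neg[1:], u_neg[:-1])[0, 1])   # close to -0.8
```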

Autocorrelation. Durbin-Watson test. Positive autocorrelation.

Procedure:
1. Hypotheses:
   H0: ρ = 0 (NO autocorrelation)
   Ha: u_t = ρ u_{t−1} + v_t, ρ > 0 (positive autocorrelation)
2. Test statistic:

   DW = Σ_{t=2}^{T} (û_t − û_{t−1})² / Σ_{t=1}^{T} û_t²

   where û_t are the OLS residuals of the regression model; its exact distribution under H0 is unknown. It can be proved that DW ≈ 2(1 − ρ̂), where

   ρ̂ = Σ_{t=2}^{T} û_t û_{t−1} / Σ_{t=2}^{T} û_{t−1}²

Autocorrelation. Durbin-Watson test. Positive autocorrelation.

Therefore:

   ρ̂ = 0      ⇒ DW ≈ 2
   ρ̂ ∈ (0, 1] ⇒ DW ∈ [0, 2)

3. Decision rule: Durbin and Watson did not derive the exact distribution of the test statistic under the null because this distribution depends on the sample. But they estimated critical values, depending on the sample size and the number of regressors of the regression model, for a given level of significance: an upper limit (d_U) and a lower limit (d_L).
   - If DW < d_L, the null hypothesis is rejected: there is evidence in the sample of first order positive autocorrelation.
   - If DW > d_U, the null hypothesis is not rejected: there is no evidence in the sample of first order positive autocorrelation.
   - If d_L < DW < d_U, the test is inconclusive.
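A minimal sketch of the computation (hypothetical simulated data; statsmodels provides the DW statistic, while the bounds d_L and d_U must still be read from the tables) is given below.

```python
# Durbin-Watson statistic and the approximation DW ≈ 2(1 - rho_hat).
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson

rng = np.random.default_rng(3)
T = 150
x = rng.normal(size=T)
u = np.zeros(T)
v = rng.normal(size=T)
for t in range(1, T):                     # AR(1) errors with rho = 0.7
    u[t] = 0.7 * u[t - 1] + v[t]
y = 1.0 + 2.0 * x + u

resid = sm.OLS(y, sm.add_constant(x)).fit().resid
dw = durbin_watson(resid)                 # sum((e_t - e_{t-1})^2) / sum(e_t^2)
rho_hat = resid[1:] @ resid[:-1] / (resid[:-1] @ resid[:-1])
print(f"DW = {dw:.2f}, 2*(1 - rho_hat) = {2 * (1 - rho_hat):.2f}")
# Compare DW with the tabulated bounds d_L and d_U for the given T and k to decide.
```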

Autocorrelation. Durbin-Watson test. Positive autocorrelation.

[Diagram of the DW scale from 0 to 2: reject H0 (evidence of positive autocorrelation) for DW below d_L; uncertainty region between d_L and d_U; do not reject H0 (no autocorrelation) for DW between d_U and 2.]

Autocorrelation. Durbin-Watson test. Negative autocorrelation.

Procedure:
1. Hypotheses:
   H0: ρ = 0 (NO autocorrelation)
   Ha: u_t = ρ u_{t−1} + v_t, ρ < 0 (negative autocorrelation)
2. Test statistic:

   DW = Σ_{t=2}^{T} (û_t − û_{t−1})² / Σ_{t=1}^{T} û_t²

   whose exact distribution under H0 is unknown. It can be proved that DW ≈ 2(1 − ρ̂). Therefore:

   ρ̂ = 0       ⇒ DW ≈ 2
   ρ̂ ∈ [−1, 0) ⇒ DW ∈ (2, 4]

Autocorrelation. Durbin-Watson test. Negative autocorrelation.

3. Decision rule: Durbin and Watson estimated the critical values, depending on the sample size and the number of regressors of the model, for a given significance level; they provide only an upper limit (d_U) and a lower limit (d_L).
   - If DW > 4 − d_L, the null hypothesis is rejected: there is evidence in the sample of first order negative autocorrelation.
   - If 2 < DW < 4 − d_U, the null hypothesis is not rejected: there is no evidence in the sample of first order negative autocorrelation.
   - If 4 − d_U < DW < 4 − d_L, the test is inconclusive.

Autocorrelation. Durbin-Watson test. Negative autocorrelation.

[Diagram of the DW scale from 2 to 4: do not reject H0 (no autocorrelation) for DW between 2 and 4 − d_U; uncertainty region between 4 − d_U and 4 − d_L; reject H0 (evidence of negative autocorrelation) for DW above 4 − d_L.]

Autocorrelation. Durbin-Watson test.

[Diagram of the full DW scale from 0 to 4: reject H0 (evidence of positive autocorrelation) below d_L; uncertainty region between d_L and d_U; do not reject H0 (no autocorrelation) between d_U and 4 − d_U; uncertainty region between 4 − d_U and 4 − d_L; reject H0 (evidence of negative autocorrelation) above 4 − d_L.]

Autocorrelation. Breusch-Godfrey test.

This test may be used to test for the presence of first order autocorrelation, but it is designed to test for autocorrelation of higher orders. Assume that the error term follows an autoregressive model of order q:

   H0: ρ_1 = ρ_2 = ... = ρ_q = 0 (NO autocorrelation)
   Ha: autocorrelation of order q

Procedure (a code sketch follows below):
1. Estimate the regression model by OLS and compute the residuals û.
2. Estimate an auxiliary regression of û_t on X_2t, X_3t, ..., X_kt, û_{t−1}, û_{t−2}, ..., û_{t−q}.
3. Test the overall significance of û_{t−1}, û_{t−2}, ..., û_{t−q} using a statistic based on the Lagrange multiplier:

   LM = T·R² ~ χ²(q) asymptotically under H0

   where R² is the coefficient of determination of the auxiliary regression. Reject the null hypothesis at the α% significance level if LM > χ²_α(q).
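A short sketch with statsmodels follows; the simulated data and the choice of lag order q = 2 are hypothetical.

```python
# Breusch-Godfrey LM = T*R^2 test via statsmodels (hypothetical data, q = 2).
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import acorr_breusch_godfrey

rng = np.random.default_rng(4)
T = 200
x = rng.normal(size=T)
u = np.zeros(T)
v = rng.normal(size=T)
for t in range(1, T):
    u[t] = 0.6 * u[t - 1] + v[t]          # AR(1) errors
y = 0.5 + 1.5 * x + u

res = sm.OLS(y, sm.add_constant(x)).fit()
lm_stat, lm_pval, f_stat, f_pval = acorr_breusch_godfrey(res, nlags=2)
print(f"BG LM = {lm_stat:.2f}, p-value = {lm_pval:.4f}")  # small p => autocorrelation
```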


Heteroskedasticity and Autocorrelation. Inference using the OLS estimator conditional on X.

Is there any procedure to make inference using the OLS estimator?

   β̂ ~ N[β, (X'X)⁻¹X'ΣX(X'X)⁻¹]

Idea: estimate the true covariance matrix of the OLS estimator,

   V(β̂) = (X'X)⁻¹X'ΣX(X'X)⁻¹

This is not an easy task because usually little is known about the structure of the matrix Σ. Some econometricians, notably White (heteroskedasticity) and Newey-West (autocorrelation), have derived estimators of the covariance matrix V(β̂) with good properties in large samples.

Heteroskedasticity and Autocorrelation. Inference using the OLS estimator conditional on X.

Denote by V_R(β̂) the estimator of the true covariance matrix of the OLS estimator of the coefficients β. This estimator is usually called an estimator robust to heteroskedasticity and/or autocorrelation.

A. Hypothesis on a single coefficient. It can be proved that the asymptotic distribution of the t-statistic under the null is:

   t = (β̂_j − β_j⁰) / σ̂_R,β̂j ~ N(0, 1) asymptotically under H0

   where σ̂_R,β̂j is the standard error of the OLS estimator robust to heteroskedasticity and/or autocorrelation. The null hypothesis is rejected at the α% significance level if |t| > N_{α/2}(0, 1).

B. Hypothesis on multiple restrictions. It is necessary to multiply the usual F statistic by the number of restrictions q to obtain a q·F statistic that asymptotically follows a χ²(q) distribution under H0. The null hypothesis is rejected at the α% significance level if q·F > χ²_α(q).
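A minimal sketch of robust inference with statsmodels follows (hypothetical simulated data; the lag length for the Newey-West estimator is an arbitrary choice). "HC1" requests White heteroskedasticity-robust standard errors and "HAC" requests Newey-West standard errors robust to heteroskedasticity and autocorrelation.

```python
# Robust (White / Newey-West) inference based on the OLS estimator.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 300
x = rng.uniform(0, 5, size=(n, 2))
X = sm.add_constant(x)
y = X @ np.array([1.0, 0.5, 0.0]) + rng.normal(scale=1 + x[:, 0])

res_white = sm.OLS(y, X).fit(cov_type="HC1")                        # White covariance
res_nw = sm.OLS(y, X).fit(cov_type="HAC", cov_kwds={"maxlags": 4})  # Newey-West covariance

print(res_white.tvalues)              # robust t-statistics, asymptotically N(0, 1)
# Robust test of several restrictions: a Wald statistic, asymptotically chi2(q),
# which corresponds to q times the robust F statistic described above.
print(res_white.wald_test("x1 = 0, x2 = 0", use_f=False))
```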

