Violation of OLS assumption - Heteroscedasticity


1 Violation of OLS assumption - Heteroscedasticity. What, why, so what and what to do? Lars Forsberg, Uppsala University, Department of Statistics, October 22, 2014

2 Econometrics - Objectives and exam. Violations of assumptions - Heteroscedasticity: What - Explain what is meant by heteroscedasticity. Causes - Account for possible causes of heteroscedastic errors. Consequences - Know what heteroscedasticity does to the OLS estimators (expectation and variance). Detection - Account for an informal way to detect heteroscedasticity (graphing). Detection - Account for two tests for heteroscedasticity (explain the rationale behind them, how they are done, and be able to perform them when given all the relevant numbers).

3 Econometrics - Objectives and exam. Violations of assumptions - Heteroscedasticity: Remedy - Know what to do in the presence of heteroscedasticity. Explain the idea behind Weighted Least Squares. Perform a WLS estimation using scalar algebra (transformation of the regression equation and then OLS on the transformed equation). Interpret the parameters of the original equation after doing WLS.

4 Heteroscedasticity - Questions. Outline: How to spell it? What is heteroscedasticity? Causes - How does heteroscedasticity come about? Consequences - Is it a problem? In what way? When / in what situations?

5 Violation of OLS assumption - Heteroscedasticity - Questions. Outline (cont.): Detection - How do we know if there is a heteroscedasticity problem? (Informal methods, graphs.) Detection - How to test for heteroscedasticity? Remedial measures - What can we do about it?

6 OLS - Assumptions - Violations - Heteroscedasticity - How to spell it: Heteroscedasticity

7 Heteroscedasticity - The word? The assumption here is Homo-scedasticity: Constant / Equal / Same - Variance / Variation / Spread. That is, constant variance of the error term (for the different values of the regressor(s)).

8 Heteroscedasticity - Animal? [figure]

9 Heteroscedasticity - Animal? [figure]

10 Heteroscedasticity - What? To keep it simple, we have the single linear regression, for an arbitrary observation, $Y_i = \beta_1 + \beta_2 X_{2,i} + u_i$. The assumption (for OLS) is homoscedasticity: $\mathrm{Var}(u_i) = \sigma^2$.

11 Heteroscedasticity - What? Recall the definition of variance, $\mathrm{Var}(u_i) = E[u_i - E(u_i)]^2$, and recall the assumption $E(u_i) = 0$.

12 Heteroscedasticity - What? We can write $\mathrm{Var}(u_i) = E[u_i - 0]^2 = E(u_i^2)$. So the assumption can be written $E(u_i^2) = \sigma^2$. Note: no $i$ on the sigma-two.

13 Heteroscedasticity - What? If not homoscedasticity, then we have heteroscedasticity: $E(u_i^2) = \sigma_i^2$, i.e. it depends on $i$, where the $i$ refers to the different $X_i$. Remember the income-consumption example.
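To fix ideas, here is a minimal simulation sketch (assuming Python with numpy; the coefficients and the variance form $\sigma_i^2 = \sigma^2 X_i^2$ are made up for illustration) of errors whose spread grows with the regressor, in the spirit of the income-consumption example:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# "Income" as the regressor; the error spread grows with income,
# i.e. sd(u_i) is proportional to X_i, so Var(u_i) = sigma^2 * X_i^2.
X = rng.uniform(1, 10, n)
sigma = 0.5
u = rng.normal(0.0, sigma * X)        # heteroscedastic errors
Y = 2.0 + 0.8 * X + u                 # "consumption"

# Compare the error spread for small vs. large X.
low, high = X < np.median(X), X >= np.median(X)
print("var(u), low X: ", u[low].var())
print("var(u), high X:", u[high].var())
```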

14 Heteroscedasticity - Why? Why do we have heteroscedasticity? When there are natural constraints on the variables (consumption vs. income). Learning, e.g. number of errors vs. time practicing. Improved data quality, over time or among the X's.

15 Heteroscedasticity - Why? Why do we have heteroscedasticity? (cont.) Outliers. Misspecification of the model. Skewness of variables.

16 Heteroscedasticity - OLS Consequences. Should we worry?

17 Heteroscedasticity - OLS Consequences. Should we worry?

18 Heteroscedasticity - OLS Consequences - Expectation. OLS: In the presence of heteroscedasticity, the OLS estimator is still UNBIASED. That is, it is on average correct: the sampling distribution of the estimator $\hat\beta_j$ is centered around the true value $\beta_j$.

19 Heteroscedasticity - Consequences - Variance. Even though the OLS estimator is unbiased, it no longer has minimum variance. It is NOT BLUE: not Best, in the sense of having minimum variance among Linear Unbiased Estimators. (It is only LUE - note: not standard notation...)

20 Heteroscedasticity - Consequences - Variance - Y-bar. To see what happens to the variance in the presence of heteroscedasticity, let us study the simplest case possible: $Y_i = \beta + u_i$, where $u_i \sim N(0, \sigma_i^2)$. That is, a regression on only a constant.

21 Heteroscedasticity - Consequences - Variance - Y-bar. We know that the OLS estimator of $\beta$, that is $\hat\beta$, will be $\bar{Y}$, and that it is unbiased, i.e. $E(\bar{Y}) = \beta$.

22 Heteroscedasticity - Consequences - Variance - Y-bar. Let's study the variance: $V(\bar{Y}) = E[\bar{Y} - E(\bar{Y})]^2 = E(\bar{Y} - \beta)^2$.

23 Heteroscedasticity - Consequences - Variance - Y-bar. Use the definition of the sample mean: $V(\bar{Y}) = E\left[\frac{1}{n}\sum_{i=1}^{n} Y_i - \beta\right]^2$. Because $Y_i = \beta + u_i$, we have $V(\bar{Y}) = E\left[\frac{1}{n}\sum_{i=1}^{n}(\beta + u_i) - \beta\right]^2$.

24 Heteroscedasticity - Consequences - Variance - Y-bar. (Again) $V(\bar{Y}) = E\left[\frac{1}{n}\sum_{i=1}^{n}(\beta + u_i) - \beta\right]^2$. Split up the sum, and since $\beta$ is a constant, $\sum_{i=1}^{n}\beta = n\beta$, we have $V(\bar{Y}) = E\left[\frac{1}{n}\left(n\beta + \sum_{i=1}^{n} u_i\right) - \beta\right]^2$.

25 Heteroscedasticity - Consequences - Variance - Y-bar. Multiply in $\frac{1}{n}$ to get $V(\bar{Y}) = E\left[\frac{n\beta}{n} + \frac{\sum_{i=1}^{n} u_i}{n} - \beta\right]^2 = E\left[\beta + \frac{\sum_{i=1}^{n} u_i}{n} - \beta\right]^2$.

26 Heteroscedasticity - Consequences - Variance - Y-bar. (Again) $V(\bar{Y}) = E\left[\beta + \frac{\sum_{i=1}^{n} u_i}{n} - \beta\right]^2 = E\left[\frac{\sum_{i=1}^{n} u_i}{n}\right]^2$.

27 Heteroscedasticity - Consequences - Variance - Y-bar. Take the constant $\frac{1}{n}$ outside the square (and thus square it); we have $V(\bar{Y}) = \left(\frac{1}{n}\right)^2 E\left[\sum_{i=1}^{n} u_i\right]^2$.

28 Heteroscedasticity - Consequences - Variance - Y-bar. Studying $\left[\sum_{i=1}^{n} u_i\right]^2$ again, we have the situation $(a + b + c)^2 = a^2 + b^2 + c^2 + 2ab + 2ac + 2bc$, but now with not three but $n$ terms: $\left[\sum_{i=1}^{n} u_i\right]^2 = [u_1 + u_2 + \ldots + u_n]^2 = u_1^2 + u_2^2 + \ldots + u_n^2 + 2u_1u_2 + 2u_1u_3 + \ldots$

29 Heteroscedasticity - Consequences - Variance - Y-bar. Again, $V(\bar{Y}) = \left(\frac{1}{n}\right)^2 E\left[\sum_{i=1}^{n} u_i\right]^2 = \frac{1}{n^2} E\left[\sum_{i=1}^{n} u_i^2 + 2\sum_{i=2}^{n}\sum_{j=1}^{i-1} u_i u_j\right]$.

30 Heteroscedasticity - Consequences - Variance - Y-bar. It can be instructive to study $2\sum_{i=2}^{n}\sum_{j=1}^{i-1} u_i u_j = 2u_2u_1 + 2u_3u_1 + 2u_3u_2 + 2u_4u_1 + 2u_4u_2 + 2u_4u_3 + 2u_5u_1 + \ldots$

31 Heteroscedasticity - Consequences - Variance - Y-bar. Split up the expectation (again): $V(\bar{Y}) = \frac{1}{n^2} E\left[\sum_{i=1}^{n} u_i^2 + 2\sum_{i=2}^{n}\sum_{j=1}^{i-1} u_i u_j\right] = \frac{1}{n^2}\left[\sum_{i=1}^{n} E(u_i^2) + 2\sum_{i=2}^{n}\sum_{j=1}^{i-1} E(u_i u_j)\right]$.

32 Heteroscedasticity - Consequences - Variance - Y-bar. Again study $\sum_{i=2}^{n}\sum_{j=1}^{i-1} E(u_i u_j)$. We assume that $E(u_i u_j) = 0$ for $i \neq j$; that is, we have...?

33 Heteroscedasticity - Consequences - Variance - Y-bar. So (again) $V(\bar{Y}) = \frac{1}{n^2}\left[\sum_{i=1}^{n} E(u_i^2) + 2\sum_{i=2}^{n}\sum_{j=1}^{i-1} E(u_i u_j)\right] = \frac{1}{n^2}\left[\sum_{i=1}^{n} E(u_i^2) + 2\sum_{i=2}^{n}\sum_{j=1}^{i-1} 0\right]$.

34 Heteroscedasticity - Consequences - Variance - Y-bar. That is, $V(\bar{Y}) = \frac{1}{n^2}\left[\sum_{i=1}^{n} E(u_i^2) + 0\right] = \frac{1}{n^2}\sum_{i=1}^{n} E(u_i^2)$.

35 Heteroscedasticity - Consequences - Variance - Y-bar. Since now, under heteroscedasticity, $E(u_i^2) = \sigma_i^2$, again $V(\bar{Y}) = \frac{1}{n^2}\sum_{i=1}^{n} E(u_i^2) = \frac{1}{n^2}\sum_{i=1}^{n} \sigma_i^2$.

36 Heteroscedasticity - Consequences - Variance - Y-bar. So in the model $Y_i = \beta + u_i$, the variance of the sample mean (the OLS estimator) is given by $V(\bar{Y}) = V(\hat\beta) = \frac{\sum_{i=1}^{n} \sigma_i^2}{n^2}$.
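This result can be checked by simulation. A minimal sketch (assuming Python with numpy, with an arbitrary made-up set of $\sigma_i$ values) that compares the Monte Carlo variance of $\bar{Y}$ with $\sum_{i=1}^{n}\sigma_i^2 / n^2$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 50, 200_000
beta = 3.0

# Arbitrary (assumed) heteroscedastic standard deviations sigma_i.
sigma_i = np.linspace(0.5, 4.0, n)

# Simulate Y_i = beta + u_i with Var(u_i) = sigma_i^2 and record Ybar each replication.
u = rng.normal(0.0, sigma_i, size=(reps, n))
Ybar = (beta + u).mean(axis=1)

print("Monte Carlo Var(Ybar):     ", Ybar.var())
print("Formula sum(sigma_i^2)/n^2:", (sigma_i**2).sum() / n**2)
```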

37 Heteroscedasticity - Consequences - Variance - Y-bar. Note that if $\sigma_i^2$ were the same for all $i$, that is, if we had homoscedasticity, $E(u_i^2) = \sigma^2$, then $V(\bar{Y}) = \frac{1}{n^2}\sum_{i=1}^{n} E(u_i^2) = \frac{1}{n^2}\sum_{i=1}^{n} \sigma^2$.

38 Heteroscedasticity - Consequences - Variance - Y-bar. And we know that, for any constant $c$, $\sum_{i=1}^{n} c = n \cdot c$.

39 Heteroscedasticity - Consequences - Variance - Y-bar. So we have $V(\bar{Y}) = \frac{1}{n^2}\sum_{i=1}^{n} \sigma^2 = \frac{1}{n^2}\, n\sigma^2 = \frac{n\sigma^2}{n^2} = \frac{\sigma^2}{n}$, which is the "usual" variance of the sample mean.

40 Heteroscedasticity - Consequences - Variance - Slope est. Now, back to the (simple linear) regression model $Y_i = \beta_1 + \beta_2 X_i + u_i$. For the OLS estimator of $\beta_2$: $V(\hat\beta_2) = E\left[\hat\beta_2 - E(\hat\beta_2)\right]^2$.

41 Heteroscedasticity - Consequences - Variance - Slope est. Recall $\hat\beta_2 = \beta_2 + \frac{\sum_{i=1}^{n}(X_i - \bar{X}) u_i}{\sum (X_i - \bar{X})^2}$ and $E(\hat\beta_2) = \beta_2$.

42 Heteroscedasticity - Consequences - Variance - Slope est. So $V(\hat\beta_2) = E\left[\hat\beta_2 - E(\hat\beta_2)\right]^2 = E\left[\left(\beta_2 + \frac{\sum_{i=1}^{n}(X_i - \bar{X}) u_i}{\sum (X_i - \bar{X})^2}\right) - \beta_2\right]^2 = E\left[\frac{\sum_{i=1}^{n}(X_i - \bar{X}) u_i}{\sum (X_i - \bar{X})^2}\right]^2$.

43 Heteroscedasticity - Consequences - Variance - Slope est. $V(\hat\beta_2) = E\left[\frac{\sum_{i=1}^{n}(X_i - \bar{X}) u_i}{\sum (X_i - \bar{X})^2}\right]^2 = \frac{E\left[\left(\sum_{i=1}^{n}(X_i - \bar{X}) u_i\right)^2\right]}{\left[\sum (X_i - \bar{X})^2\right]^2}$.

44 Heteroscedasticity - Consequences - Variance - Slope est. Numerator $= E\left[\sum_{i=1}^{n}(X_i - \bar{X}) u_i\right]^2 = E\left[\sum_{i=1}^{n}\sum_{j=1}^{n}(X_i - \bar{X})(X_j - \bar{X}) u_i u_j\right]$.

45 Heteroscedasticity - Consequences - Variance - Slope est. Study the numerator (Num.): $\text{Num.} = E\left[\sum_{i=1}^{n}(X_i - \bar{X}) u_i\right]^2 = E\left[\sum_{i=1}^{n}\sum_{j=1}^{n}(X_i - \bar{X})(X_j - \bar{X}) u_i u_j\right] = E\left[\sum_{i=1}^{n}(X_i - \bar{X})^2 u_i^2 + 2\sum_{i=2}^{n}\sum_{j=1}^{i-1}(X_i - \bar{X})(X_j - \bar{X}) u_i u_j\right]$.

46 Heteroscedasticity - Consequences - Variance - Slope est. Split up the expectation: $\text{Num.} = E\left[\sum_{i=1}^{n}(X_i - \bar{X})^2 u_i^2\right] + E\left[2\sum_{i=2}^{n}\sum_{j=1}^{i-1}(X_i - \bar{X})(X_j - \bar{X}) u_i u_j\right]$.

47 Heteroscedasticity - Consequences - Variance - Slope est. The $X_i$'s are constants, so we can move them out of the expectation (again): $\text{Num.} = \sum_{i=1}^{n}(X_i - \bar{X})^2 E(u_i^2) + 2\sum_{i=2}^{n}\sum_{j=1}^{i-1}(X_i - \bar{X})(X_j - \bar{X}) E(u_i u_j)$.

48 Heteroscedasticity - Consequences - Variance - Slope est. Recall that we assume (what?) $E(u_i u_j) = 0$ for $i \neq j$, so $\text{Num.} = \sum_{i=1}^{n}(X_i - \bar{X})^2 E(u_i^2) + 2\sum_{i=2}^{n}\sum_{j=1}^{i-1}(X_i - \bar{X})(X_j - \bar{X}) \cdot 0 = \sum_{i=1}^{n}(X_i - \bar{X})^2 E(u_i^2) + 0$.

49 Heteroscedasticity - Consequences - Variance - Slope est. And now, under heteroscedasticity, $E(u_i^2) = \sigma_i^2$, so $\text{Num.} = \sum_{i=1}^{n}(X_i - \bar{X})^2 E(u_i^2) = \sum_{i=1}^{n}(X_i - \bar{X})^2 \sigma_i^2$.

50 Heteroscedasticity - Consequences - Variance - Slope est. That was the numerator of the variance expression. Recall $V(\hat\beta_2) = \frac{E\left[\sum_{i=1}^{n}\sum_{j=1}^{n}(X_i - \bar{X})(X_j - \bar{X}) u_i u_j\right]}{\left[\sum (X_i - \bar{X})^2\right]^2}$; we just showed that $E\left[\sum_{i=1}^{n}\sum_{j=1}^{n}(X_i - \bar{X})(X_j - \bar{X}) u_i u_j\right] = \sum_{i=1}^{n}(X_i - \bar{X})^2 \sigma_i^2$.

51 Heteroscedasticity - Consequences - Variance - Slope est. Putting it together we get $V(\hat\beta_2) = \frac{\sum_{i=1}^{n}(X_i - \bar{X})^2 \sigma_i^2}{\left[\sum (X_i - \bar{X})^2\right]^2}$. (Why can we not just move the $\sigma_i^2$ outside the summation?)

52 Heteroscedasticity - Consequences - Variance - Slope est. So the (correct) variance of the OLS estimator for $\beta_2$ in the presence of heteroscedasticity is given by $V(\hat\beta_2) = \frac{\sum_{i=1}^{n}(X_i - \bar{X})^2 \sigma_i^2}{\left[\sum (X_i - \bar{X})^2\right]^2}$.
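This slope-variance formula can also be verified numerically. A minimal sketch (assuming Python with numpy; the regressor values and the form $\sigma_i \propto X_i$ are illustrative assumptions) that compares the Monte Carlo variance of $\hat\beta_2$ with the formula, and with the "usual" homoscedastic formula:

```python
import numpy as np

rng = np.random.default_rng(2)
n, reps = 40, 100_000
beta1, beta2 = 1.0, 2.0

X = np.linspace(1, 10, n)
sigma_i = 0.3 * X                       # assumed heteroscedasticity: sd proportional to X

xc = X - X.mean()
den = (xc**2).sum()

# OLS slope in each replication: beta2_hat = sum((X - Xbar) * Y) / sum((X - Xbar)^2)
u = rng.normal(0.0, sigma_i, size=(reps, n))
Y = beta1 + beta2 * X + u
b2 = (Y * xc).sum(axis=1) / den

print("Monte Carlo Var(beta2_hat):          ", b2.var())
print("Heteroscedastic formula:             ", ((xc**2) * sigma_i**2).sum() / den**2)
print("'Usual' formula with average sigma^2:", (sigma_i**2).mean() / den)
```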

53 Heteroscedasticity - Consequences - Variance - Slope est. Note that if $\sigma_i^2 = \sigma^2$ for all $i$, that is, if we had homoscedasticity, we have $V(\hat\beta_2) = \frac{\sum_{i=1}^{n}(X_i - \bar{X})^2 \sigma_i^2}{\left[\sum (X_i - \bar{X})^2\right]^2} = \frac{\sigma^2 \sum_{i=1}^{n}(X_i - \bar{X})^2}{\left[\sum (X_i - \bar{X})^2\right]^2}$.

54 Heteroscedasticity - Consequences - Variance - Slope est. Recall here that $\frac{a}{a^2} = \frac{1}{a}$, with $a = \sum (X_i - \bar{X})^2$.

55 Heteroscedasticity - Consequences - Variance - Slope est. So $V(\hat\beta_2) = \frac{\sigma^2 \sum (X_i - \bar{X})^2}{\left[\sum (X_i - \bar{X})^2\right]^2} = \frac{\sigma^2}{\sum (X_i - \bar{X})^2}$, i.e. the "usual" variance of the estimator.

56 Heteroscedasticity - Consequences - Variance - Slope est. Compare the two variance expressions. (A) Homoscedasticity: $V(\hat\beta_2) = \frac{\sigma^2}{\sum (X_i - \bar{X})^2}$. (B) Heteroscedasticity: $V(\hat\beta_2) = \frac{\sum_{i=1}^{n}(X_i - \bar{X})^2 \sigma_i^2}{\left[\sum (X_i - \bar{X})^2\right]^2}$.

57 Heteroscedasticity - Consequences - Variance - Slope est. Now, unlike in the case of multicollinearity, which always inflates the variance, here there is no general result. So if we use $V(\hat\beta_2) = \frac{\sigma^2}{\sum (X_i - \bar{X})^2}$ when we should use $V(\hat\beta_2) = \frac{\sum_{i=1}^{n}(X_i - \bar{X})^2 \sigma_i^2}{\left[\sum (X_i - \bar{X})^2\right]^2}$, do we under- or over-estimate the true variance?

58 Heteroscedasticity - Consequences - Variance - Slope est. That is, is there a way of knowing whether $\frac{\sum_{i=1}^{n}(X_i - \bar{X})^2 \sigma_i^2}{\left[\sum (X_i - \bar{X})^2\right]^2} > \frac{\sigma^2}{\sum (X_i - \bar{X})^2}$ or $\frac{\sum_{i=1}^{n}(X_i - \bar{X})^2 \sigma_i^2}{\left[\sum (X_i - \bar{X})^2\right]^2} < \frac{\sigma^2}{\sum (X_i - \bar{X})^2}$?

59 Heteroscedasticity - Consequences - Variance - Slope est. If we mistakenly ignore heteroscedasticity, what happens? For the following reasoning, introduce the (temporary) notation $V_{\mathrm{TRUE}}(\hat\beta_2) = \frac{\sum_{i=1}^{n}(X_i - \bar{X})^2 \sigma_i^2}{\left[\sum (X_i - \bar{X})^2\right]^2}$ and $V_{\mathrm{FALSE}}(\hat\beta_2) = \frac{\sigma^2}{\sum (X_i - \bar{X})^2}$.

60 Heteroscedasticity - Consequences - Variance - Slope est. So, in testing the significance of the individual parameters, $H_0: \beta_j = 0$ vs. $H_1: \beta_j \neq 0$, we use $z_{obs} = \frac{\hat\beta_j - 0}{\sigma_{\hat\beta_j}}$, where $\sigma_{\hat\beta_j} = \sqrt{\frac{\sigma^2}{\sum (X_i - \bar{X})^2}}$.

61 Heteroscedasticity - Consequences - Variance - Slope est. Note that here we are dealing with population quantities (pretending to know $\sigma^2$), thus we use $z_{obs} = \frac{\hat\beta_j - 0}{\sigma_{\hat\beta_j}}$, since we assume that the error term is normal, and the estimator is a linear estimator, thus also normal.

62 Heteroscedasticity - Consequences - Variance - Slope est. Of course, in practice we do not know $\sigma^2$ and need to estimate it by $\hat\sigma^2 = \frac{\sum \hat u_i^2}{n - k}$, where, of course, $\hat u_i = Y_i - (\hat\beta_1 + \hat\beta_2 X_{i,2})$, and $k$ is the number of estimated coefficients in the regression equation, in this case 2. Then we use $t_{obs} = \frac{\hat\beta_j - 0}{\hat\sigma_{\hat\beta_j}} \sim t_{n-k}$.

63 Heteroscedasticity - Consequences - Variance - Slope est. If we UNDER-estimate the true variance, that is, $V_{\text{used but FALSE}}(\hat\beta_2) < V_{\mathrm{TRUE}}(\hat\beta_2)$: $\sigma_{\hat\beta_j}$ too SMALL $\Rightarrow$ $\frac{\hat\beta_j}{\sigma_{\hat\beta_j}}$ too BIG $\Rightarrow$ reject $H_0: \beta_j = 0$ too often $\Rightarrow$ significance too often $\Rightarrow$ we think the model is "better" than it actually is.

64 Heteroscedasticity - Consequences - Variance - Slope est. On the other hand, if we OVER-estimate the true variance, that is, $V_{\text{used but FALSE}}(\hat\beta_2) > V_{\mathrm{TRUE}}(\hat\beta_2)$: $\sigma_{\hat\beta_j}$ too BIG $\Rightarrow$ $\frac{\hat\beta_j}{\sigma_{\hat\beta_j}}$ too SMALL $\Rightarrow$ never reject $H_0: \beta_j = 0$ $\Rightarrow$ never significance $\Rightarrow$ we think the model is "worse" than it actually is.

65 Heteroscedasticity - Consequences - Variance - Slope est. In practice, unfortunately, there is no way of knowing if we under- or over-estimate the variance when ignoring heteroscedasticity. One would need to be careful in the analysis, and/or use White's heteroscedasticity-robust standard errors.
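As an illustration of the robust-standard-error route, here is a minimal sketch assuming Python with statsmodels (the simulated data and coefficients are made up; cov_type="HC1" requests a White-type heteroscedasticity-consistent covariance):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 200
X = rng.uniform(1, 10, n)
Y = 1.0 + 2.0 * X + rng.normal(0.0, 0.5 * X)   # heteroscedastic errors

exog = sm.add_constant(X)

usual  = sm.OLS(Y, exog).fit()                 # conventional standard errors
robust = sm.OLS(Y, exog).fit(cov_type="HC1")   # White-type robust standard errors

print("usual SEs: ", usual.bse)
print("robust SEs:", robust.bse)
```

The point estimates are identical in the two fits; only the standard errors (and hence the t statistics) differ.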

66 Heteroscedasticity - Detection - Graphs. Quick and dirty preliminary analysis: just plot Y vs. X.
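A sketch of this quick-and-dirty check (assuming Python with numpy/matplotlib and simulated data): plot $Y$ against $X$, and after fitting a line, plot the residuals against $X$ and look for a fanning-out pattern.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(4)
X = rng.uniform(1, 10, 300)
Y = 1.0 + 2.0 * X + rng.normal(0.0, 0.5 * X)   # spread grows with X

# Fit a straight line by least squares and compute the residuals.
slope, intercept = np.polyfit(X, Y, 1)
resid = Y - (intercept + slope * X)

fig, axes = plt.subplots(1, 2, figsize=(10, 4))
axes[0].scatter(X, Y, s=10)
axes[0].set_title("Y vs X")
axes[1].scatter(X, resid, s=10)
axes[1].set_title("residuals vs X")
plt.show()
```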

67 Heteroscedasticity - Detection - Tests. Some tests of heteroscedasticity, outline: Park test, Glejser test, Goldfeld-Quandt test (GQ test), Breusch-Pagan-Godfrey test (BPG test), White's test.

68 Heteroscedasticity - Detection - Tests - Park test. Park test: assume $\sigma_i^2 = \sigma^2 X_{2,i}^{\beta_2} e^{v_i}$ and run $\ln \hat u_i^2 = \beta_1 + \beta_2 \ln(X_{2,i}) + v_i$.
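A minimal sketch of the Park test as described above (assuming Python with statsmodels and simulated data): take the residuals from the original OLS fit, then regress $\ln \hat u_i^2$ on $\ln X_i$ and look at the t test on the slope.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
X = rng.uniform(1, 10, 300)
Y = 1.0 + 2.0 * X + rng.normal(0.0, 0.5 * X)

# Step 1: original OLS regression, keep the residuals.
uhat = sm.OLS(Y, sm.add_constant(X)).fit().resid

# Step 2: Park auxiliary regression  ln(uhat^2) = b1 + b2 ln(X) + v.
aux = sm.OLS(np.log(uhat**2), sm.add_constant(np.log(X))).fit()
print("slope:", aux.params[1], " t:", aux.tvalues[1], " p:", aux.pvalues[1])
```

A clearly significant slope in the auxiliary regression is taken as evidence of heteroscedasticity.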

69 Heteroscedasticity - Detection - Tests - Glejser test. Glejser test: $|\hat u_i| = \beta_1 + \beta_2 X_{2,i} + v_i$, or other functions of $X_2$.
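The Glejser test follows the same pattern, now regressing $|\hat u_i|$ on $X$ (or some other function of $X$). A minimal sketch under the same assumptions as above:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
X = rng.uniform(1, 10, 300)
Y = 1.0 + 2.0 * X + rng.normal(0.0, 0.5 * X)

uhat = sm.OLS(Y, sm.add_constant(X)).fit().resid

# Glejser auxiliary regression: |uhat| on X (alternatives: sqrt(X), 1/X, ...).
aux = sm.OLS(np.abs(uhat), sm.add_constant(X)).fit()
print("slope:", aux.params[1], " p-value:", aux.pvalues[1])
```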

70 Heteroscedasticity - Detection - Tests - BPG test. Breusch-Pagan-Godfrey (BPG) test. Idea: regress $\frac{\hat u_i^2}{\hat\sigma^2} = \beta_1 + \beta_2 X_{2,i} + \beta_3 X_{3,i} + e_i$. Test statistic: $\frac{\mathrm{ExplSS}}{2} \sim \chi^2$.
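A sketch of the BPG test in practice (assuming Python with statsmodels, whose het_breuschpagan returns the LM statistic, its p-value, and an F version; the data are simulated):

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_breuschpagan

rng = np.random.default_rng(7)
n = 300
X2, X3 = rng.uniform(1, 10, n), rng.uniform(0, 5, n)
Y = 1.0 + 2.0 * X2 + 0.5 * X3 + rng.normal(0.0, 0.4 * X2)

exog = sm.add_constant(np.column_stack([X2, X3]))
fit = sm.OLS(Y, exog).fit()

lm, lm_pval, f_stat, f_pval = het_breuschpagan(fit.resid, exog)
print("BPG LM statistic:", lm, " p-value:", lm_pval)
```

The same statistic can also be built by hand as ExplSS/2 from the auxiliary regression of $\hat u_i^2/\hat\sigma^2$ on the regressors.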

71 Heteroscedasticity - Detection - Tests - White's test. Run a regression of $f(\hat u)$ on $g(X)$. White's test, using squared regressors: $\hat u_i^2 = \beta_1 + \beta_2 X_2 + \beta_3 X_2^2 + \beta_4 X_3 + \beta_5 X_3^2 + e$. What problem could we encounter running this regression?

72 Heteroscedasticity - Detection - Tests - White's test. To see if the residual variance increases with increasing values of the regressors, we do an F test of $H_0: \beta_j = 0,\ j > 1$ against $H_1:$ at least one $\beta_j \neq 0,\ j = 2, \ldots, k$.

73 Heteroscedasticity - Detection - Tests - White's test. We can include cross terms in the regression: $\hat u_i^2 = \beta_1 + \beta_2 X_2 + \beta_3 X_2^2 + \beta_4 X_3 + \beta_5 X_3^2 + \beta_6 X_2 X_3 + e$. Again perform an F test of $H_0: \beta_j = 0,\ j > 1$ against $H_1:$ at least one $\beta_j \neq 0,\ j = 2, \ldots, k$.
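A sketch of White's test in practice (assuming Python with statsmodels; het_white constructs the auxiliary regression with squares and cross terms itself and reports both the LM and the F version):

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_white

rng = np.random.default_rng(8)
n = 300
X2, X3 = rng.uniform(1, 10, n), rng.uniform(0, 5, n)
Y = 1.0 + 2.0 * X2 + 0.5 * X3 + rng.normal(0.0, 0.4 * X2)

exog = sm.add_constant(np.column_stack([X2, X3]))
fit = sm.OLS(Y, exog).fit()

lm, lm_pval, f_stat, f_pval = het_white(fit.resid, exog)
print("White LM:", lm, " p:", lm_pval, "   F:", f_stat, " p:", f_pval)
```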

74 Heteroscedasticity - Detection - Tests - White's test. Note: with many regressors, which regressors to include in the auxiliary regression? What functional form? Which regressors to square? What, if any, cross products to include?

75 Heteroscedasticity - Detection - Tests - Goldfeld-Quandt. Goldfeld-Quandt (GQ) test: suppose $\sigma_i^2 = \sigma^2 X_i^2$ in $Y_i = \beta_1 + \beta_2 X_i + u_i$. 1. Divide the sample into two parts with respect to $X$. 2. Estimate the model on each part. 3. Test using $\frac{RSS_1/df_1}{RSS_2/df_2}$.
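A sketch of the GQ test in practice (assuming Python with statsmodels; het_goldfeldquandt sorts the sample by the chosen column, fits OLS on each part, and compares RSS/df between the parts with an F test):

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_goldfeldquandt

rng = np.random.default_rng(9)
n = 300
X = rng.uniform(1, 10, n)
Y = 1.0 + 2.0 * X + rng.normal(0.0, 0.4 * X)

exog = sm.add_constant(X)

# idx=1 points at the X column (column 0 is the constant).
F, pval, order = het_goldfeldquandt(Y, exog, idx=1, alternative="increasing")
print("GQ F statistic:", F, " p-value:", pval)
```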

76 Heteroscedasticity - Detection - Tests - Properties - Size. Sample sizes 10 and 50: White, White + cross terms, BPG, F and $\chi^2$ versions. [table/figure]

77 Heteroscedasticity - Detection - Tests - Properties - Power. Power comparison. [figure]

78 Heteroscedasticity - Remedy. OK, we have heteroscedasticity, what can we do? White's consistent estimator. WLS (Weighted Least Squares).

79 Heteroscedasticity - Remedy - White's heteroscedasticity-consistent estimator. Idea: use $\hat u_i^2$ as a proxy for $\sigma_i^2$: $\widehat{V}(\hat\beta_2) = \frac{\sum_{i=1}^{n}(X_i - \bar{X})^2 \hat u_i^2}{\left[\sum (X_i - \bar{X})^2\right]^2}$. Use this estimate of the variance instead of the usual one.
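A sketch of this idea in the simple-regression case (assuming Python with statsmodels and simulated data): plug $\hat u_i^2$ in for $\sigma_i^2$ in the variance formula above, and compare with the slope element of statsmodels' HC0 covariance, which is the same estimator.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(10)
n = 200
X = rng.uniform(1, 10, n)
Y = 1.0 + 2.0 * X + rng.normal(0.0, 0.5 * X)

exog = sm.add_constant(X)
uhat = sm.OLS(Y, exog).fit().resid

# White's idea "by hand" for the slope: uhat_i^2 as a proxy for sigma_i^2.
xc = X - X.mean()
var_white = ((xc**2) * uhat**2).sum() / (xc**2).sum()**2

# The same estimator via statsmodels (HC0 covariance, slope element).
fit_hc0 = sm.OLS(Y, exog).fit(cov_type="HC0")
print("by hand:    ", var_white)
print("statsmodels:", fit_hc0.cov_params()[1, 1])
```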

80 Heteroscedasticity - Remedy - WLS. Instead of OLS, we should use Weighted Least Squares (WLS). Without matrix algebra notation it is hard to show exactly what is going on. How can we "do" WLS without matrix algebra?

81 Heteroscedasticity - Remedy - WLS. Assume a functional form for $E(u_i^2) = \sigma_i^2$. We usually assume that the increasing variance of $u_i$ is proportional to some function of (some) $X$.

82 Heteroscedasticity - Remedy - WLS. Commonly used (assumed, mathematically convenient) functional forms: $\sigma_i^2 = \sigma^2 X_i^2$, $\sigma_i^2 = \sigma^2 |X_i|$, $\sigma_i^2 = \sigma^2 \sqrt{X_i}$.

83 Heteroscedasticity - Remedy - WLS. In general, if we have a variable $u_i$ with variance $V(u_i) = c\sigma^2$, what transformation can we apply to $u_i$, that is, how can we standardize $u_i$ to get a standardized variable, say $u_i^*$, with variance $\sigma^2$?

84 Heteroscedasticity - Remedy - WLS. Hint: recall $V(aY) = a^2 V(Y)$.

85 Heteroscedasticity - Remedy - WLS. We then try $u_i^* = \frac{u_i}{\sqrt{c}}$: $V(u_i^*) = V\left(\frac{u_i}{\sqrt{c}}\right) = \left(\frac{1}{\sqrt{c}}\right)^2 V(u_i) = \frac{1}{c} V(u_i) = \frac{V(u_i)}{c}$.

86 Heteroscedasticity - Remedy - WLS. Recall $V(u_i) = c\sigma^2$, so (again) $V(u_i^*) = \frac{V(u_i)}{c} = \frac{c\sigma^2}{c} = \sigma^2$.

87 Heteroscedasticity - Remedy - WLS. Back to the regression. How to implement it: we start with $Y_i = \beta_1 + \beta_2 X_i + u_i$, assuming $\sigma_i^2 = \sigma^2 |X_i|$.

88 Heteroscedasticity - Remedy - WLS. Do the transformation, that is, divide by $\sqrt{|X_i|}$ to get $\frac{Y_i}{\sqrt{|X_i|}} = \beta_1 \frac{1}{\sqrt{|X_i|}} + \beta_2 \frac{X_i}{\sqrt{|X_i|}} + \frac{u_i}{\sqrt{|X_i|}}$, i.e. $Y_i^* = \beta_1 X_{1i}^* + \beta_2 X_{2i}^* + u_i^*$. Note: no constant in the "new", transformed regression line.

89 Heteroscedasticity - Remedy - WLS. (Again) $Y_i^* = \beta_1 X_{1i}^* + \beta_2 X_{2i}^* + u_i^*$. Study the "transformed" error term $u_i^* = \frac{u_i}{\sqrt{|X_i|}}$.

90 Heteroscedasticity - Remedy - WLS. Expectation: $E(u_i^*) = E\left(\frac{u_i}{\sqrt{|X_i|}}\right) = \frac{1}{\sqrt{|X_i|}} E(u_i) = \frac{1}{\sqrt{|X_i|}} \cdot 0 = 0$.

91 Heteroscedasticity - Remedy - WLS. Study the variance of $u_i^*$: $\mathrm{Var}(u_i^*) = E\left(u_i^* - E(u_i^*)\right)^2 = E(u_i^* - 0)^2 = E(u_i^*)^2$.

92 Heteroscedasticity - Remedy - WLS. Focus on $E(u_i^*)^2$. Replace $u_i^* = \frac{u_i}{\sqrt{|X_i|}}$ and we get $E(u_i^*)^2 = E\left(\frac{u_i}{\sqrt{|X_i|}}\right)^2$.

93 Heteroscedasticity - Remedy - WLS. Recall that for any constant $a$, $E(a\, u_i)^2 = E\left[a^2 u_i^2\right] = a^2 E(u_i^2)$; here $a = \frac{1}{\sqrt{|X_i|}}$.

94 Heteroscedasticity - Remedy - WLS. (Again) $E(u_i^*)^2 = E\left(\frac{u_i}{\sqrt{|X_i|}}\right)^2 = E\left[\left(\frac{1}{\sqrt{|X_i|}}\right)^2 u_i^2\right]$.

95 Heteroscedasticity - Remedy - WLS. $E(u_i^*)^2 = \left(\frac{1}{\sqrt{|X_i|}}\right)^2 E(u_i^2) = \frac{1}{|X_i|} E(u_i^2) = \frac{E(u_i^2)}{|X_i|}$.

96 Heteroscedasticity - Remedy - WLS. We have heteroscedasticity, so $E(u_i^2) = \sigma_i^2$. That is (again), $E(u_i^*)^2 = \frac{E(u_i^2)}{|X_i|} = \frac{\sigma_i^2}{|X_i|}$.

97 Heteroscedasticity - Remedy - WLS. Recall that we assumed that $\sigma_i^2$ had the functional form $\sigma_i^2 = \sigma^2 |X_i|$. We have (again) $E(u_i^*)^2 = \frac{\sigma_i^2}{|X_i|} = \frac{\sigma^2 |X_i|}{|X_i|} = \sigma^2$.

98 Heteroscedasticity - Remedy - WLS. In the transformed regression, $Y_i^* = \beta_1 X_{1i}^* + \beta_2 X_{2i}^* + u_i^*$, we have constant variance, i.e. homoscedasticity: $V(u_i^*) = \sigma^2$. So, here, WLS is equivalent to using OLS on transformed data!
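A sketch of this equivalence under the assumed form $\sigma_i^2 = \sigma^2 |X_i|$ (assuming Python with statsmodels and simulated data; in statsmodels' WLS the weights are taken inversely proportional to the assumed error variance, here $1/|X_i|$): OLS on the transformed variables reproduces the WLS estimates.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(11)
n = 300
X = rng.uniform(1, 10, n)
Y = 1.0 + 2.0 * X + rng.normal(0.0, 0.7 * np.sqrt(X))   # Var(u_i) proportional to |X_i|

# (1) OLS on the transformed equation: divide everything by sqrt(|X_i|), no constant.
s = np.sqrt(np.abs(X))
Ystar = Y / s
Xstar = np.column_stack([1.0 / s, X / s])   # transformed "intercept" and slope columns
ols_transformed = sm.OLS(Ystar, Xstar).fit()

# (2) WLS directly, with weights proportional to 1 / Var(u_i) = 1 / |X_i|.
wls = sm.WLS(Y, sm.add_constant(X), weights=1.0 / np.abs(X)).fit()

print("OLS on transformed data:", ols_transformed.params)   # (beta1_hat, beta2_hat)
print("WLS:                    ", wls.params)
```

The two sets of estimates agree (up to floating-point precision), and they are interpreted in terms of the original, untransformed regression.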

99 Heteroscedasticity - Remedy - WLS. We started with $Y_i = \beta_1 + \beta_2 X_i + u_i$, but we estimate $Y_i^* = \beta_1 X_{1i}^* + \beta_2 X_{2i}^* + u_i^*$. How do we interpret $\beta_1$ and $\beta_2$? Note that, in effect, when transforming the PRF we transform the data, not the parameters, so in terms of the original regression we keep the interpretation of the parameters.

100 Heteroscedasticity - Remedy - WLS. What if we choose (assume) another functional form of the heteroscedasticity? Again, for the single linear regression $Y_i = \beta_1 + \beta_2 X_i + u_i$, assume, given that all $X_i > 0$, $\sigma_i^2 = \sigma^2 X_i^2$.

101 Heteroscedasticity - Remedy - WLS. Do the transformation: we divide the terms of the PRF by $\sqrt{X_i^2} = X_i$ to get $\frac{Y_i}{X_i} = \beta_1 \frac{1}{X_i} + \beta_2 \frac{X_i}{X_i} + \frac{u_i}{X_i}$, i.e. $Y_i^* = \beta_1 X_i^* + \beta_2 + u_i^*$, or $Y_i^* = \beta_2 + \beta_1 X_i^* + u_i^*$. Note: the coefficients have changed "roles". It is straightforward to see that now $\mathrm{Var}(u_i^*) = \sigma^2$, i.e. constant. We have transformed away the heteroscedasticity.

102 Heteroscedasticity - Remedy - WLS. The transformed regression again: $Y_i^* = \beta_2 + \beta_1 X_i^* + u_i^*$. Once estimated, we interpret $\hat\beta_2$ as the slope in the original regression.

103 Heteroscedasticity - Remedy - WLS. For WLS, some words of caution: what if we assume the wrong functional form? If we assume the wrong functional form, we might induce even "more" heteroscedasticity than we had to begin with, thus causing more damage than good. So, if uncertain of what kind of functional form we have on the ...
