Chapter 8 Heteroskedasticity


Walter R. Paczkowski, Rutgers University

Chapter Contents

8.1 The Nature of Heteroskedasticity
8.2 Detecting Heteroskedasticity
8.3 Heteroskedasticity-Consistent Standard Errors
8.4 Generalized Least Squares: Known Form of Variance
8.5 Generalized Least Squares: Unknown Form of Variance
8.6 Heteroskedasticity in the Linear Probability Model

8.1 The Nature of Heteroskedasticity

Consider our basic linear function:

Eq. 8.1: $E(y) = \beta_1 + \beta_2 x$

To recognize that not all observations with the same x will have the same y, and in line with our general specification of the regression model, we let $e_i$ be the difference between the i-th observation $y_i$ and the mean for all observations with the same $x_i$:

Eq. 8.2: $e_i = y_i - E(y_i) = y_i - \beta_1 - \beta_2 x_i$

Our model is then:

Eq. 8.3: $y_i = \beta_1 + \beta_2 x_i + e_i$

The probability of getting large positive or negative values for e is higher for large values of x than for low values. A random variable, in this case e, has a higher probability of taking on large values if its variance is high. We can capture this effect by having var(e) depend directly on x; that is, var(e) increases as x increases.
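The pattern of var(e) increasing with x can be seen in a small simulation. The data-generating process below is a hypothetical sketch (the coefficient values and the range of x are assumptions, not the chapter's food expenditure data):

```python
import numpy as np

rng = np.random.default_rng(42)
N = 10_000
x = rng.uniform(1.0, 10.0, N)

# Heteroskedastic errors: var(e_i) = sigma^2 * x_i, so sd(e_i) = sigma * sqrt(x_i)
sigma = 2.0
e = rng.normal(0.0, sigma * np.sqrt(x))
y = 100.0 + 10.0 * x + e

# The sample variance of e is much larger where x is large
var_low = e[x < 3].var()
var_high = e[x > 8].var()
print(var_low, var_high)
```

With $var(e_i) = 4x_i$, the low-x group should show a variance of roughly 8 and the high-x group roughly 36, mirroring the fan shape of Figure 8.1.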

When the variances for all observations are not the same, we have heteroskedasticity; the random variable y and the random error e are then heteroskedastic. Conversely, if all observations come from probability density functions with the same variance, homoskedasticity exists, and y and e are homoskedastic.

FIGURE 8.1 Heteroskedastic errors

When there is heteroskedasticity, one of the least squares assumptions is violated. The usual assumptions are:

$E(e_i) = 0, \quad var(e_i) = \sigma^2, \quad cov(e_i, e_j) = 0$

We replace the constant-variance assumption with:

Eq. 8.4: $var(y_i) = var(e_i) = h(x_i)$

where $h(x_i)$ is a function of $x_i$ that increases as $x_i$ increases.

Example from the food expenditure data: the fitted least squares equation is $\hat y_i = b_1 + b_2 x_i$, which we can rewrite in terms of the residuals as $\hat e_i = y_i - b_1 - b_2 x_i$.

FIGURE 8.2 Least squares estimated food expenditure function and observed data points

Heteroskedasticity is often encountered when using cross-sectional data. The term cross-sectional data refers to having data on a number of economic units, such as firms or households, at a given point in time. Cross-sectional data invariably involve observations on economic units of varying sizes.

This means that, for the linear regression model, as the size of the economic unit becomes larger, there is more uncertainty associated with the outcome y. This greater uncertainty is modeled by specifying an error variance that is larger, the larger the size of the economic unit.

Heteroskedasticity is not a property that is necessarily restricted to cross-sectional data. With time-series data, where we have data over time on one economic unit, such as a firm, a household, or even a whole economy, it is possible that the error variance will change.

Consequences for the Least Squares Estimators

There are two implications of heteroskedasticity:

1. The least squares estimator is still a linear and unbiased estimator, but it is no longer best: there is another estimator with a smaller variance.
2. The standard errors usually computed for the least squares estimator are incorrect. Confidence intervals and hypothesis tests that use these standard errors may be misleading.

What happens to the standard errors? Consider the model:

Eq. 8.5: $y_i = \beta_1 + \beta_2 x_i + e_i, \quad var(e_i) = \sigma^2$

The variance of the least squares estimator for $\beta_2$ is:

Eq. 8.6: $var(b_2) = \dfrac{\sigma^2}{\sum_{i=1}^{N}(x_i - \bar x)^2}$

Now let the variances differ:

Eq. 8.7: $y_i = \beta_1 + \beta_2 x_i + e_i, \quad var(e_i) = \sigma_i^2$

The variance of the least squares estimator for $\beta_2$ is then:

Eq. 8.8: $var(b_2) = \sum_{i=1}^{N} w_i^2 \sigma_i^2 = \dfrac{\sum_{i=1}^{N}(x_i - \bar x)^2 \sigma_i^2}{\left[\sum_{i=1}^{N}(x_i - \bar x)^2\right]^2}$

If we proceed to use the least squares estimator and its usual standard errors when $var(e_i) = \sigma_i^2$, we will be using an estimate of Eq. 8.6 to compute the standard error of $b_2$ when we should be using an estimate of Eq. 8.8. Moreover, the least squares estimator is no longer best, in the sense of being the minimum variance linear unbiased estimator.

8.2 Detecting Heteroskedasticity

There are two methods we can use to detect heteroskedasticity:

1. An informal way, using residual charts.
2. A formal way, using statistical tests.

8.2.1 Residual Plots

If the errors are homoskedastic, there should be no patterns of any sort in the residuals. If the errors are heteroskedastic, they may tend to exhibit greater variation in some systematic way. This method of investigating heteroskedasticity can be followed for any simple regression. In a regression with more than one explanatory variable, we can plot the least squares residuals against each explanatory variable, or against $\hat y_i$, to see if they vary in a systematic way.

FIGURE 8.3 Least squares food expenditure residuals plotted against income

8.2.2 Lagrange Multiplier Tests

Let's develop a test based on a variance function. Consider the general multiple regression model:

Eq. 8.9: $E(y_i) = \beta_1 + \beta_2 x_{i2} + \cdots + \beta_K x_{iK}$

A general form for the variance function related to Eq. 8.9 is:

Eq. 8.10: $var(y_i) = \sigma_i^2 = E(e_i^2) = h(\alpha_1 + \alpha_2 z_{i2} + \cdots + \alpha_S z_{iS})$

This is a general form because we have not been specific about the function $h(\cdot)$.

Two possible choices for $h(\cdot)$ are:

Exponential function, Eq. 8.11: $h(\alpha_1 + \alpha_2 z_{i2} + \cdots + \alpha_S z_{iS}) = \exp(\alpha_1 + \alpha_2 z_{i2} + \cdots + \alpha_S z_{iS})$

Linear function, Eq. 8.12: $h(\alpha_1 + \alpha_2 z_{i2} + \cdots + \alpha_S z_{iS}) = \alpha_1 + \alpha_2 z_{i2} + \cdots + \alpha_S z_{iS}$

In the latter case one must be careful to ensure $h(\cdot) > 0$.

Notice that when $\alpha_2 = \alpha_3 = \cdots = \alpha_S = 0$, then $h(\alpha_1 + \alpha_2 z_{i2} + \cdots + \alpha_S z_{iS}) = h(\alpha_1)$. But $h(\alpha_1)$ is a constant.

So, when $\alpha_2 = \alpha_3 = \cdots = \alpha_S = 0$, heteroskedasticity is not present.

The null and alternative hypotheses are:

Eq. 8.13: $H_0: \alpha_2 = \alpha_3 = \cdots = \alpha_S = 0$ versus $H_1:$ not all the $\alpha_s$ in $H_0$ are zero

For the test statistic, use Eqs. 8.10 and 8.12 to get:

Eq. 8.14: $var(y_i) = \sigma_i^2 = E(e_i^2) = \alpha_1 + \alpha_2 z_{i2} + \cdots + \alpha_S z_{iS}$

Letting $v_i = e_i^2 - E(e_i^2)$, we get:

Eq. 8.15: $e_i^2 = E(e_i^2) + v_i = \alpha_1 + \alpha_2 z_{i2} + \cdots + \alpha_S z_{iS} + v_i$

This is like the general regression model studied earlier:

Eq. 8.16: $y_i = E(y_i) + e_i = \beta_1 + \beta_2 x_{i2} + \cdots + \beta_K x_{iK} + e_i$

Substituting the squared least squares residuals $\hat e_i^2$ for $e_i^2$, we get:

Eq. 8.17: $\hat e_i^2 = \alpha_1 + \alpha_2 z_{i2} + \cdots + \alpha_S z_{iS} + v_i$

Since the $R^2$ from Eq. 8.17 measures the proportion of variation in $\hat e_i^2$ explained by the z's, it is a natural candidate for a test statistic. It can be shown that when $H_0$ is true, the sample size multiplied by $R^2$ has a chi-square distribution with $S - 1$ degrees of freedom:

Eq. 8.18: $\chi^2 = N \times R^2 \sim \chi^2_{(S-1)}$

Important features of this test: it is a large-sample test. You will often see it referred to as a Lagrange multiplier test or a Breusch-Pagan test for heteroskedasticity. The value of the statistic computed from the linear function is valid for testing an alternative hypothesis of heteroskedasticity where the variance function can be of any form given by Eq. 8.10.
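The $N \times R^2$ statistic can be computed directly from the auxiliary regression of Eq. 8.17. The sketch below uses numpy on simulated data (the data-generating process and all variable names are assumptions, not the text's food expenditure sample):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 500
x = rng.uniform(1.0, 10.0, N)
e = rng.normal(0.0, np.sqrt(x))        # var(e_i) = x_i: heteroskedastic by construction
y = 1.0 + 2.0 * x + e

# Step 1: least squares on the original model; save the residuals
X = np.column_stack([np.ones(N), x])
b = np.linalg.lstsq(X, y, rcond=None)[0]
ehat = y - X @ b

# Step 2: auxiliary regression ehat^2 = alpha1 + alpha2*z + v, with z = x (Eq. 8.17)
Z = np.column_stack([np.ones(N), x])
a = np.linalg.lstsq(Z, ehat**2, rcond=None)[0]
vhat = ehat**2 - Z @ a
sst = np.sum((ehat**2 - np.mean(ehat**2))**2)
R2 = 1.0 - (vhat @ vhat) / sst

# Step 3: LM statistic N*R^2 is chi-square(S-1) under H0; here S - 1 = 1,
# so the 5% critical value is 3.84
LM = N * R2
print(LM, LM > 3.84)
```

Because the simulated errors are strongly heteroskedastic, the statistic should comfortably exceed the 3.84 critical value.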

8.2.2a The White Test

The previous test presupposes that we have knowledge of the variables appearing in the variance function if the alternative hypothesis of heteroskedasticity is true. We may wish to test for heteroskedasticity without precise knowledge of the relevant variables. Hal White suggested defining the z's as equal to the x's, the squares of the x's, and possibly their cross-products.

Suppose: $E(y) = \beta_1 + \beta_2 x_2 + \beta_3 x_3$

The White test without cross-product terms (interactions) specifies:

$z_2 = x_2, \quad z_3 = x_3, \quad z_4 = x_2^2, \quad z_5 = x_3^2$

Including interactions adds one further variable: $z_6 = x_2 x_3$.

The White test is performed as an F-test or by using the statistic $\chi^2 = N \times R^2$.

8.2.2b Testing the Food Expenditure Example

We test $H_0: \alpha_2 = 0$ against $H_1: \alpha_2 \neq 0$ in the variance function $\sigma_i^2 = h(\alpha_1 + \alpha_2 x_i)$. First estimate $\hat e_i^2 = \alpha_1 + \alpha_2 x_i + v_i$ by least squares and save its $R^2 = 1 - SSE/SST$. Then calculate $\chi^2 = N \times R^2$.

Since there is only one parameter in the null hypothesis, the $\chi^2$-test has one degree of freedom; the 5% critical value is 3.84. Because the computed value $N \times R^2 = 7.38$ is greater than 3.84, we reject $H_0$ and conclude that the variance depends on income.

For the White version, estimate $\hat e_i^2 = \alpha_1 + \alpha_2 x_i + \alpha_3 x_i^2 + v_i$. Test $H_0: \alpha_2 = \alpha_3 = 0$ against $H_1: \alpha_2 \neq 0$ or $\alpha_3 \neq 0$, and calculate $\chi^2 = N \times R^2$ (p-value = 0.023). The 5% critical value is $\chi^2_{(0.95,\,2)} = 5.99$, so again we conclude that heteroskedasticity exists.
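The White version only changes the auxiliary regression, adding the square of the regressor as a second test variable. A sketch on simulated data (an assumed data-generating process, with the 5% critical value $\chi^2_{(0.95,\,2)} = 5.99$ taken from the text):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 400
x = rng.uniform(1.0, 10.0, N)
y = 1.0 + 2.0 * x + rng.normal(0.0, x)   # sd(e_i) proportional to x_i

# Residuals from the original regression
X = np.column_stack([np.ones(N), x])
ehat = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]

# White auxiliary regression: ehat^2 on x and x^2 (two test variables, S - 1 = 2)
Z = np.column_stack([np.ones(N), x, x**2])
vhat = ehat**2 - Z @ np.linalg.lstsq(Z, ehat**2, rcond=None)[0]
sst = np.sum((ehat**2 - np.mean(ehat**2))**2)
R2 = 1.0 - (vhat @ vhat) / sst

chi2 = N * R2
print(chi2, chi2 > 5.99)   # reject homoskedasticity when chi2 exceeds 5.99
```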

8.2.3 The Goldfeld-Quandt Test

Estimate our wage equation as:

Eq. 8.19: $\widehat{WAGE} = b_1 + b_2\,EDUC + 0.133\,EXPER + 1.524\,METRO$

The Goldfeld-Quandt test is designed to test for this form of heteroskedasticity, where the sample can be partitioned into two groups and we suspect the variance could be different in the two groups. Write the equations for the two groups (metropolitan and rural) as:

Eq. 8.20a: $WAGE_{Mi} = \beta_{M1} + \beta_{M2}\,EDUC_{Mi} + \beta_{M3}\,EXPER_{Mi} + e_{Mi}, \quad i = 1, \ldots, N_M$

Eq. 8.20b: $WAGE_{Ri} = \beta_{R1} + \beta_{R2}\,EDUC_{Ri} + \beta_{R3}\,EXPER_{Ri} + e_{Ri}, \quad i = 1, \ldots, N_R$

Test the null hypothesis: $\sigma_M^2 = \sigma_R^2$

The test statistic is:

Eq. 8.21: $F = \dfrac{\hat\sigma_M^2 / \sigma_M^2}{\hat\sigma_R^2 / \sigma_R^2} \sim F_{(N_M - K_M,\; N_R - K_R)}$

Suppose we want to test:

Eq. 8.22: $H_0: \sigma_M^2 = \sigma_R^2$ against $H_1: \sigma_M^2 \neq \sigma_R^2$

When $H_0$ is true, we have:

Eq. 8.23: $F = \hat\sigma_M^2 / \hat\sigma_R^2$

The 5% significance level critical values are $F_{Lc} = F_{(0.025,\,805,\,189)} = 0.81$ and $F_{Uc} = F_{(0.975,\,805,\,189)} = 1.26$. We reject $H_0$ if $F < F_{Lc}$ or $F > F_{Uc}$.

Using least squares to estimate (8.20a) and (8.20b) separately yields the variance estimates $\hat\sigma_M^2$ and $\hat\sigma_R^2$. We next compute $F = \hat\sigma_M^2 / \hat\sigma_R^2 = 2.09$. Since 2.09 > $F_{Uc}$ = 1.26, we reject $H_0$ and conclude that the wage variances for the rural and metropolitan regions are not equal.

8.2.3a The Food Expenditure Example

With the observations ordered according to income $x_i$, and the sample split into two equal groups of 20 observations each, separate least squares estimation yields the variance estimates $\hat\sigma_1^2$ and $\hat\sigma_2^2$. We calculate $F = \hat\sigma_2^2 / \hat\sigma_1^2 = 3.61$.

Believing that the variances could increase, but not decrease, with income, we use a one-tail test with 5% critical value $F_{(0.95,\,18,\,18)} = 2.22$. Since 3.61 > 2.22, the null hypothesis of homoskedasticity is rejected in favor of the alternative that the variance increases with income.
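The Goldfeld-Quandt statistic is simply the ratio of the two subsample variance estimates. A sketch on simulated two-group data (group sizes and error standard deviations are assumptions, and the one-tail critical value quoted in the comment is approximate):

```python
import numpy as np

rng = np.random.default_rng(2)

def sigma2_hat(x, y):
    """Error-variance estimate SSE/(N - K) from least squares on y = b1 + b2*x + e."""
    X = np.column_stack([np.ones(len(x)), x])
    resid = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
    return (resid @ resid) / (len(x) - 2)

# Two groups whose error variances genuinely differ (assumed: sd 2 versus sd 4)
x1 = rng.uniform(1.0, 10.0, 100)
x2 = rng.uniform(1.0, 10.0, 100)
y1 = 1.0 + 2.0 * x1 + rng.normal(0.0, 2.0, 100)
y2 = 1.0 + 2.0 * x2 + rng.normal(0.0, 4.0, 100)

# One-tail test: put the suspected larger variance in the numerator
F = sigma2_hat(x2, y2) / sigma2_hat(x1, y1)
print(F)   # compare with F(0.95, 98, 98), roughly 1.4
```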

8.3 Heteroskedasticity-Consistent Standard Errors

Recall that there are two problems with using the least squares estimator in the presence of heteroskedasticity:

1. The least squares estimator, although still unbiased, is no longer best.
2. The usual least squares standard errors are incorrect, which invalidates interval estimates and hypothesis tests.

There is a way of correcting the standard errors so that our interval estimates and hypothesis tests are valid.

Under heteroskedasticity:

Eq. 8.24: $var(b_2) = \dfrac{\sum_{i=1}^{N}(x_i - \bar x)^2 \sigma_i^2}{\left[\sum_{i=1}^{N}(x_i - \bar x)^2\right]^2}$

A consistent estimator for this variance has been developed and is known as White's heteroskedasticity-consistent standard errors, or heteroskedasticity-robust standard errors, or simply robust standard errors. The term robust is used because they are valid in large samples for both heteroskedastic and homoskedastic errors.

For $K = 2$, the White variance estimator is:

Eq. 8.25: $\widehat{var}(b_2) = \dfrac{N}{N-2} \cdot \dfrac{\sum_{i=1}^{N}(x_i - \bar x)^2 \hat e_i^2}{\left[\sum_{i=1}^{N}(x_i - \bar x)^2\right]^2}$
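Eq. 8.25 translates directly into code. The sketch below computes both the conventional and the White variance estimates for $b_2$ on simulated heteroskedastic data (an assumed data-generating process, not the food expenditure sample):

```python
import numpy as np

rng = np.random.default_rng(3)
N = 1000
x = rng.uniform(1.0, 10.0, N)
y = 1.0 + 2.0 * x + rng.normal(0.0, x)   # heteroskedastic: sd(e_i) = x_i

X = np.column_stack([np.ones(N), x])
b = np.linalg.lstsq(X, y, rcond=None)[0]
ehat = y - X @ b
d = x - x.mean()

# Conventional variance of b2 (incorrect under heteroskedasticity, Eq. 8.6 form)
sigma2 = (ehat @ ehat) / (N - 2)
var_conv = sigma2 / np.sum(d**2)

# White heteroskedasticity-consistent variance (Eq. 8.25)
var_white = (N / (N - 2)) * np.sum(d**2 * ehat**2) / np.sum(d**2)**2

print(np.sqrt(var_conv), np.sqrt(var_white))
```

With this data-generating process the robust standard error comes out larger than the conventional one, because observations far from $\bar x$ also tend to have the largest error variances.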

For the food expenditure example, the fitted equation $\hat y_i = b_1 + b_2 x_i$ is reported twice: once with the White (robust) standard errors and once with the incorrect conventional standard errors.

The two corresponding 95% confidence intervals for $\beta_2$, computed as $b_2 \pm t_c\,\mathrm{se}(b_2)$, differ: the White interval has upper limit 13.87, while the incorrect interval is wider, with upper limit 14.45.

White's estimator for the standard errors helps avoid computing incorrect interval estimates or incorrect values for test statistics in the presence of heteroskedasticity. It does not address the other implication of heteroskedasticity: the least squares estimator is no longer best. Failing to address this issue may not be that serious; with a large sample size, the variance of the least squares estimator may still be sufficiently small to obtain precise estimates. To find an alternative estimator with a lower variance, it is necessary to specify a suitable variance function; using least squares with robust standard errors avoids the need to do so.

8.4 Generalized Least Squares: Known Form of Variance

8.4.1 Variance Proportional to x

Recall the food expenditure example with heteroskedasticity:

Eq. 8.26: $y_i = \beta_1 + \beta_2 x_i + e_i, \quad E(e_i) = 0, \quad var(e_i) = \sigma_i^2, \quad cov(e_i, e_j) = 0 \;(i \neq j)$

To develop an estimator that is better than the least squares estimator, we need to make a further assumption about how the variances $\sigma_i^2$ change with each observation.

The resulting estimator, known as the generalized least squares estimator, depends on the unknown $\sigma_i^2$. To make the generalized least squares estimator operational, some structure is imposed on $\sigma_i^2$. One possibility:

Eq. 8.27: $var(e_i) = \sigma_i^2 = \sigma^2 x_i$

8.4.1a Transforming the Model

We change or transform the model into one with homoskedastic errors:

Eq. 8.28: $\dfrac{y_i}{\sqrt{x_i}} = \beta_1 \dfrac{1}{\sqrt{x_i}} + \beta_2 \dfrac{x_i}{\sqrt{x_i}} + \dfrac{e_i}{\sqrt{x_i}}$

Define the following transformed variables:

Eq. 8.29: $y_i^* = \dfrac{y_i}{\sqrt{x_i}}, \quad x_{i1}^* = \dfrac{1}{\sqrt{x_i}}, \quad x_{i2}^* = \dfrac{x_i}{\sqrt{x_i}} = \sqrt{x_i}, \quad e_i^* = \dfrac{e_i}{\sqrt{x_i}}$

Our model is now:

Eq. 8.30: $y_i^* = \beta_1 x_{i1}^* + \beta_2 x_{i2}^* + e_i^*$

The new transformed error term is homoskedastic:

Eq. 8.31: $var(e_i^*) = var\!\left(\dfrac{e_i}{\sqrt{x_i}}\right) = \dfrac{1}{x_i}\,var(e_i) = \dfrac{1}{x_i}\,\sigma^2 x_i = \sigma^2$

The transformed error term retains the properties of zero mean and zero correlation between different observations.

To obtain the best linear unbiased estimator for a model with heteroskedasticity of the type specified in Eq. 8.27:

1. Calculate the transformed variables given in Eq. 8.29.
2. Use least squares to estimate the transformed model given in Eq. 8.30.

The estimator obtained in this way is called a generalized least squares estimator.

8.4.1b Weighted Least Squares

One way of viewing the generalized least squares estimator is as a weighted least squares estimator, minimizing the sum of squared transformed errors:

$\sum_{i=1}^{N} e_i^{*2} = \sum_{i=1}^{N} \dfrac{e_i^2}{x_i} = \sum_{i=1}^{N} x_i^{-1} e_i^2$

The squared errors are weighted by $x_i^{-1}$: the larger the variance, the smaller the weight.
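For $var(e_i) = \sigma^2 x_i$, the transformation of Eq. 8.29 is equivalent to weighted least squares with weights $1/x_i$ on the squared errors. A sketch (the coefficient values and the data-generating process are assumptions):

```python
import numpy as np

rng = np.random.default_rng(4)
N = 2000
x = rng.uniform(1.0, 10.0, N)
y = 80.0 + 10.0 * x + rng.normal(0.0, 2.0 * np.sqrt(x))   # var(e_i) = 4 * x_i

# Transform every term, including the intercept column, by 1/sqrt(x_i) (Eq. 8.29)
w = 1.0 / np.sqrt(x)
Xstar = np.column_stack([w, x * w])    # columns: 1/sqrt(x_i) and sqrt(x_i)
ystar = y * w

# Least squares on the transformed model is the generalized (weighted) LS estimator
b_gls = np.linalg.lstsq(Xstar, ystar, rcond=None)[0]
print(b_gls)   # should be close to the true (80, 10)
```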

8.4.1c Food Expenditure Estimates

Applying the generalized (weighted) least squares procedure to our food expenditure problem gives the fitted equation (Eq. 8.32) $\hat y_i = \hat\beta_1 + \hat\beta_2 x_i$, and a 95% confidence interval for $\beta_2$ given by $\hat\beta_2 \pm t_c\,\mathrm{se}(\hat\beta_2)$.

Grouped Data

Another form of heteroskedasticity arises when the sample can be divided into two or more groups, with each group having a different error variance.

Most software has a weighted least squares or generalized least squares option.

For our wage equation, we could have:

Eq. 8.33a: $WAGE_{Mi} = \beta_1 + \beta_2\,EDUC_{Mi} + \beta_3\,EXPER_{Mi} + e_{Mi}, \quad i = 1, \ldots, N_M$

Eq. 8.33b: $WAGE_{Ri} = \beta_1 + \beta_2\,EDUC_{Ri} + \beta_3\,EXPER_{Ri} + e_{Ri}, \quad i = 1, \ldots, N_R$

where $var(e_{Mi}) = \sigma_M^2$ and $var(e_{Ri}) = \sigma_R^2$.

The separate least squares estimates based on the separate error variances give $b_{M1}, b_{M2}, b_{M3}$ from the metropolitan observations and $b_{R1}, b_{R2}, b_{R3}$ from the rural observations. But then we have two estimates for $\beta_2$ and two for $\beta_3$.

We obtain generalized least squares estimates by dividing each observation by the standard deviation of the corresponding error term, $\sigma_M$ or $\sigma_R$:

Eq. 8.34a: $\dfrac{WAGE_{Mi}}{\sigma_M} = \beta_1 \dfrac{1}{\sigma_M} + \beta_2 \dfrac{EDUC_{Mi}}{\sigma_M} + \beta_3 \dfrac{EXPER_{Mi}}{\sigma_M} + \dfrac{e_{Mi}}{\sigma_M}, \quad i = 1, \ldots, N_M$

Eq. 8.34b: $\dfrac{WAGE_{Ri}}{\sigma_R} = \beta_1 \dfrac{1}{\sigma_R} + \beta_2 \dfrac{EDUC_{Ri}}{\sigma_R} + \beta_3 \dfrac{EXPER_{Ri}}{\sigma_R} + \dfrac{e_{Ri}}{\sigma_R}, \quad i = 1, \ldots, N_R$

But $\sigma_M$ and $\sigma_R$ are unknown, so we transform the observations with their estimates $\hat\sigma_M$ and $\hat\sigma_R$. This yields a feasible generalized least squares estimator, which has good properties in large samples.

Also, the intercepts for the two groups are different. We handle the different intercepts by including METRO in the model.

The method for obtaining feasible generalized least squares estimates:

1. Obtain the estimates $\hat\sigma_M$ and $\hat\sigma_R$ by applying least squares separately to the metropolitan and rural observations.
2. Let $\hat\sigma_i = \hat\sigma_M$ when $METRO_i = 1$ and $\hat\sigma_i = \hat\sigma_R$ when $METRO_i = 0$.
3. Apply least squares to the transformed model:

Eq. 8.35: $\dfrac{WAGE_i}{\hat\sigma_i} = \beta_1 \dfrac{1}{\hat\sigma_i} + \beta_2 \dfrac{EDUC_i}{\hat\sigma_i} + \beta_3 \dfrac{EXPER_i}{\hat\sigma_i} + \beta_4 \dfrac{METRO_i}{\hat\sigma_i} + \dfrac{e_i}{\hat\sigma_i}$

Our estimated equation is:

Eq. 8.36: $\widehat{WAGE} = b_1 + b_2\,EDUC + 0.132\,EXPER + 1.539\,METRO$
(se) (1.02) (0.069) (0.015) (0.346)
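The three grouped-data steps above can be sketched on simulated data (the coefficients, group error variances, and variable ranges are all assumptions, not the text's wage sample):

```python
import numpy as np

rng = np.random.default_rng(5)
N = 1000
metro = rng.integers(0, 2, N)
educ = rng.uniform(8.0, 20.0, N)
exper = rng.uniform(0.0, 40.0, N)
# Group error variances differ: sd 4 for metropolitan, sd 2 for rural (assumed)
e = rng.normal(0.0, np.where(metro == 1, 4.0, 2.0))
wage = -9.0 + 1.2 * educ + 0.13 * exper + 1.5 * metro + e

X = np.column_stack([np.ones(N), educ, exper, metro])

# Step 1: sigma-hat for each group from separate least squares fits
def sig2(mask):
    Xg, yg = X[mask][:, :3], wage[mask]   # drop METRO: it is constant within a group
    r = yg - Xg @ np.linalg.lstsq(Xg, yg, rcond=None)[0]
    return (r @ r) / (mask.sum() - 3)

# Step 2: attach the appropriate sigma-hat to each observation
sig_i = np.where(metro == 1, np.sqrt(sig2(metro == 1)), np.sqrt(sig2(metro == 0)))

# Step 3: divide every variable by sigma-hat_i and apply least squares (Eq. 8.35)
b_fgls = np.linalg.lstsq(X / sig_i[:, None], wage / sig_i, rcond=None)[0]
print(b_fgls)
```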

8.5 Generalized Least Squares: Unknown Form of Variance

Consider a more general specification of the error variance:

Eq. 8.37: $var(e_i) = \sigma_i^2 = \sigma^2 x_i^{\gamma}$

where $\gamma$ is an unknown parameter.

To handle this, take logs:

$\ln(\sigma_i^2) = \ln(\sigma^2) + \gamma \ln(x_i)$

and then exponentiate:

Eq. 8.38: $\sigma_i^2 = \exp\big(\ln(\sigma^2) + \gamma \ln(x_i)\big) = \exp(\alpha_1 + \alpha_2 z_i)$

where $\alpha_1 = \ln(\sigma^2)$, $\alpha_2 = \gamma$, and $z_i = \ln(x_i)$.

We can extend this function to:

Eq. 8.39: $\sigma_i^2 = \exp(\alpha_1 + \alpha_2 z_{i2} + \cdots + \alpha_S z_{iS})$

Now write Eq. 8.38 as:

Eq. 8.40: $\ln(\sigma_i^2) = \alpha_1 + \alpha_2 z_i$

To estimate $\alpha_1$ and $\alpha_2$, we recall our basic model:

$y_i = E(y_i) + e_i = \beta_1 + \beta_2 x_i + e_i$

Apply the least squares strategy to Eq. 8.40 using the squared least squares residuals $\hat e_i^2$:

Eq. 8.41: $\ln(\hat e_i^2) = \ln(\sigma_i^2) + v_i = \alpha_1 + \alpha_2 z_i + v_i$

For the food expenditure data, estimating Eq. 8.41 gives the fitted variance function $\widehat{\ln(\sigma_i^2)} = \hat\alpha_1 + \hat\alpha_2 z_i$.

In line with the more general specification in Eq. 8.39, we can obtain variance estimates from $\hat\sigma_i^2 = \exp(\hat\alpha_1 + \hat\alpha_2 z_i)$ and then divide both sides of the equation by $\hat\sigma_i$.

This works because dividing Eq. 8.26 by $\sigma_i$ yields:

$\dfrac{y_i}{\sigma_i} = \beta_1 \dfrac{1}{\sigma_i} + \beta_2 \dfrac{x_i}{\sigma_i} + \dfrac{e_i}{\sigma_i}$

The transformed error term is homoskedastic:

Eq. 8.42: $var\!\left(\dfrac{e_i}{\sigma_i}\right) = \dfrac{1}{\sigma_i^2}\,var(e_i) = \dfrac{1}{\sigma_i^2}\,\sigma_i^2 = 1$

To obtain a generalized least squares estimator for $\beta_1$ and $\beta_2$, define the transformed variables:

Eq. 8.43: $y_i^* = \dfrac{y_i}{\hat\sigma_i}, \quad x_{i1}^* = \dfrac{1}{\hat\sigma_i}, \quad x_{i2}^* = \dfrac{x_i}{\hat\sigma_i}$

and apply least squares to:

Eq. 8.44: $y_i^* = \beta_1 x_{i1}^* + \beta_2 x_{i2}^* + e_i^*$

To summarize for the general case, suppose our model is:

Eq. 8.45: $y_i = \beta_1 + \beta_2 x_{i2} + \cdots + \beta_K x_{iK} + e_i$

where:

Eq. 8.46: $var(e_i) = \sigma_i^2 = \exp(\alpha_1 + \alpha_2 z_{i2} + \cdots + \alpha_S z_{iS})$

The steps for obtaining a generalized least squares estimator are:

1. Estimate Eq. 8.45 by least squares and compute the squared least squares residuals $\hat e_i^2$.
2. Estimate $\alpha_1, \alpha_2, \ldots, \alpha_S$ by applying least squares to the equation $\ln(\hat e_i^2) = \alpha_1 + \alpha_2 z_{i2} + \cdots + \alpha_S z_{iS} + v_i$.
3. Compute the variance estimates $\hat\sigma_i^2 = \exp(\hat\alpha_1 + \hat\alpha_2 z_{i2} + \cdots + \hat\alpha_S z_{iS})$.
4. Compute the transformed observations defined by Eq. 8.43, including $x_{i3}^*, \ldots, x_{iK}^*$ if $K > 2$.
5. Apply least squares to Eq. 8.44, or to an extended version of Eq. 8.44 if $K > 2$.
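The five steps can be sketched for the $K = 2$ case with $z_i = \ln(x_i)$. The data-generating process and parameter values below are assumptions; note that the constant bias in step 2 from using $\ln(\hat e_i^2)$ only rescales all the $\hat\sigma_i^2$ by the same factor, which cancels in the weighting:

```python
import numpy as np

rng = np.random.default_rng(6)
N = 2000
x = rng.uniform(1.0, 10.0, N)
# Assumed variance function: var(e_i) = exp(1 + 1.5 * ln(x_i))
e = rng.normal(0.0, np.sqrt(np.exp(1.0 + 1.5 * np.log(x))))
y = 80.0 + 10.0 * x + e

X = np.column_stack([np.ones(N), x])

# Step 1: least squares; squared residuals
ehat = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]

# Step 2: regress ln(ehat^2) on z = ln(x) to estimate the alphas (Eq. 8.41)
Z = np.column_stack([np.ones(N), np.log(x)])
a = np.linalg.lstsq(Z, np.log(ehat**2), rcond=None)[0]

# Step 3: variance estimates sigma-hat_i^2 = exp(alpha1-hat + alpha2-hat * z_i)
sig = np.sqrt(np.exp(Z @ a))

# Steps 4-5: transform by 1/sigma-hat_i and re-estimate (Eqs. 8.43-8.44)
b_fgls = np.linalg.lstsq(X / sig[:, None], y / sig, rcond=None)[0]
print(a[1], b_fgls)
```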

Following these steps for our food expenditure problem gives:

Eq. 8.47: $\hat y_i = \hat\beta_1 + \hat\beta_2 x_i$, (se) (9.71) (0.97)

The estimates for $\beta_1$ and $\beta_2$ have not changed much, but there has been a considerable drop in the standard errors, which under the previous specification were $\mathrm{se}(\hat\beta_1) = 23.79$ and $\mathrm{se}(\hat\beta_2) = 1.39$.

Using Robust Standard Errors

Robust standard errors can be used not only to guard against the possible presence of heteroskedasticity when using least squares; they can also be used to guard against the possible misspecification of a variance function when using generalized least squares.

8.6 Heteroskedasticity in the Linear Probability Model

We previously examined the linear probability model:

$E(y_i) = p_i = \beta_1 + \beta_2 x_{i2} + \cdots + \beta_K x_{iK}$

and, including the error term:

Eq. 8.48: $y_i = E(y_i) + e_i = \beta_1 + \beta_2 x_{i2} + \cdots + \beta_K x_{iK} + e_i$

We know this model suffers from heteroskedasticity:

Eq. 8.49: $var(y_i) = var(e_i) = p_i(1 - p_i) = (\beta_1 + \beta_2 x_{i2} + \cdots + \beta_K x_{iK})(1 - \beta_1 - \beta_2 x_{i2} - \cdots - \beta_K x_{iK})$

We can estimate $p_i$ with the least squares predictions:

Eq. 8.50: $\hat p_i = b_1 + b_2 x_{i2} + \cdots + b_K x_{iK}$

giving:

Eq. 8.51: $\widehat{var}(e_i) = \hat p_i (1 - \hat p_i)$

Generalized least squares estimates can be obtained by applying least squares to the transformed equation:

$\dfrac{y_i}{\sqrt{\hat p_i(1-\hat p_i)}} = \beta_1 \dfrac{1}{\sqrt{\hat p_i(1-\hat p_i)}} + \beta_2 \dfrac{x_{i2}}{\sqrt{\hat p_i(1-\hat p_i)}} + \cdots + \beta_K \dfrac{x_{iK}}{\sqrt{\hat p_i(1-\hat p_i)}} + \dfrac{e_i}{\sqrt{\hat p_i(1-\hat p_i)}}$
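A sketch of this feasible GLS procedure on simulated binary data (an assumed data-generating process; truncating $\hat p_i$ away from 0 and 1 before forming the weights is a practical adjustment assumed here, since nothing guarantees the least squares predictions stay inside the unit interval):

```python
import numpy as np

rng = np.random.default_rng(7)
N = 5000
x = rng.uniform(0.0, 1.0, N)
p = 0.2 + 0.5 * x                           # true probabilities, inside (0, 1)
y = (rng.uniform(0.0, 1.0, N) < p).astype(float)

X = np.column_stack([np.ones(N), x])

# Least squares predictions p-hat (Eq. 8.50)
b = np.linalg.lstsq(X, y, rcond=None)[0]
phat = np.clip(X @ b, 0.01, 0.99)           # assumed guard against p-hat outside (0, 1)

# Transform by 1/sqrt(p-hat*(1 - p-hat)) and apply least squares (Eq. 8.51)
s = np.sqrt(phat * (1.0 - phat))
b_gls = np.linalg.lstsq(X / s[:, None], y / s, rcond=None)[0]
print(b_gls)
```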

TABLE 8.1 Linear Probability Model Estimates

The Marketing Example Revisited

A suitable test for heteroskedasticity is the White test. Computing $\chi^2 = N \times R^2$ and its p-value leads us to reject the null hypothesis of homoskedasticity at the 1% level of significance.

Key Words

Breusch-Pagan test; generalized least squares; Goldfeld-Quandt test; grouped data; heteroskedasticity; heteroskedasticity-consistent standard errors; homoskedasticity; Lagrange multiplier test; linear probability model; mean function; residual plot; transformed model; variance function; weighted least squares; White test

Appendices

8A Properties of the Least Squares Estimator

Assume our model is:

$y_i = \beta_1 + \beta_2 x_i + e_i, \quad E(e_i) = 0, \quad var(e_i) = \sigma_i^2, \quad cov(e_i, e_j) = 0 \;(i \neq j)$

The least squares estimator for $\beta_2$ is:

Eq. 8A.1: $b_2 = \beta_2 + \sum w_i e_i, \quad \text{where} \quad w_i = \dfrac{x_i - \bar x}{\sum (x_i - \bar x)^2}$

Since $E(e_i) = 0$, we get:

$E(b_2) = E(\beta_2) + \sum w_i E(e_i) = \beta_2$

so $b_2$ is unbiased.

But the least squares standard errors are incorrect under heteroskedasticity:

Eq. 8A.2: $var(b_2) = var\!\left(\sum w_i e_i\right) = \sum w_i^2\,var(e_i) + \sum_{i \neq j} w_i w_j\,cov(e_i, e_j) = \sum w_i^2 \sigma_i^2 = \dfrac{\sum (x_i - \bar x)^2 \sigma_i^2}{\left[\sum (x_i - \bar x)^2\right]^2}$

If the variances are all the same ($\sigma_i^2 = \sigma^2$), then Eq. 8A.2 simplifies to:

Eq. 8A.3: $var(b_2) = \dfrac{\sigma^2}{\sum (x_i - \bar x)^2}$

8B Lagrange Multiplier Tests for Heteroskedasticity

Recall that:

Eq. 8B.1: $\hat e_i^2 = \alpha_1 + \alpha_2 z_{i2} + \cdots + \alpha_S z_{iS} + v_i$

and assume that our objective is to test $H_0: \alpha_2 = \alpha_3 = \cdots = \alpha_S = 0$ against the alternative that at least one $\alpha_s$, for $s = 2, \ldots, S$, is nonzero.

The F-statistic is:

Eq. 8B.2: $F = \dfrac{(SST - SSE)/(S-1)}{SSE/(N-S)}$

where $SST = \sum_{i=1}^{N} \left(\hat e_i^2 - \overline{\hat e^2}\right)^2$ and $SSE = \sum_{i=1}^{N} \hat v_i^2$.

We can modify Eq. 8B.2. Note that:

Eq. 8B.3: $\chi^2 = (S-1) \times F = \dfrac{SST - SSE}{SSE/(N-S)}$

Next, note that:

Eq. 8B.4: $var(e_i^2) = var(v_i)$

and that $SSE/(N-S)$ estimates this variance. Substituting, we get:

Eq. 8B.5: $\chi^2 = \dfrac{SST - SSE}{\widehat{var}(e_i^2)}$

If it is assumed that $e_i$ is normally distributed, then $var(e_i^2) = 2\sigma_e^4$, and:

Eq. 8B.6: $\chi^2 = \dfrac{SST - SSE}{2\hat\sigma_e^4} \sim \chi^2_{(S-1)}$

Using Eq. 8B.6, we reject the null hypothesis of homoskedasticity when the $\chi^2$-value is greater than a critical value from the $\chi^2_{(S-1)}$ distribution.

Also, if we set:

Eq. 8B.7: $\widehat{var}(e_i^2) = \dfrac{1}{N}\sum_{i=1}^{N}\left(\hat e_i^2 - \overline{\hat e^2}\right)^2 = \dfrac{SST}{N}$

we can get:

Eq. 8B.8: $\chi^2 = \dfrac{SST - SSE}{SST/N} = N\left(1 - \dfrac{SSE}{SST}\right) = N \times R^2$

At a 5% significance level, the null hypothesis of homoskedasticity is rejected when $N \times R^2$ exceeds the critical value $\chi^2_{(0.95,\,S-1)}$.


More information

Heteroskedasticity and Autocorrelation

Heteroskedasticity and Autocorrelation Lesson 7 Heteroskedasticity and Autocorrelation Pilar González and Susan Orbe Dpt. Applied Economics III (Econometrics and Statistics) Pilar González and Susan Orbe OCW 2014 Lesson 7. Heteroskedasticity

More information

Heteroskedasticity. y i = β 0 + β 1 x 1i + β 2 x 2i β k x ki + e i. where E(e i. ) σ 2, non-constant variance.

Heteroskedasticity. y i = β 0 + β 1 x 1i + β 2 x 2i β k x ki + e i. where E(e i. ) σ 2, non-constant variance. Heteroskedasticity y i = β + β x i + β x i +... + β k x ki + e i where E(e i ) σ, non-constant variance. Common problem with samples over individuals. ê i e ˆi x k x k AREC-ECON 535 Lec F Suppose y i =

More information

Heteroskedasticity. (In practice this means the spread of observations around any given value of X will not now be constant)

Heteroskedasticity. (In practice this means the spread of observations around any given value of X will not now be constant) Heteroskedasticity Occurs when the Gauss Markov assumption that the residual variance is constant across all observations in the data set so that E(u 2 i /X i ) σ 2 i (In practice this means the spread

More information

MULTIPLE REGRESSION AND ISSUES IN REGRESSION ANALYSIS

MULTIPLE REGRESSION AND ISSUES IN REGRESSION ANALYSIS MULTIPLE REGRESSION AND ISSUES IN REGRESSION ANALYSIS Page 1 MSR = Mean Regression Sum of Squares MSE = Mean Squared Error RSS = Regression Sum of Squares SSE = Sum of Squared Errors/Residuals α = Level

More information

ECON2228 Notes 2. Christopher F Baum. Boston College Economics. cfb (BC Econ) ECON2228 Notes / 47

ECON2228 Notes 2. Christopher F Baum. Boston College Economics. cfb (BC Econ) ECON2228 Notes / 47 ECON2228 Notes 2 Christopher F Baum Boston College Economics 2014 2015 cfb (BC Econ) ECON2228 Notes 2 2014 2015 1 / 47 Chapter 2: The simple regression model Most of this course will be concerned with

More information

Wooldridge, Introductory Econometrics, 2d ed. Chapter 8: Heteroskedasticity In laying out the standard regression model, we made the assumption of

Wooldridge, Introductory Econometrics, 2d ed. Chapter 8: Heteroskedasticity In laying out the standard regression model, we made the assumption of Wooldridge, Introductory Econometrics, d ed. Chapter 8: Heteroskedasticity In laying out the standard regression model, we made the assumption of homoskedasticity of the regression error term: that its

More information

ACE 564 Spring Lecture 8. Violations of Basic Assumptions I: Multicollinearity and Non-Sample Information. by Professor Scott H.

ACE 564 Spring Lecture 8. Violations of Basic Assumptions I: Multicollinearity and Non-Sample Information. by Professor Scott H. ACE 564 Spring 2006 Lecture 8 Violations of Basic Assumptions I: Multicollinearity and Non-Sample Information by Professor Scott H. Irwin Readings: Griffiths, Hill and Judge. "Collinear Economic Variables,

More information

Unit 10: Simple Linear Regression and Correlation

Unit 10: Simple Linear Regression and Correlation Unit 10: Simple Linear Regression and Correlation Statistics 571: Statistical Methods Ramón V. León 6/28/2004 Unit 10 - Stat 571 - Ramón V. León 1 Introductory Remarks Regression analysis is a method for

More information

Chapter 10 Nonlinear Models

Chapter 10 Nonlinear Models Chapter 10 Nonlinear Models Nonlinear models can be classified into two categories. In the first category are models that are nonlinear in the variables, but still linear in terms of the unknown parameters.

More information

Lectures 5 & 6: Hypothesis Testing

Lectures 5 & 6: Hypothesis Testing Lectures 5 & 6: Hypothesis Testing in which you learn to apply the concept of statistical significance to OLS estimates, learn the concept of t values, how to use them in regression work and come across

More information

Economics 620, Lecture 7: Still More, But Last, on the K-Varable Linear Model

Economics 620, Lecture 7: Still More, But Last, on the K-Varable Linear Model Economics 620, Lecture 7: Still More, But Last, on the K-Varable Linear Model Nicholas M. Kiefer Cornell University Professor N. M. Kiefer (Cornell University) Lecture 7: the K-Varable Linear Model IV

More information

Econometrics I KS. Module 2: Multivariate Linear Regression. Alexander Ahammer. This version: April 16, 2018

Econometrics I KS. Module 2: Multivariate Linear Regression. Alexander Ahammer. This version: April 16, 2018 Econometrics I KS Module 2: Multivariate Linear Regression Alexander Ahammer Department of Economics Johannes Kepler University of Linz This version: April 16, 2018 Alexander Ahammer (JKU) Module 2: Multivariate

More information

Course Econometrics I

Course Econometrics I Course Econometrics I 4. Heteroskedasticity Martin Halla Johannes Kepler University of Linz Department of Economics Last update: May 6, 2014 Martin Halla CS Econometrics I 4 1/31 Our agenda for today Consequences

More information

Wooldridge, Introductory Econometrics, 4th ed. Chapter 2: The simple regression model

Wooldridge, Introductory Econometrics, 4th ed. Chapter 2: The simple regression model Wooldridge, Introductory Econometrics, 4th ed. Chapter 2: The simple regression model Most of this course will be concerned with use of a regression model: a structure in which one or more explanatory

More information

Ch 2: Simple Linear Regression

Ch 2: Simple Linear Regression Ch 2: Simple Linear Regression 1. Simple Linear Regression Model A simple regression model with a single regressor x is y = β 0 + β 1 x + ɛ, where we assume that the error ɛ is independent random component

More information

Econometrics. Week 8. Fall Institute of Economic Studies Faculty of Social Sciences Charles University in Prague

Econometrics. Week 8. Fall Institute of Economic Studies Faculty of Social Sciences Charles University in Prague Econometrics Week 8 Institute of Economic Studies Faculty of Social Sciences Charles University in Prague Fall 2012 1 / 25 Recommended Reading For the today Instrumental Variables Estimation and Two Stage

More information

1 The Multiple Regression Model: Freeing Up the Classical Assumptions

1 The Multiple Regression Model: Freeing Up the Classical Assumptions 1 The Multiple Regression Model: Freeing Up the Classical Assumptions Some or all of classical assumptions were crucial for many of the derivations of the previous chapters. Derivation of the OLS estimator

More information

The Simple Regression Model. Simple Regression Model 1

The Simple Regression Model. Simple Regression Model 1 The Simple Regression Model Simple Regression Model 1 Simple regression model: Objectives Given the model: - where y is earnings and x years of education - Or y is sales and x is spending in advertising

More information

FinQuiz Notes

FinQuiz Notes Reading 10 Multiple Regression and Issues in Regression Analysis 2. MULTIPLE LINEAR REGRESSION Multiple linear regression is a method used to model the linear relationship between a dependent variable

More information

Econometrics. Week 4. Fall Institute of Economic Studies Faculty of Social Sciences Charles University in Prague

Econometrics. Week 4. Fall Institute of Economic Studies Faculty of Social Sciences Charles University in Prague Econometrics Week 4 Institute of Economic Studies Faculty of Social Sciences Charles University in Prague Fall 2012 1 / 23 Recommended Reading For the today Serial correlation and heteroskedasticity in

More information

1/24/2008. Review of Statistical Inference. C.1 A Sample of Data. C.2 An Econometric Model. C.4 Estimating the Population Variance and Other Moments

1/24/2008. Review of Statistical Inference. C.1 A Sample of Data. C.2 An Econometric Model. C.4 Estimating the Population Variance and Other Moments /4/008 Review of Statistical Inference Prepared by Vera Tabakova, East Carolina University C. A Sample of Data C. An Econometric Model C.3 Estimating the Mean of a Population C.4 Estimating the Population

More information

Wooldridge, Introductory Econometrics, 4th ed. Chapter 15: Instrumental variables and two stage least squares

Wooldridge, Introductory Econometrics, 4th ed. Chapter 15: Instrumental variables and two stage least squares Wooldridge, Introductory Econometrics, 4th ed. Chapter 15: Instrumental variables and two stage least squares Many economic models involve endogeneity: that is, a theoretical relationship does not fit

More information

Recent Advances in the Field of Trade Theory and Policy Analysis Using Micro-Level Data

Recent Advances in the Field of Trade Theory and Policy Analysis Using Micro-Level Data Recent Advances in the Field of Trade Theory and Policy Analysis Using Micro-Level Data July 2012 Bangkok, Thailand Cosimo Beverelli (World Trade Organization) 1 Content a) Classical regression model b)

More information

Models, Testing, and Correction of Heteroskedasticity. James L. Powell Department of Economics University of California, Berkeley

Models, Testing, and Correction of Heteroskedasticity. James L. Powell Department of Economics University of California, Berkeley Models, Testing, and Correction of Heteroskedasticity James L. Powell Department of Economics University of California, Berkeley Aitken s GLS and Weighted LS The Generalized Classical Regression Model

More information

Applied Econometrics (QEM)

Applied Econometrics (QEM) Applied Econometrics (QEM) based on Prinicples of Econometrics Jakub Mućk Department of Quantitative Economics Jakub Mućk Applied Econometrics (QEM) Meeting #3 1 / 42 Outline 1 2 3 t-test P-value Linear

More information

Lecture 8: Heteroskedasticity. Causes Consequences Detection Fixes

Lecture 8: Heteroskedasticity. Causes Consequences Detection Fixes Lecture 8: Heteroskedasticity Causes Consequences Detection Fixes Assumption MLR5: Homoskedasticity 2 var( u x, x,..., x ) 1 2 In the multivariate case, this means that the variance of the error term does

More information

Heteroskedasticity (Section )

Heteroskedasticity (Section ) Heteroskedasticity (Section 8.1-8.4) Ping Yu School of Economics and Finance The University of Hong Kong Ping Yu (HKU) Heteroskedasticity 1 / 44 Consequences of Heteroskedasticity for OLS Consequences

More information

ECON The Simple Regression Model

ECON The Simple Regression Model ECON 351 - The Simple Regression Model Maggie Jones 1 / 41 The Simple Regression Model Our starting point will be the simple regression model where we look at the relationship between two variables In

More information

Multiple Regression Analysis

Multiple Regression Analysis Multiple Regression Analysis y = β 0 + β 1 x 1 + β 2 x 2 +... β k x k + u 2. Inference 0 Assumptions of the Classical Linear Model (CLM)! So far, we know: 1. The mean and variance of the OLS estimators

More information

6.1 The F-Test 6.2 Testing the Significance of the Model 6.3 An Extended Model 6.4 Testing Some Economic Hypotheses 6.5 The Use of Nonsample

6.1 The F-Test 6.2 Testing the Significance of the Model 6.3 An Extended Model 6.4 Testing Some Economic Hypotheses 6.5 The Use of Nonsample 6.1 The F-Test 6. Testing the Significance of the Model 6.3 An Extended Model 6.4 Testing Some Economic Hypotheses 6.5 The Use of Nonsample Information 6.6 Model Specification 6.7 Poor Data, Collinearity

More information

Least Absolute Value vs. Least Squares Estimation and Inference Procedures in Regression Models with Asymmetric Error Distributions

Least Absolute Value vs. Least Squares Estimation and Inference Procedures in Regression Models with Asymmetric Error Distributions Journal of Modern Applied Statistical Methods Volume 8 Issue 1 Article 13 5-1-2009 Least Absolute Value vs. Least Squares Estimation and Inference Procedures in Regression Models with Asymmetric Error

More information

Lecture 14 Simple Linear Regression

Lecture 14 Simple Linear Regression Lecture 4 Simple Linear Regression Ordinary Least Squares (OLS) Consider the following simple linear regression model where, for each unit i, Y i is the dependent variable (response). X i is the independent

More information

Introductory Econometrics

Introductory Econometrics Based on the textbook by Wooldridge: : A Modern Approach Robert M. Kunst robert.kunst@univie.ac.at University of Vienna and Institute for Advanced Studies Vienna December 11, 2012 Outline Heteroskedasticity

More information

Psychology 282 Lecture #4 Outline Inferences in SLR

Psychology 282 Lecture #4 Outline Inferences in SLR Psychology 282 Lecture #4 Outline Inferences in SLR Assumptions To this point we have not had to make any distributional assumptions. Principle of least squares requires no assumptions. Can use correlations

More information

statistical sense, from the distributions of the xs. The model may now be generalized to the case of k regressors:

statistical sense, from the distributions of the xs. The model may now be generalized to the case of k regressors: Wooldridge, Introductory Econometrics, d ed. Chapter 3: Multiple regression analysis: Estimation In multiple regression analysis, we extend the simple (two-variable) regression model to consider the possibility

More information

Outline. Nature of the Problem. Nature of the Problem. Basic Econometrics in Transportation. Autocorrelation

Outline. Nature of the Problem. Nature of the Problem. Basic Econometrics in Transportation. Autocorrelation 1/30 Outline Basic Econometrics in Transportation Autocorrelation Amir Samimi What is the nature of autocorrelation? What are the theoretical and practical consequences of autocorrelation? Since the assumption

More information

Using EViews Vox Principles of Econometrics, Third Edition

Using EViews Vox Principles of Econometrics, Third Edition Using EViews Vox Principles of Econometrics, Third Edition WILLIAM E. GRIFFITHS University of Melbourne R. CARTER HILL Louisiana State University GUAY С LIM University of Melbourne JOHN WILEY & SONS, INC

More information

Lecture 3: Inference in SLR

Lecture 3: Inference in SLR Lecture 3: Inference in SLR STAT 51 Spring 011 Background Reading KNNL:.1.6 3-1 Topic Overview This topic will cover: Review of hypothesis testing Inference about 1 Inference about 0 Confidence Intervals

More information

Föreläsning /31

Föreläsning /31 1/31 Föreläsning 10 090420 Chapter 13 Econometric Modeling: Model Speci cation and Diagnostic testing 2/31 Types of speci cation errors Consider the following models: Y i = β 1 + β 2 X i + β 3 X 2 i +

More information

EC312: Advanced Econometrics Problem Set 3 Solutions in Stata

EC312: Advanced Econometrics Problem Set 3 Solutions in Stata EC312: Advanced Econometrics Problem Set 3 Solutions in Stata Nicola Limodio www.nicolalimodio.com N.Limodio1@lse.ac.uk The data set AIRQ contains observations for 30 standard metropolitan statistical

More information

Iris Wang.

Iris Wang. Chapter 10: Multicollinearity Iris Wang iris.wang@kau.se Econometric problems Multicollinearity What does it mean? A high degree of correlation amongst the explanatory variables What are its consequences?

More information

Statistical Inference with Regression Analysis

Statistical Inference with Regression Analysis Introductory Applied Econometrics EEP/IAS 118 Spring 2015 Steven Buck Lecture #13 Statistical Inference with Regression Analysis Next we turn to calculating confidence intervals and hypothesis testing

More information

DATA IN SERIES AND TIME I. Several different techniques depending on data and what one wants to do

DATA IN SERIES AND TIME I. Several different techniques depending on data and what one wants to do DATA IN SERIES AND TIME I Several different techniques depending on data and what one wants to do Data can be a series of events scaled to time or not scaled to time (scaled to space or just occurrence)

More information

Lecture 2 Linear Regression: A Model for the Mean. Sharyn O Halloran

Lecture 2 Linear Regression: A Model for the Mean. Sharyn O Halloran Lecture 2 Linear Regression: A Model for the Mean Sharyn O Halloran Closer Look at: Linear Regression Model Least squares procedure Inferential tools Confidence and Prediction Intervals Assumptions Robustness

More information

the error term could vary over the observations, in ways that are related

the error term could vary over the observations, in ways that are related Heteroskedasticity We now consider the implications of relaxing the assumption that the conditional variance Var(u i x i ) = σ 2 is common to all observations i = 1,..., n In many applications, we may

More information

Econometrics Summary Algebraic and Statistical Preliminaries

Econometrics Summary Algebraic and Statistical Preliminaries Econometrics Summary Algebraic and Statistical Preliminaries Elasticity: The point elasticity of Y with respect to L is given by α = ( Y/ L)/(Y/L). The arc elasticity is given by ( Y/ L)/(Y/L), when L

More information

ECON Introductory Econometrics. Lecture 16: Instrumental variables

ECON Introductory Econometrics. Lecture 16: Instrumental variables ECON4150 - Introductory Econometrics Lecture 16: Instrumental variables Monique de Haan (moniqued@econ.uio.no) Stock and Watson Chapter 12 Lecture outline 2 OLS assumptions and when they are violated Instrumental

More information

1 Motivation for Instrumental Variable (IV) Regression

1 Motivation for Instrumental Variable (IV) Regression ECON 370: IV & 2SLS 1 Instrumental Variables Estimation and Two Stage Least Squares Econometric Methods, ECON 370 Let s get back to the thiking in terms of cross sectional (or pooled cross sectional) data

More information

Econ 3790: Business and Economics Statistics. Instructor: Yogesh Uppal

Econ 3790: Business and Economics Statistics. Instructor: Yogesh Uppal Econ 3790: Business and Economics Statistics Instructor: Yogesh Uppal yuppal@ysu.edu Sampling Distribution of b 1 Expected value of b 1 : Variance of b 1 : E(b 1 ) = 1 Var(b 1 ) = σ 2 /SS x Estimate of

More information

Chapter 13. Multiple Regression and Model Building

Chapter 13. Multiple Regression and Model Building Chapter 13 Multiple Regression and Model Building Multiple Regression Models The General Multiple Regression Model y x x x 0 1 1 2 2... k k y is the dependent variable x, x,..., x 1 2 k the model are the

More information

Econ 510 B. Brown Spring 2014 Final Exam Answers

Econ 510 B. Brown Spring 2014 Final Exam Answers Econ 510 B. Brown Spring 2014 Final Exam Answers Answer five of the following questions. You must answer question 7. The question are weighted equally. You have 2.5 hours. You may use a calculator. Brevity

More information

8. Instrumental variables regression

8. Instrumental variables regression 8. Instrumental variables regression Recall: In Section 5 we analyzed five sources of estimation bias arising because the regressor is correlated with the error term Violation of the first OLS assumption

More information

Lecture 10 Multiple Linear Regression

Lecture 10 Multiple Linear Regression Lecture 10 Multiple Linear Regression STAT 512 Spring 2011 Background Reading KNNL: 6.1-6.5 10-1 Topic Overview Multiple Linear Regression Model 10-2 Data for Multiple Regression Y i is the response variable

More information

Econometrics. 9) Heteroscedasticity and autocorrelation

Econometrics. 9) Heteroscedasticity and autocorrelation 30C00200 Econometrics 9) Heteroscedasticity and autocorrelation Timo Kuosmanen Professor, Ph.D. http://nomepre.net/index.php/timokuosmanen Today s topics Heteroscedasticity Possible causes Testing for

More information

Panel Data. March 2, () Applied Economoetrics: Topic 6 March 2, / 43

Panel Data. March 2, () Applied Economoetrics: Topic 6 March 2, / 43 Panel Data March 2, 212 () Applied Economoetrics: Topic March 2, 212 1 / 43 Overview Many economic applications involve panel data. Panel data has both cross-sectional and time series aspects. Regression

More information

9. Linear Regression and Correlation

9. Linear Regression and Correlation 9. Linear Regression and Correlation Data: y a quantitative response variable x a quantitative explanatory variable (Chap. 8: Recall that both variables were categorical) For example, y = annual income,

More information

Chapter 14 Simple Linear Regression (A)

Chapter 14 Simple Linear Regression (A) Chapter 14 Simple Linear Regression (A) 1. Characteristics Managerial decisions often are based on the relationship between two or more variables. can be used to develop an equation showing how the variables

More information

Business Statistics. Lecture 9: Simple Regression

Business Statistics. Lecture 9: Simple Regression Business Statistics Lecture 9: Simple Regression 1 On to Model Building! Up to now, class was about descriptive and inferential statistics Numerical and graphical summaries of data Confidence intervals

More information

ECON2228 Notes 7. Christopher F Baum. Boston College Economics. cfb (BC Econ) ECON2228 Notes / 41

ECON2228 Notes 7. Christopher F Baum. Boston College Economics. cfb (BC Econ) ECON2228 Notes / 41 ECON2228 Notes 7 Christopher F Baum Boston College Economics 2014 2015 cfb (BC Econ) ECON2228 Notes 6 2014 2015 1 / 41 Chapter 8: Heteroskedasticity In laying out the standard regression model, we made

More information

Regression Models - Introduction

Regression Models - Introduction Regression Models - Introduction In regression models there are two types of variables that are studied: A dependent variable, Y, also called response variable. It is modeled as random. An independent

More information

Inference for Regression Simple Linear Regression

Inference for Regression Simple Linear Regression Inference for Regression Simple Linear Regression IPS Chapter 10.1 2009 W.H. Freeman and Company Objectives (IPS Chapter 10.1) Simple linear regression p Statistical model for linear regression p Estimating

More information

Topic 7: HETEROSKEDASTICITY

Topic 7: HETEROSKEDASTICITY Universidad Carlos III de Madrid César Alonso ECONOMETRICS Topic 7: HETEROSKEDASTICITY Contents 1 Introduction 1 1.1 Examples............................. 1 2 The linear regression model with heteroskedasticity

More information

Freeing up the Classical Assumptions. () Introductory Econometrics: Topic 5 1 / 94

Freeing up the Classical Assumptions. () Introductory Econometrics: Topic 5 1 / 94 Freeing up the Classical Assumptions () Introductory Econometrics: Topic 5 1 / 94 The Multiple Regression Model: Freeing Up the Classical Assumptions Some or all of classical assumptions needed for derivations

More information

THE MULTIVARIATE LINEAR REGRESSION MODEL

THE MULTIVARIATE LINEAR REGRESSION MODEL THE MULTIVARIATE LINEAR REGRESSION MODEL Why multiple regression analysis? Model with more than 1 independent variable: y 0 1x1 2x2 u It allows : -Controlling for other factors, and get a ceteris paribus

More information

Multiple Linear Regression

Multiple Linear Regression Multiple Linear Regression Asymptotics Asymptotics Multiple Linear Regression: Assumptions Assumption MLR. (Linearity in parameters) Assumption MLR. (Random Sampling from the population) We have a random

More information

ECON 4551 Econometrics II Memorial University of Newfoundland. Panel Data Models. Adapted from Vera Tabakova s notes

ECON 4551 Econometrics II Memorial University of Newfoundland. Panel Data Models. Adapted from Vera Tabakova s notes ECON 4551 Econometrics II Memorial University of Newfoundland Panel Data Models Adapted from Vera Tabakova s notes 15.1 Grunfeld s Investment Data 15.2 Sets of Regression Equations 15.3 Seemingly Unrelated

More information

DESAIN EKSPERIMEN Analysis of Variances (ANOVA) Semester Genap 2017/2018 Jurusan Teknik Industri Universitas Brawijaya

DESAIN EKSPERIMEN Analysis of Variances (ANOVA) Semester Genap 2017/2018 Jurusan Teknik Industri Universitas Brawijaya DESAIN EKSPERIMEN Analysis of Variances (ANOVA) Semester Jurusan Teknik Industri Universitas Brawijaya Outline Introduction The Analysis of Variance Models for the Data Post-ANOVA Comparison of Means Sample

More information

Econometrics Honor s Exam Review Session. Spring 2012 Eunice Han

Econometrics Honor s Exam Review Session. Spring 2012 Eunice Han Econometrics Honor s Exam Review Session Spring 2012 Eunice Han Topics 1. OLS The Assumptions Omitted Variable Bias Conditional Mean Independence Hypothesis Testing and Confidence Intervals Homoskedasticity

More information

Heteroskedasticity. We now consider the implications of relaxing the assumption that the conditional

Heteroskedasticity. We now consider the implications of relaxing the assumption that the conditional Heteroskedasticity We now consider the implications of relaxing the assumption that the conditional variance V (u i x i ) = σ 2 is common to all observations i = 1,..., In many applications, we may suspect

More information

LINEAR REGRESSION ANALYSIS. MODULE XVI Lecture Exercises

LINEAR REGRESSION ANALYSIS. MODULE XVI Lecture Exercises LINEAR REGRESSION ANALYSIS MODULE XVI Lecture - 44 Exercises Dr. Shalabh Department of Mathematics and Statistics Indian Institute of Technology Kanpur Exercise 1 The following data has been obtained on

More information

ECO375 Tutorial 7 Heteroscedasticity

ECO375 Tutorial 7 Heteroscedasticity ECO375 Tutorial 7 Heteroscedasticity Matt Tudball University of Toronto Mississauga November 9, 2017 Matt Tudball (University of Toronto) ECO375H5 November 9, 2017 1 / 24 Review: Heteroscedasticity Consider

More information

Answer Key: Problem Set 6

Answer Key: Problem Set 6 : Problem Set 6 1. Consider a linear model to explain monthly beer consumption: beer = + inc + price + educ + female + u 0 1 3 4 E ( u inc, price, educ, female ) = 0 ( u inc price educ female) σ inc var,,,

More information