Econ107 Applied Econometrics


Topics 2-4 were discussed under the classical Assumptions 1-6 (or 1-7 when normality is needed for finite-sample inference).

Question: what if some of the classical assumptions are violated?

Topic 5 deals with violations of Assumption 1 (A1 hereafter). Topics 6-8 deal with three cases of violations of the classical assumptions: multicollinearity (A6), serial correlation (A4), and heteroskedasticity (A5).

Questions to be addressed: What is the nature of the problem? What are its consequences? How is the problem diagnosed? How can it be remedied?

6 Multicollinearity (Studenmund, Chapter 8)

6.1 The Nature of Multicollinearity

6.1.1 Perfect Multicollinearity

1. Definition: Perfect multicollinearity exists in the regression

$Y_i = \beta_0 + \beta_1 X_{1i} + \cdots + \beta_k X_{ki} + \varepsilon_i$,   (1)

if there exists a set of parameters $\lambda_j$ ($j = 0, 1, \ldots, k$, not all equal to zero) such that

$\lambda_0 X_{0i} + \lambda_1 X_{1i} + \cdots + \lambda_k X_{ki} = 0$,   (2)

where $X_{0i} \equiv 1$. Equation (2) must hold for all observations.

Alternatively, we could write one independent variable as an exact linear combination of the others. For example, if $\lambda_k \neq 0$, we can write (2) as

$X_{ki} = -\dfrac{\lambda_0}{\lambda_k} X_{0i} - \dfrac{\lambda_1}{\lambda_k} X_{1i} - \cdots - \dfrac{\lambda_{k-1}}{\lambda_k} X_{k-1,i}$.   (3)

The last expression says essentially that $X_{ki}$ is redundant: it carries no information for explaining $Y_i$ beyond what is already contained in $X_{0i}, X_{1i}, \ldots, X_{k-1,i}$.
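To see the redundancy in numbers, here is a minimal Python sketch (NumPy; the sample size, data, and multiplier are made up for illustration) showing that when one regressor is an exact multiple of another, the matrix $X'X$ is singular, so the OLS normal equations have no unique solution:

import numpy as np

rng = np.random.default_rng(0)
n = 50

x1 = rng.normal(size=n)
x2 = 3.0 * x1                              # exact linear dependence: perfect multicollinearity
X = np.column_stack([np.ones(n), x1, x2])  # design matrix including the constant

XtX = X.T @ X
print(np.linalg.matrix_rank(XtX))          # prints 2, not 3: X'X is singular
# np.linalg.inv(XtX) would raise LinAlgError; the coefficients are not identified.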

Example. Consider the following regression model for a consumption function:

$C = \beta_1 + \beta_2 N + \beta_3 S + \beta_4 T + \varepsilon$,

where $C$ is consumption, $N$ is nonlabor income, $S$ is salary, $T$ is total income, and $\varepsilon$ is the error term. Since $T = N + S$, it is not possible to separate the individual effects of the components of income ($N$ and $S$) from that of total income ($T$). According to the model, $E(C) = \beta_1 + \beta_2 N + \beta_3 S + \beta_4 T$. But if we let $c$ be any nonzero value and define $\beta_2' = \beta_2 - c$, $\beta_3' = \beta_3 - c$, and $\beta_4' = \beta_4 + c$, then $E(C) = \beta_1 + \beta_2' N + \beta_3' S + \beta_4' T$ as well, for a different set of parameters. The same value of $E(C)$ is therefore consistent with many different values of the parameters.

2. Problems

(1) The coefficients cannot be estimated. Consider the regression

$Y_i = \beta_0 + \beta_1 X_{1i} + \beta_2 X_{2i} + \varepsilon_i$.   (4)

If $X_{2i} = \lambda X_{1i}$ ($\lambda \neq 0$), the parameters $\beta_1$ and $\beta_2$ cannot be identified or estimated. To see why, define $\beta_1^* = \beta_1 + c\lambda$ and $\beta_2^* = \beta_2 - c$, where $c$ can be any constant. Then (4) is equivalent to

$Y_i = \beta_0 + \beta_1^* X_{1i} + \beta_2^* X_{2i} + \varepsilon_i$.   (5)

There are infinitely many values of $c$ for which (5) holds, i.e., infinitely many pairs $(\beta_1^*, \beta_2^*)$ that fit the data equally well. We cannot separate the influence of $X_{1i}$ on $Y_i$ from that of $X_{2i}$. The same analysis extends to a general multiple regression model in which one regressor can be written as a linear combination of the others.

(2) The standard errors cannot be estimated. In the regression model (4), the standard error of $\hat\beta_1$ can be written as

$\mathrm{std}(\hat\beta_1) = \sqrt{\dfrac{\sigma^2}{\sum_{i=1}^{n} (X_{1i} - \bar{X}_1)^2 \,(1 - r_{12}^2)}}$,

where $r_{12}$ is the sample correlation between $X_{1i}$ and $X_{2i}$. In the case of perfect multicollinearity (e.g., $X_{2i} = \lambda X_{1i} + a$), $r_{12} = 1$ or $-1$, so the denominator is zero and $\mathrm{std}(\hat\beta_1) = \infty$.

Solution: The solution to perfect multicollinearity is trivial: drop one or several of the regressors. In the above example, we can drop either $X_{2i}$ or $X_{1i}$, so that (4) can be written as

$Y_i = \beta_0 + (\beta_1 + \lambda\beta_2) X_{1i} + \varepsilon_i$,

or

$Y_i = \beta_0 + (\beta_2 + \beta_1/\lambda) X_{2i} + \varepsilon_i$.

By regressing $Y_i$ on $X_{1i}$, we are estimating $\beta_1 + \lambda\beta_2$. Analogously, by regressing $Y_i$ on $X_{2i}$, we are estimating $\beta_2 + \beta_1/\lambda$. In either case, we cannot estimate $\beta_1$ or $\beta_2$ separately.

Remarks. Perfect multicollinearity is fairly easy to avoid, and econometricians almost never talk about it. Instead, when we use the word multicollinearity, we are really talking about severe imperfect multicollinearity.

6.1.2 Imperfect Multicollinearity

Imperfect multicollinearity can be defined as a linear functional relationship between two or more independent variables that is so strong that it can significantly affect the estimation of the coefficients of the variables.

Definition: Imperfect multicollinearity exists in a k-variate regression if

$\lambda_0 X_{0i} + \lambda_1 X_{1i} + \cdots + \lambda_k X_{ki} + v_i = 0$

for some stochastic variable $v_i$.

Remarks. 1) As $\mathrm{Var}(v_i) \to 0$, imperfect multicollinearity tends to perfect multicollinearity.

2) Alternatively, we could write any particular independent variable as an almost exact linear function of the others. For example, if $\lambda_k \neq 0$, then

$X_{ki} = -\dfrac{\lambda_0}{\lambda_k} X_{0i} - \dfrac{\lambda_1}{\lambda_k} X_{1i} - \cdots - \dfrac{\lambda_{k-1}}{\lambda_k} X_{k-1,i} - \dfrac{v_i}{\lambda_k}$.   (6)

The above equation implies that although the relationship between $X_{ki}$ and $X_{0i}, X_{1i}, \ldots$ might be fairly strong, it is not strong enough for $X_{ki}$ to be completely explained by $X_{0i}, X_{1i}, \ldots, X_{k-1,i}$; some unexplained variation still remains.

3) Imperfect multicollinearity indicates a strong linear relationship among the regressors. The stronger the relationship between two or more regressors, the more likely it is that they will be considered significantly multicollinear.

6.2 The Consequences of (Imperfect) Multicollinearity

1. Coefficient estimators will remain unbiased. Imperfect multicollinearity does not violate the classical assumptions. If all the classical Assumptions 1-6 are met, we can estimate the coefficients, and the estimators will still be centered around the true values of the $\beta$'s. The OLS estimators remain unbiased and are BLUE.

2. The variances/standard errors of the coefficient estimators blow up. They increase with the degree of multicollinearity. Since two or more of the regressors are strongly related, it becomes difficult to identify the separate effects of the multicollinear variables, and we are much more likely to make errors in estimating the coefficients than we were before we encountered multicollinearity. Imperfect multicollinearity therefore reduces the precision of our coefficient estimates.

For example, in the 2-variate regression case,

$\mathrm{std}(\hat\beta_1) = \sqrt{\dfrac{\sigma^2}{\sum_{i=1}^{n} (X_{1i} - \bar{X}_1)^2 \,(1 - r_{12}^2)}}$.

As $r_{12} \to 1$, the standard error $\to \infty$.

Numerical example. Suppose $\sigma^2 / \sum_{i=1}^{n} (X_{1i} - \bar{X}_1)^2 = 1$, so that $\mathrm{std}(\hat\beta_1) = 1$ when $r_{12} = 0$. Then:

If $r_{12} = 0.10$, the standard error is 1.01.
If $r_{12} = 0.25$, the standard error is 1.03.
If $r_{12} = 0.50$, the standard error is 1.15.
If $r_{12} = 0.75$, the standard error is 1.51.
If $r_{12} = 0.90$, the standard error is 2.29.
If $r_{12} = 0.99$, the standard error is 7.09.

The standard error increases at an increasing rate with the multicollinearity between the explanatory variables. As a result, we will have wider confidence intervals and possibly insignificant t-ratios on our coefficient estimates, because $t_1 = \hat\beta_1 / \mathrm{se}(\hat\beta_1)$.
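These figures can be reproduced with a few lines of Python by evaluating the inflation factor $1/\sqrt{1 - r_{12}^2}$ under the same normalization ($\sigma^2 / \sum_{i=1}^{n} (X_{1i} - \bar{X}_1)^2 = 1$); a minimal sketch:

import math

# Standard error of beta1-hat relative to the r12 = 0 benchmark:
# 1 / sqrt(1 - r12^2), from the formula above.
for r12 in (0.10, 0.25, 0.50, 0.75, 0.90, 0.99):
    print(f"r12 = {r12:.2f}  ->  std(beta1_hat) = {1.0 / math.sqrt(1.0 - r12**2):.2f}")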

[Figure 1: A consequence of imperfect multicollinearity: the blow-up of $\mathrm{std}(\hat\beta_1)$ as $r_{12}$ increases.]

3. The computed t-ratios will fall. This means that we will have more difficulty rejecting the null hypothesis that a slope coefficient is equal to zero. This problem is closely related to the problem of a small sample size: in both cases, standard errors blow up. With a small sample size, the denominator of the standard error is reduced by the lack of variation in the explanatory variable.

4. Coefficient estimates become very sensitive to changes in specification and in the number of observations. The coefficient estimates may be very sensitive to the addition of one or a small number of observations, or to the deletion of a statistically insignificant variable. One may get very odd coefficient estimates, possibly with wrong signs, due to the high variance of the estimator.

5. The overall fit of the model will be largely unaffected. Even though the individual t-ratios are often quite low in the case of imperfect multicollinearity, the overall fit of the equation ($R^2$ or $\bar{R}^2$) will not fall much.

A hypothetical example. Suppose we want to estimate a student's consumption function. After some preliminary work, we come up with the following equation:

$C_i = \beta_0 + \beta_1 Y_{d,i} + \beta_2 LA_i + \varepsilon_i$,

where
$C_i$ = the annual consumption expenditures of the ith student,
$Y_{d,i}$ = the annual disposable income (including gifts) of the ith student,
$LA_i$ = the liquid assets (cash, savings, etc.) of the ith student.

Please analyze the following regression outputs (standard errors in parentheses, t-ratios in brackets):

$\hat{C}_i = \ldots + \ldots\, Y_{d,i} + \ldots\, LA_i$   (7)
                 (1.0307)     (0.0492)
t:               [0.496]      [0.868]
n = 9,   $R^2 = \ldots$

$\hat{C}_i = \ldots + \ldots\, Y_{d,i}$   (8)
                 (0.157)
t:               [6.187]
n = 9,   $R^2 = \ldots$

An empirical example: petroleum consumption

Suppose that we are interested in building a cross-sectional model of the demand for gasoline by state:

$C_i = \beta_0 + \beta_1 Mile_i + \beta_2 Tax_i + \beta_3 Reg_i + \varepsilon_i$,

where
$C_i$ = petroleum consumption in the ith state,
$Mile_i$ = urban highway miles within the ith state,
$Tax_i$ = the gasoline tax rate in the ith state,
$Reg_i$ = motor vehicle registrations in the ith state.

Please analyze the following regression outputs:

$\hat{C}_i = \ldots + \ldots\, Mile_i - 36.5\, Tax_i - 0.061\, Reg_i$   (9)
                  (10.3)        (13.2)        (0.043)
t:                [5.92]        [-2.77]       [-1.43]
n = 50,   $R^2 = \ldots$

$\hat{C}_i = \ldots + \ldots\, Tax_i + \ldots\, Reg_i$   (10)
                  (16.9)       (0.012)
t:                [-3.18]      [15.88]
n = 50,   $R^2 = \ldots$

6.3 The Detection of Multicollinearity

It is worth mentioning that multicollinearity exists in almost all equations. It is virtually impossible in the real world to find a set of independent variables that are totally uncorrelated with each other. Our purpose is to learn to determine how much multicollinearity exists, using three general indicators or diagnostic tools.

1. t-ratios versus $R^2$. Look for a high $R^2$ but few significant t-ratios.

Remarks. (1) Common rule of thumb: we cannot reject the null hypotheses that the coefficients are individually equal to zero (t-tests), but we can reject the null hypothesis that they are simultaneously equal to zero (F-test).

(2) This is not an exact test. What do we mean by "few significant t-ratios" and "a high $R^2$"? The criterion is too imprecise, and it also depends on other factors such as the sample size. A simulated illustration of this symptom is sketched below.
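The sketch below uses Python with statsmodels on simulated data (the sample size, coefficients, and noise level are arbitrary choices for illustration): two nearly identical regressors yield a high $R^2$ and a clearly significant F-statistic, while the individual t-ratios can both be insignificant.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 30

x1 = rng.normal(size=n)
x2 = x1 + 0.05 * rng.normal(size=n)            # x2 nearly duplicates x1
y = 1.0 + 2.0 * x1 + 2.0 * x2 + rng.normal(size=n)

X = sm.add_constant(np.column_stack([x1, x2]))
res = sm.OLS(y, X).fit()

print(res.rsquared)      # overall fit is high
print(res.fvalue)        # joint F-statistic is strongly significant
print(res.tvalues[1:])   # individual t-ratios may each be insignificant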

2. Correlation matrix of the regressors. Look for high pairwise correlation coefficients in the correlation matrix of the regressors.

Remarks. (1) How high is high? As a rule of thumb, we can use 0.8: if a sample correlation exceeds 0.8 in absolute value, we should be concerned about multicollinearity.

(2) Multicollinearity refers to a linear relationship among all or some of the regressors. It may be that no pair of independent variables is highly correlated, and yet one variable is close to a linear function of a number of the others. Only in a regression with two explanatory variables is multicollinearity fully captured by the correlation between them.

(3) A high pairwise correlation is a sufficient, but not a necessary, condition for multicollinearity. In other words, if you have a high pairwise correlation, you have a problem; however, the absence of high pairwise correlations is not conclusive evidence that multicollinearity is absent.
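In Python this check is a one-liner on the regressor data; the data frame and column names below are hypothetical:

import pandas as pd

# Hypothetical regressor data; replace with the actual sample.
X = pd.DataFrame({
    "wage_women": [10.1, 10.4, 10.9, 11.5, 12.0, 12.8],
    "wage_men":   [12.0, 12.5, 13.1, 13.8, 14.5, 15.4],
    "tax_rate":   [0.20, 0.22, 0.21, 0.23, 0.22, 0.24],
})

print(X.corr())   # pairwise correlation matrix of the regressors
# Rule of thumb: entries above 0.8 in absolute value suggest a multicollinearity problem.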

3. High variance inflation factors (VIFs). The variance inflation factor (VIF) is a method of detecting the degree of multicollinearity by looking at the extent to which a given explanatory variable can be explained by all the other explanatory variables in the equation, so there is a VIF for each regressor. Suppose we want to use VIFs to detect multicollinearity in the regression

$Y_i = \beta_0 + \beta_1 X_{1i} + \cdots + \beta_k X_{ki} + \varepsilon_i$.   (11)

Let $\hat\beta_j$ denote the OLS estimator of $\beta_j$ in the above regression. We need to calculate k different VIFs, one for each $X_{ji}$ ($j = 1, \ldots, k$).

1) Run the following k auxiliary regressions, each regressing one $X$ on all the others:

$X_{1i} = \gamma_0 + \gamma_2 X_{2i} + \cdots + \gamma_k X_{ki} + v_{1i}$
$\vdots$
$X_{ki} = \alpha_0 + \alpha_1 X_{1i} + \cdots + \alpha_{k-1} X_{k-1,i} + v_{ki}$

2) Calculate the $R^2$ for each of the above k regressions, and denote by $R_j^2$ the $R^2$ from the linear regression of $X_{ji}$ on all the other regressors in (11). The VIF for $\hat\beta_j$ is defined by

$\mathrm{VIF}(\hat\beta_j) = \dfrac{1}{1 - R_j^2}$.

The higher $\mathrm{VIF}(\hat\beta_j)$, the higher the variance of $\hat\beta_j$ (holding constant the variance of the error term), and the more severe the effects of multicollinearity.

Remarks. 1) How high is high? As a common rule of thumb, if $\mathrm{VIF}(\hat\beta_j) > 5$ for some j, then the multicollinearity is severe.

2) As the number of regressors increases, it makes sense to raise this threshold (5) slightly.

3) In EViews we can calculate $\mathrm{VIF}(\hat\beta_j)$ after running the jth auxiliary regression (i.e., run $X_{ji} = \alpha_0 + \alpha_1 X_{1i} + \cdots + \alpha_{j-1} X_{j-1,i} + \alpha_{j+1} X_{j+1,i} + \cdots + \alpha_k X_{ki} + v_{ji}$ and name the equation eqj) by typing in the command window

scalar VIFj=1/(1-eqj.@R2)

Summary: there is no single test for multicollinearity.
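For readers working outside EViews, the same calculation can be sketched in Python with statsmodels, whose variance_inflation_factor helper implements the formula $1/(1 - R_j^2)$; the simulated design below is purely illustrative:

import numpy as np
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(2)
n = 100

x1 = rng.normal(size=n)
x2 = 0.95 * x1 + 0.3 * rng.normal(size=n)   # highly collinear with x1
x3 = rng.normal(size=n)                     # roughly unrelated to x1 and x2

X = sm.add_constant(np.column_stack([x1, x2, x3]))   # include the constant, as in (11)

# VIF_j = 1 / (1 - R_j^2), with R_j^2 from regressing X_j on the other regressors.
for j, name in enumerate(["x1", "x2", "x3"], start=1):
    print(name, variance_inflation_factor(X, j))
# x1 and x2 should show VIFs well above 5; x3 should be close to 1.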

6.4 Remedies for Multicollinearity

Once we are convinced that multicollinearity is present, what can we do about it? Just as the diagnosis of the ailment is not clear-cut, neither is the treatment. The appropriateness of the following remedial measures varies from one situation to another.

Example. Estimating the labour supply of married women from annual time-series data:

$Hours_t = \beta_0 + \beta_1 W_{w,t} + \beta_2 W_{m,t} + \varepsilon_t$,   (12)

where
$Hours_t$ = average annual hours of work of married women,
$W_{w,t}$ = average wage rate for married women,
$W_{m,t}$ = average wage rate for married men.

Suppose the regression output is

$\widehat{Hours}_t = \ldots + \ldots\, W_{w,t} - 22.91\, W_{m,t}$
                          (34.97)       (29.01)
n = 50,   $R^2 = \ldots$

Multicollinearity is a problem here. The t-ratios are less than 1.5 and 1, respectively (insignificant at the 10% level), yet $R^2$ is high. It is easy to confirm multicollinearity in this case: the correlation between the two wage rates is as high as 0.99 over our sample period! The standard errors blow up, and we cannot separate the two wage effects on the labour supply of married women.

Possible solutions?

1. A Priori Information

If we know the relationship between the slope coefficients, we can substitute this restriction into the regression and eliminate the multicollinearity. This relies heavily on economic theory.

Example. Suppose we use time-series data to estimate the Cobb-Douglas production function, i.e., the elasticities of output ($Y$) with respect to capital ($K$) and labor ($L$). We may have a multicollinearity problem because, as time evolves, both $K$ and $L$ increase and can be highly correlated. Suppose that we have constant returns to scale in the Cobb-Douglas production function

$Y_t = A K_t^{\beta_1} L_t^{\beta_2} e^{\varepsilon_t}$   ($\beta_1 + \beta_2 = 1$).

We can impose the restriction $\beta_1 + \beta_2 = 1$ in the regression

$\ln Y_t = \beta_0 + \beta_1 \ln K_t + \beta_2 \ln L_t + \varepsilon_t$

by plugging $\beta_2 = 1 - \beta_1$ into the above equation to obtain

$\ln Y_t = \beta_0 + \beta_1 \ln K_t + (1 - \beta_1) \ln L_t + \varepsilon_t$
$\ln Y_t - \ln L_t = \beta_0 + \beta_1 (\ln K_t - \ln L_t) + \varepsilon_t$
$\ln(Y_t / L_t) = \beta_0 + \beta_1 \ln(K_t / L_t) + \varepsilon_t$.

That is, we can estimate $\beta_1$ by regressing $\ln(Y_t/L_t)$ on a constant and $\ln(K_t/L_t)$. After we obtain the estimate $\hat\beta_1$ of $\beta_1$, we can obtain the estimate of $\beta_2$ as $\hat\beta_2 = 1 - \hat\beta_1$.

Remarks. Unfortunately, such a priori information is extremely rare.
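A minimal Python sketch of this restricted regression on simulated Cobb-Douglas data (the parameter values 0.3 and 0.7, the trend rates, and the noise levels are all made up):

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
T = 60

# Simulated data with beta1 = 0.3, beta2 = 0.7 (constant returns to scale);
# ln K and ln L trend together, so the unrestricted regression is highly collinear.
t = np.arange(T)
lnK = 0.020 * t + rng.normal(scale=0.05, size=T)
lnL = 0.018 * t + rng.normal(scale=0.05, size=T)
lnY = 0.5 + 0.3 * lnK + 0.7 * lnL + rng.normal(scale=0.02, size=T)

# Impose beta1 + beta2 = 1: regress ln(Y/L) on a constant and ln(K/L).
res = sm.OLS(lnY - lnL, sm.add_constant(lnK - lnL)).fit()

beta1_hat = res.params[1]
beta2_hat = 1.0 - beta1_hat
print(beta1_hat, beta2_hat)   # should be close to 0.3 and 0.7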

2. Dropping a Variable

In the example of the labour supply of married women, suppose we omit the wage of married men and estimate the following model:

$Hours_t = \alpha_0 + \alpha_1 W_{w,t} + v_t$.   (13)

In this example it seems natural to drop the variable $W_{m,t}$. In other cases it may make no statistical difference which variable is dropped; one has to rely on the theoretical underpinnings of the model or on common sense.

A cautionary note. Sometimes we have to be careful when considering dropping a variable in the case of multicollinearity. If a variable belongs in the regression but we have dropped it, then we encounter the problem of omitted variable bias, so we are substituting one problem for another; the remedy may be worse than the disease. Suppose that $W_{m,t}$ should appear in (12). Then the OLS estimator $\hat\alpha_1$ in (13) is likely to be biased for $\beta_1$:

$E(\hat\alpha_1) = \beta_1 + \beta_2 b_{12}$,

where $b_{12}$ is associated with the correlation between $W_{w,t}$ and $W_{m,t}$.

3. Transformation of the Variables

One of the simplest things to do with time-series regressions is to run the regression on the first-differenced data. Start with the original specification at time t:

$Hours_t = \beta_0 + \beta_1 W_{w,t} + \beta_2 W_{m,t} + \varepsilon_t$.   (14)

The same linear relationship holds for the previous period (t-1) as well:

$Hours_{t-1} = \beta_0 + \beta_1 W_{w,t-1} + \beta_2 W_{m,t-1} + \varepsilon_{t-1}$.   (15)

Subtracting (15) from (14) yields

$Hours_t - Hours_{t-1} = \beta_1 (W_{w,t} - W_{w,t-1}) + \beta_2 (W_{m,t} - W_{m,t-1}) + (\varepsilon_t - \varepsilon_{t-1})$,   (16)

or

$\Delta Hours_t = \beta_1 \Delta W_{w,t} + \beta_2 \Delta W_{m,t} + \Delta\varepsilon_t$,   (17)

where, e.g., $\Delta Hours_t = Hours_t - Hours_{t-1}$. The advantage is that changes in wage rates may not be as highly correlated as their levels.
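A short Python sketch of the first-difference transformation (pandas; the series and their values are hypothetical):

import pandas as pd

# Hypothetical annual series; replace with the actual data.
df = pd.DataFrame({
    "hours":  [1510.0, 1502.0, 1495.0, 1499.0, 1488.0, 1480.0],
    "wage_w": [10.1, 10.4, 10.9, 11.5, 12.0, 12.8],
    "wage_m": [12.0, 12.5, 13.1, 13.8, 14.5, 15.4],
})

d = df.diff().dropna()   # Delta x_t = x_t - x_{t-1}; the first observation is lost

print(df[["wage_w", "wage_m"]].corr().iloc[0, 1])   # correlation of the levels
print(d[["wage_w", "wage_m"]].corr().iloc[0, 1])    # correlation of the changes, typically lower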

The disadvantages are:

(i) The number of observations is reduced (i.e., one degree of freedom is lost), since the sample period starts one period later.

(ii) It may introduce serial correlation. Even if the $\varepsilon_t$ are uncorrelated, the $\Delta\varepsilon_t$ are not, because

$\mathrm{Cov}(\Delta\varepsilon_t, \Delta\varepsilon_{t-1}) = \mathrm{Cov}(\varepsilon_t - \varepsilon_{t-1}, \varepsilon_{t-1} - \varepsilon_{t-2})$
$= \mathrm{Cov}(\varepsilon_t, \varepsilon_{t-1}) - \mathrm{Cov}(\varepsilon_t, \varepsilon_{t-2}) - \mathrm{Cov}(\varepsilon_{t-1}, \varepsilon_{t-1}) + \mathrm{Cov}(\varepsilon_{t-1}, \varepsilon_{t-2})$
$= 0 - 0 - \mathrm{Var}(\varepsilon_{t-1}) + 0 = -\mathrm{Var}(\varepsilon_{t-1}) \neq 0$.

Again, the cure may be worse than the disease: it violates one of the classical assumptions, and new problems need to be addressed (in a later topic).

4. Get More Data

Two possibilities here:

(1) Extend the data set. Multicollinearity is a sample phenomenon. The wage rates may be correlated over the original sample period; adding more years (going further back in time) may reduce the correlation. The problem is that more data may not be available, or the relationship among the variables may have changed (i.e., the regression function is not stable over time). It is more likely that the data are simply not there, and if they are there, why not include them initially?

(2) Change the nature or source of the data. If possible, we can switch from time-series to cross-sectional analysis or to panel data analysis. The sample correlation in cross-sectional data is usually different from that in time-series data. The use of panel data potentially reduces the multicollinearity in the total sample. For example, we can use a random sample of many households at a point in time; the degree of multicollinearity in wages may be relatively lower between spouses. Or we can use a random sample of households over a number of years.

5. Do Nothing (A Remedy!)

Multicollinearity is not a problem if the objective of the analysis is forecasting, since it does not affect the overall explanatory power of the regression (i.e., $R^2$). It is a problem if the objective is to test the significance of individual coefficients, because of the inflated variances/standard errors.

Multicollinearity is often given too much emphasis in the list of common problems with regression analysis. If it is imperfect multicollinearity, which is almost always the case, then it does not violate the classical assumptions.

Exercise: Q8.11

Questions for Discussion: Example 8.5.2

Example 8.5.2: Does the Pope's 1966 decision to allow Catholics to eat meat on non-Lent Fridays cause a shift in the demand function for fish? Consider the regression

$F_t = \beta_0 + \beta_1 PF_t + \beta_2 PB_t + \beta_3 \ln Yd_t + \beta_4 N_t + \beta_5 P_t + \varepsilon_t$,

where
$F_t$: average pounds of fish consumed per capita in year t,
$PF_t$: price index for fish in year t,
$PB_t$: price index for beef in year t,
$Yd_t$: real per capita disposable income in year t (in billions of dollars),
$N_t$: the number of Catholics in the US in year t (tens of thousands),
$P_t$: equal to 1 after the Pope's 1966 decision and 0 otherwise.

Question 1: State the null and alternative hypotheses to test whether the Pope's decision plays a negative role in the consumption of fish.

Question 2: Some economic theory suggests that as income rises, the portion of

that extra income devoted to the consumption of fish will decrease. Is the choice of a semilog function to relate disposable income to the consumption of fish consistent with this theory?

Question 3: Suppose the regression output is

$\hat{F}_t = \ldots + \ldots\, PF_t + \ldots\, PB_t + \ldots\, \ln Yd_t + \ldots\, N_t + \ldots\, P_t$
                 (0.031)      (0.0202)     (1.87)           (…)          (0.353)
t:               [1.27]       […]          [0.945]          [-0.958]     [-1.01]
$R^2 = 0.736$,   $\bar{R}^2 = 0.666$,   n = 25.

Evaluate the above regression results.

Question 4: Are there any signs of multicollinearity in the above regression model? How do you check for this using simple correlation coefficients? What is the drawback of this approach? [Hint: To detect multicollinearity with simple correlation coefficients in EViews, after you run the regression, select Procs/Make Regressor group on the equation window menu bar, then select View/Correlation/Common Sample on the group object menu bar.]

Question 5: How do you check for the presence of multicollinearity by using the VIF? Verify that the VIFs for $PF_t$ and $\ln Yd_t$ are about 43.4 and 23.3, respectively. What does this suggest to us?

Question 6: Given the high correlation between $\ln Yd_t$ and $N_t$, it is reasonable to drop one of them. Given that the logic behind including the number of Catholics in a per capita fish consumption equation is fairly weak, we can decide to drop $N_t$:

$\hat{F}_t = \ldots + \ldots\, PF_t + \ldots\, PB_t + \ldots\, \ln Yd_t + \ldots\, P_t$
                 (0.03)       (0.019)      (1.15)           (0.26)
t:               [0.98]       [0.24]       [0.31]           [-0.48]
$R^2 = 0.723$,   $\bar{R}^2 = 0.667$,   n = 25.

Does this solve the problem?

Question 7: In the case of prices, both $PF_t$ and $PB_t$ are theoretically important, so it is not advisable to drop either one. As an alternative, the textbook author suggests using $RP_t = PF_t / PB_t$ to replace both price variables. Does it make any

sense to do so? If so, what is the expected sign of the coefficient on $RP_t$? The regression output now becomes

$\hat{F}_t = \ldots + \ldots\, RP_t + \ldots\, \ln Yd_t + \ldots\, P_t$
                 (1.43)       (0.66)          (0.281)
t:               [-1.35]      [4.13]          [0.019]
$R^2 = 0.640$,   $\bar{R}^2 = 0.588$,   n = 25.

Question 8: Based on the last regression output, can you reject the null hypothesis in Question 1?

Remark. To calculate $\mathrm{VIF}(\hat\beta_j)$ ($j = 1, \ldots, k$) in EViews:

Step 1: Run the regression of $X_{ji}$ on $(1, X_{1i}, \ldots, X_{j-1,i}, X_{j+1,i}, \ldots, X_{ki})$ and name the equation eqj, for example.

Step 2: In the command window type scalar vifj=1/(1-eqj.@r2) or genr vifj=1/(1-eqj.@r2). The former generates a scalar value for vifj, whereas the latter generates a series of values for vifj.
