
Part III: Multiple Regression Analysis (as of Sep 26, 2017)

Contents
  Estimation
    Matrix form
  Goodness-of-Fit
    R-square
    Adjusted R-square
  Expected values of the OLS estimators
    Irrelevant variables in a regression
    Omitted variable bias
  Multicollinearity
  Variances of misspecified models
  Estimating the error variance σ²_u
  Standard error of β̂_k
  The Gauss-Markov Theorem

The general linear regression with k explanatory variables is just an extension of the simple regression:

  y = β_0 + β_1 x_1 + ⋯ + β_k x_k + u.    (1)

Because

  ∂y/∂x_j = β_j,   j = 1, …, k,    (2)

coefficient β_j indicates the marginal effect of variable x_j: the amount y is expected to change as x_j changes by one unit while the other variables are held constant (ceteris paribus).

Multiple regression opens up several additional options for enriching the analysis and making the modeling more realistic compared to simple regression.

Example 3.1: Consider the consumption function C = f(Y), where Y is income. Suppose the hypothesis is that as income grows the marginal propensity to consume decreases. In simple regression we could try to fit a level-log model or a log-log model. One possibility is to let the slope depend on income,

  β_1 = β_1l + β_1q Y,

where according to our hypothesis β_1q < 0. The consumption function then becomes

  C = β_0 + (β_1l + β_1q Y)Y + u = β_0 + β_1l Y + β_1q Y² + u.

This is a multiple regression model with x_1 = Y and x_2 = Y².
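The quadratic specification can be fitted as an ordinary multiple regression. A minimal NumPy sketch with simulated income data; the coefficient values 5.0, 0.8 and −0.002 are made up purely for illustration:

```python
import numpy as np

# Simulated, purely illustrative data: true consumption function
# C = 5 + 0.8*Y - 0.002*Y^2 + u, so the marginal propensity to
# consume (0.8 - 0.004*Y) decreases with income.
rng = np.random.default_rng(0)
Y = rng.uniform(10.0, 100.0, size=200)
C = 5.0 + 0.8 * Y - 0.002 * Y**2 + rng.normal(0.0, 1.0, size=200)

# Treat x1 = Y and x2 = Y^2 as the two regressors of a multiple regression.
X = np.column_stack([np.ones_like(Y), Y, Y**2])
b0, b1l, b1q = np.linalg.lstsq(X, C, rcond=None)[0]

# Estimated marginal propensity to consume at income level Y = 50
mpc_50 = b1l + 2.0 * b1q * 50.0
```

With data of this size the estimate b1q should come out negative, in line with the hypothesis β_1q < 0.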

This simple example demonstrates that we can meaningfully enrich simple regression analysis (even though we have essentially only two variables, C and Y) and at the same time obtain a meaningful interpretation of the above polynomial model. Technically, considering the simple regression C = β_0 + β_1 Y + v, the extension C = β_0 + β_1l Y + β_1q Y² + u means that we have extracted the quadratic term Y² out of the error term v of the simple regression.

Estimation

In order to estimate the model we replace classical Assumption 4 by the following:

Assumption 4′: The explanatory variables are linearly independent in the mathematical sense; that is, no column of the data matrix X (see below) is a linear combination of the other columns.

The key assumption (Assumption 5) in terms of the conditional expectation becomes

  E[u | x_1, …, x_k] = 0.    (3)

Given observations (y_i, x_i1, …, x_ik), i = 1, …, n (n = the number of observations), the estimation method is again OLS, which produces estimates β̂_0, β̂_1, …, β̂_k by minimizing

  Σ_{i=1}^n (y_i − β_0 − β_1 x_i1 − ⋯ − β_k x_ik)²    (4)

with respect to the parameters. Again the first-order conditions set the (k + 1) partial derivatives equal to zero. The solution is straightforward, although the explicit form of the estimators becomes complicated.

Matrix form

Using matrix algebra simplifies the notation in multiple regression considerably. Denote the observation vector on y as

  y = (y_1, …, y_n)′,    (5)

where the prime denotes transposition. In the same manner denote the data matrix on the x-variables, enhanced with ones in the first column, as the n × (k + 1) matrix

      [ 1  x_11  x_12  ⋯  x_1k ]
  X = [ 1  x_21  x_22  ⋯  x_2k ]    (6)
      [ ⋮   ⋮     ⋮         ⋮  ]
      [ 1  x_n1  x_n2  ⋯  x_nk ]

where k < n (the number of observations, n, is larger than the number of x-variables, k).

Then we can present the whole set of regression equations for the sample,

  y_1 = β_0 + β_1 x_11 + ⋯ + β_k x_1k + u_1
  y_2 = β_0 + β_1 x_21 + ⋯ + β_k x_2k + u_2
   ⋮
  y_n = β_0 + β_1 x_n1 + ⋯ + β_k x_nk + u_n,    (7)

in matrix form as

  [ y_1 ]   [ 1  x_11  x_12  ⋯  x_1k ] [ β_0 ]   [ u_1 ]
  [ y_2 ] = [ 1  x_21  x_22  ⋯  x_2k ] [ β_1 ] + [ u_2 ]    (8)
  [  ⋮  ]   [ ⋮   ⋮     ⋮         ⋮  ] [  ⋮  ]   [  ⋮  ]
  [ y_n ]   [ 1  x_n1  x_n2  ⋯  x_nk ] [ β_k ]   [ u_n ]

or shortly

  y = Xβ + u,    (9)

where β = (β_0, β_1, …, β_k)′.

The normal equations for the first-order conditions of the minimization of (4) are in matrix form simply

  X′X β̂ = X′y,    (10)

which gives the explicit solution for the OLS estimator of β as

  β̂ = (X′X)⁻¹X′y,    (11)

where β̂ = (β̂_0, β̂_1, …, β̂_k)′. The fitted model is

  ŷ_i = β̂_0 + β̂_1 x_i1 + ⋯ + β̂_k x_ik,   i = 1, …, n.    (12)

Remark 3.1: If you fit the simple regression ỹ = β̃_0 + β̃_1 x_1, where β̃_0 and β̃_1 are the OLS estimators, and also fit the multiple regression ŷ = β̂_0 + β̂_1 x_1 + β̂_2 x_2, then generally β̃_1 ≠ β̂_1 unless β̂_2 = 0, or x_1 and x_2 are uncorrelated.
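As a minimal numerical sketch (data entirely hypothetical), the normal equations (10) can be solved directly with NumPy; solving the linear system is numerically preferable to forming the inverse in (11) explicitly:

```python
import numpy as np

# Hypothetical small data set: n = 5 observations, k = 2 regressors,
# with a constant column of ones in the first column of X.
X = np.array([
    [1.0, 1.0, 2.0],
    [1.0, 2.0, 1.0],
    [1.0, 3.0, 4.0],
    [1.0, 4.0, 3.0],
    [1.0, 5.0, 6.0],
])
y = np.array([3.0, 4.0, 8.0, 9.0, 14.0])

# Solve the normal equations X'X beta = X'y, i.e. equation (10).
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Fitted values (12) and residuals
y_hat = X @ beta_hat
u_hat = y - y_hat
```

A defining property of the OLS solution is that the residuals are orthogonal to every column of X, i.e. X′û = 0.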

Example 3.2: Consider the hourly wage example and enhance the model as

  log(w) = β_0 + β_1 x_1 + β_2 x_2 + β_3 x_3,    (13)

where w = average hourly earnings, x_1 = years of education (educ), x_2 = years of labor market experience (exper), and x_3 = years with the current employer (tenure).

[EViews estimation output. Dependent Variable: LOG(WAGE); Method: Least Squares; Included observations: 526. The coefficient estimates, standard errors, t-statistics and summary statistics were not recovered in the transcription.]

Goodness-of-Fit

Again, in the same manner as with simple regression, we have

  Σ_{i=1}^n (y_i − ȳ)² = Σ_{i=1}^n (ŷ_i − ȳ)² + Σ_{i=1}^n (y_i − ŷ_i)²,    (14)

or

  SST = SSE + SSR,    (15)

where

  ŷ_i = β̂_0 + β̂_1 x_i1 + ⋯ + β̂_k x_ik.    (16)

R-square

  R² = SSE/SST = 1 − SSR/SST.    (17)

Again, as in the case of simple regression, R² can be shown to be the squared correlation coefficient between the actual y_i and the fitted ŷ_i. This correlation is called the multiple correlation:

  R = Σ(y_i − ȳ)(ŷ_i − ŷ̄) / √( Σ(y_i − ȳ)² Σ(ŷ_i − ŷ̄)² ).    (18)

Remark 3.2: ŷ̄ = ȳ.

Remark 3.3: R² never decreases when an explanatory variable is added to the model.
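The identity between (17) and the squared multiple correlation (18) can be checked numerically. A sketch with simulated data (all numbers hypothetical):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 - x2 + rng.normal(size=n)

X = np.column_stack([np.ones(n), x1, x2])
y_hat = X @ np.linalg.lstsq(X, y, rcond=None)[0]

sst = np.sum((y - y.mean()) ** 2)       # total sum of squares
ssr = np.sum((y - y_hat) ** 2)          # residual sum of squares
sse = np.sum((y_hat - y.mean()) ** 2)   # explained sum of squares

r2 = 1.0 - ssr / sst                      # R-square as in (17)
multiple_R = np.corrcoef(y, y_hat)[0, 1]  # multiple correlation (18)
```

With an intercept in the model, SST = SSE + SSR holds exactly and R² equals the squared multiple correlation.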

Adjusted R-square

The adjusted R-square is defined as

  R̄² = 1 − s²_u / s²_y,    (19)

where

  s²_u = (1/(n − k − 1)) Σ_{i=1}^n (y_i − ŷ_i)² = (1/(n − k − 1)) Σ_{i=1}^n û_i²    (20)

is an estimator of the error variance σ²_u = var[u_i], and s²_y = (1/(n − 1)) Σ_{i=1}^n (y_i − ȳ)² is the sample variance of y. Unlike R², the adjusted R-square penalizes added regressors through the degrees-of-freedom correction.

Expected values of the OLS estimators

Under the classical Assumptions 1–5 (with Assumption 5 updated for multiple regression) we can show that the estimators of the regression coefficients are unbiased. That is:

Theorem 3.1: Given observations on the x-variables,

  E[β̂_j] = β_j,   j = 0, 1, …, k.    (21)

Using matrix notation, the proof of Theorem 3.1 is pretty straightforward. Write

  β̂ = (X′X)⁻¹X′y
     = (X′X)⁻¹X′(Xβ + u)
     = (X′X)⁻¹X′Xβ + (X′X)⁻¹X′u    (22)
     = β + (X′X)⁻¹X′u.

Given X, the expected value of β̂ is

  E[β̂] = β + (X′X)⁻¹X′E[u] = β,    (23)

because by Assumption 1 E[u] = 0. Thus the OLS estimators of the regression coefficients are unbiased.

Remark 3.4: If z = (z_1, …, z_n)′ is a random vector, then E[z] = (E[z_1], …, E[z_n])′. That is, the expected value of a vector is a vector whose components are the individual expected values.

Irrelevant variables in a regression

Suppose the correct model is

  y = β_0 + β_1 x_1 + u,    (24)

but we estimate the model as

  y = β_0 + β_1 x_1 + β_2 x_2 + u.    (25)

Thus β_2 = 0 in reality. The OLS estimation yields

  ŷ = β̂_0 + β̂_1 x_1 + β̂_2 x_2.    (26)

By Theorem 3.1 E[β̂_j] = β_j, thus in particular E[β̂_2] = β_2 = 0, implying that the inclusion of extra variables in a regression does not bias the results. However, as will be seen later, it decreases the accuracy of estimation by increasing the variance of the OLS estimates.
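A small Monte Carlo sketch of this point (simulated data, illustrative parameter values): the estimated coefficient of an irrelevant regressor is centered at its true value zero, so including it introduces no bias.

```python
import numpy as np

rng = np.random.default_rng(7)
n, reps = 100, 400
b2_draws = np.empty(reps)
for r in range(reps):
    x1 = rng.normal(size=n)
    x2 = 0.5 * x1 + rng.normal(size=n)        # correlated with x1, but irrelevant
    y = 2.0 + 1.5 * x1 + rng.normal(size=n)   # true model: x2 does not enter
    X = np.column_stack([np.ones(n), x1, x2])
    b2_draws[r] = np.linalg.lstsq(X, y, rcond=None)[0][2]

mean_b2 = b2_draws.mean()   # centered at the true value beta_2 = 0
```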

Omitted variable bias

Suppose now, as an example, that the correct model is

  y = β_0 + β_1 x_1 + β_2 x_2 + u,    (27)

but we misspecify the model as

  y = β_0 + β_1 x_1 + v,    (28)

where the omitted variable is embedded in the error term, v = β_2 x_2 + u. The OLS estimator of β_1 in specification (28) is

  β̃_1 = Σ(x_i1 − x̄_1) y_i / Σ(x_i1 − x̄_1)².    (29)

From Equation (2.38) we have

  β̃_1 = β_1 + Σ_{i=1}^n a_i v_i,    (30)

where

  a_i = (x_i1 − x̄_1) / Σ(x_i1 − x̄_1)².    (31)

Thus, because E[v_i] = E[β_2 x_i2 + u_i] = β_2 x_i2,

  E[β̃_1] = β_1 + Σ a_i E[v_i] = β_1 + β_2 Σ a_i x_i2,    (32)

i.e.,

  E[β̃_1] = β_1 + β_2 · Σ(x_i1 − x̄_1)x_i2 / Σ(x_i1 − x̄_1)²,    (33)

implying that β̃_1 is biased for β_1 unless x_1 and x_2 are uncorrelated (or β_2 = 0). This is called the omitted variable bias.

The direction of the omitted variable bias is as follows:

            corr[x_1, x_2] > 0   corr[x_1, x_2] < 0
  β_2 > 0   positive bias        negative bias
  β_2 < 0   negative bias        positive bias
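A Monte Carlo sketch of the omitted variable bias (simulated data; the coefficients and the correlation structure are made up). With β_2 > 0 and corr[x_1, x_2] > 0 the bias is positive, here roughly β_2 · cov(x_1, x_2)/var(x_1) = 2 · 0.7 = 1.4:

```python
import numpy as np

rng = np.random.default_rng(2)
n, reps = 200, 500
beta1, beta2 = 1.0, 2.0
short_slopes = np.empty(reps)
for r in range(reps):
    x1 = rng.normal(size=n)
    x2 = 0.7 * x1 + rng.normal(size=n)          # corr(x1, x2) > 0
    y = beta1 * x1 + beta2 * x2 + rng.normal(size=n)
    # short regression of y on x1 only (with intercept): slope = cov/var
    short_slopes[r] = np.cov(x1, y)[0, 1] / np.var(x1, ddof=1)

bias = short_slopes.mean() - beta1   # positive, close to beta2 * 0.7 = 1.4
```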

Multicollinearity

Write the regression model

  y_i = β_0 + β_1 x_i1 + ⋯ + β_k x_ik + u_i,   i = 1, …, n,    (34)

in matrix form as

  y = Xβ + u.    (35)

Then we can write the OLS estimators compactly as

  β̂ = (X′X)⁻¹X′y.    (36)

Under the classical Assumptions 1–5, and assuming X fixed, we can show that the variance-covariance matrix of β̂ is

  cov[β̂] = σ²_u (X′X)⁻¹.    (37)

The variances of the individual coefficients are obtained from the main diagonal of this matrix and can be shown to be of the form

  var[β̂_j] = σ²_u / ( (1 − R²_j) Σ_{i=1}^n (x_ij − x̄_j)² ),    (38)

j = 1, …, k, where R²_j is the R-square from regressing x_j on the other explanatory variables and the constant term.
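Formula (38) can be verified against the diagonal of (37) numerically. A sketch with simulated regressors and a hypothetical error variance σ²_u = 1:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)
X = np.column_stack([np.ones(n), x1, x2])
sigma2_u = 1.0                                  # hypothetical error variance

cov_beta = sigma2_u * np.linalg.inv(X.T @ X)    # covariance matrix (37)

# R_1^2: the R-square from regressing x1 on the constant and x2
Z = np.column_stack([np.ones(n), x2])
x1_fit = Z @ np.linalg.lstsq(Z, x1, rcond=None)[0]
r2_1 = 1.0 - np.sum((x1 - x1_fit) ** 2) / np.sum((x1 - x1.mean()) ** 2)

# var(beta_1) via formula (38)
var_b1 = sigma2_u / ((1.0 - r2_1) * np.sum((x1 - x1.mean()) ** 2))
```

The two numbers agree exactly: (38) is the (j, j) diagonal element of (37) rewritten.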


In terms of linear algebra, we say that vectors x_1, x_2, …, x_k are linearly independent if

  a_1 x_1 + ⋯ + a_k x_k = 0

holds only if a_1 = ⋯ = a_k = 0. Otherwise x_1, …, x_k are linearly dependent. In such a case some a_l ≠ 0 and we can write

  x_l = c_1 x_1 + ⋯ + c_{l−1} x_{l−1} + c_{l+1} x_{l+1} + ⋯ + c_k x_k,

where c_j = −a_j/a_l; that is, x_l can be represented as a linear combination of the other variables.

In statistics, the multiple correlation measures the degree of linear dependence. If the variables are perfectly linearly dependent, that is, if for example x_j is a linear combination of the other variables, then the multiple correlation R_j = 1. Perfect linear dependence is rare between random variables. However, dependencies are often high, particularly between macroeconomic variables.

From the variance equation (38) we see that var[β̂_j] → ∞ as R²_j → 1. That is, the more linearly dependent the explanatory variables are, the larger the variance becomes, and the coefficient estimates become increasingly unstable. High (but not perfect) correlation between two or more explanatory variables is called multicollinearity.
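A diagnostic built directly on (38) is the variance inflation factor, VIF_j = 1/(1 − R²_j). A sketch with simulated data; the rule-of-thumb threshold of 10 is a common convention, not something stated in these notes:

```python
import numpy as np

def vif(X, j):
    """Variance inflation factor of column j of X (X without the constant)."""
    n = X.shape[0]
    others = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
    xj = X[:, j]
    fit = others @ np.linalg.lstsq(others, xj, rcond=None)[0]
    r2_j = 1.0 - np.sum((xj - fit) ** 2) / np.sum((xj - xj.mean()) ** 2)
    return 1.0 / (1.0 - r2_j)

rng = np.random.default_rng(8)
x1 = rng.normal(size=100)
x2 = x1 + 0.05 * rng.normal(size=100)   # nearly collinear with x1
x3 = rng.normal(size=100)               # unrelated to x1 and x2
X = np.column_stack([x1, x2, x3])
# vif(X, 0) is very large; vif(X, 2) stays near 1
```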

Symptoms of multicollinearity:
1. High correlations between explanatory variables.
2. R² is relatively high, but the coefficient estimates tend to be insignificant (see the section on hypothesis testing).
3. Some of the coefficients have the wrong sign, and some coefficients are at the same time unreasonably large.
4. Coefficient estimates change considerably from one model alternative to another.

Example 3.3: Variable E_t denotes maintenance expenditure in a sample of Toyota Mark II cars at time point t, M_t denotes the mileage, and A_t the age. Consider the model alternatives:

  Model A: E_t = α_0 + α_1 A_t + u_1t
  Model B: E_t = β_0 + β_1 M_t + u_2t
  Model C: E_t = γ_0 + γ_1 M_t + γ_2 A_t + u_3t

Estimation results (t-values in parentheses; the coefficient estimates, df, R², and σ̂ values were not recovered in the transcription):

  Variable    Model A    Model B    Model C
  Constant    (−5.98)    (−5.91)    (0.06)
  Age         (22.16)               (9.58)
  Miles                  (18.27)    (−7.06)

Findings: A priori, the coefficients α_1, β_1, γ_1, and γ_2 should be positive. However, γ̂_2 = … (!!?), whereas β̂_1 = … . The correlation r_{M,A} = 0.996!

Remedies: In the collinearity problem the issue is that there is not enough information to reliably identify each variable's contribution as an explanatory variable in the model. Thus, in order to alleviate the problem:
1. Use non-sample information, if available, to impose restrictions between the coefficients.
2. Increase the sample size if possible.
3. Drop the most collinear variables (on the basis of R²_j).
4. If a linear combination (usually a sum) of the most collinear variables is meaningful, replace the collinear variables by that linear combination.

Remark: Multicollinearity is not always harmful. If an explanatory variable is not correlated with the rest of the explanatory variables, multicollinearity among those other variables (provided it is not perfect) does not harm the variance of the slope coefficient estimator of the uncorrelated variable. If x_j is not correlated with any of the other predictors, then R²_j = 0, and as seen from equation (38), the factor (1 − R²_j) drops out of the variance var[β̂_j]. Another case where multicollinearity (again excluding the perfect case) is not a problem is when we are purely interested in the predictions of the model. Predictions are affected essentially only by the usefulness of the available information.

Exploring further (Wooldridge 5e, p. 93): Consider a model explaining the final exam score by class attendance (the number of classes attended). To control for student ability and effort outside the classroom, additional explanatory variables such as cumulative GPA, SAT score, and a measure of high school performance are included. Someone says: "You cannot learn much from this exercise because cumulative GPA, SAT score, and high school performance are likely to be highly collinear." What should your response be?

Variances of misspecified models

Consider again, as in (27), the regression model

  y = β_0 + β_1 x_1 + β_2 x_2 + u.    (39)

Suppose the following models are estimated by OLS:

  ŷ = β̂_0 + β̂_1 x_1 + β̂_2 x_2    (40)

and

  ỹ = β̃_0 + β̃_1 x_1.    (41)

Then

  var[β̂_1] = σ²_u / ( (1 − r²_12) Σ(x_i1 − x̄_1)² )    (42)

and

  var[β̃_1] = σ²_u / Σ(x_i1 − x̄_1)²,    (43)

where r_12 is the sample correlation between x_1 and x_2.

Thus var[β̃_1] ≤ var[β̂_1], and the inequality is strict if r_12 ≠ 0. In summary (assuming r_12 ≠ 0):
1. If β_2 ≠ 0, then β̃_1 is biased, β̂_1 is unbiased, and var[β̃_1] < var[β̂_1].
2. If β_2 = 0, then both β̃_1 and β̂_1 are unbiased, but var[β̃_1] < var[β̂_1].
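The variance comparison in (42) and (43) can be illustrated numerically; with simulated correlated regressors and a hypothetical σ²_u, the simple-regression variance is smaller exactly by the factor (1 − r²_12):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100
x1 = rng.normal(size=n)
x2 = 0.6 * x1 + rng.normal(size=n)   # correlated with x1
sigma2_u = 2.0                        # hypothetical error variance

sst1 = np.sum((x1 - x1.mean()) ** 2)
r12 = np.corrcoef(x1, x2)[0, 1]       # sample correlation of x1 and x2

var_tilde = sigma2_u / sst1                      # (43): y regressed on x1 only
var_hat = sigma2_u / ((1.0 - r12**2) * sst1)     # (42): y regressed on x1 and x2
```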

Estimating the error variance σ²_u

An unbiased estimator of the error variance var[u] = σ²_u is

  σ̂²_u = (1/(n − k − 1)) Σ_{i=1}^n û_i²,    (44)

where

  û_i = y_i − β̂_0 − β̂_1 x_i1 − ⋯ − β̂_k x_ik.    (45)

The term n − k − 1 in (44) is the degrees of freedom (df). It can be shown that

  E[σ̂²_u] = σ²_u,    (46)

i.e., σ̂²_u is an unbiased estimator of σ²_u. σ̂_u is called the standard error of the regression.
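A sketch of (44)–(45) with simulated data (true σ²_u = 1 by construction); the division by n − k − 1 rather than n is what makes the estimator unbiased:

```python
import numpy as np

rng = np.random.default_rng(5)
n, k = 200, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
beta = np.array([1.0, 0.5, -0.3])
y = X @ beta + rng.normal(size=n)        # true error variance sigma_u^2 = 1

b = np.linalg.lstsq(X, y, rcond=None)[0]
u_hat = y - X @ b                         # residuals, as in (45)
sigma2_hat = np.sum(u_hat ** 2) / (n - k - 1)   # (44), df = n - k - 1
sigma_hat = np.sqrt(sigma2_hat)           # standard error of the regression
```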

Standard error of β̂_j

The standard deviation of β̂_j is the square root of (38), i.e.,

  σ_β̂_j = √var[β̂_j] = σ_u / √( (1 − R²_j) Σ(x_ij − x̄_j)² ).    (47)

Substituting the estimate σ̂_u = √σ̂²_u for σ_u gives the standard error of β̂_j:

  se(β̂_j) = σ̂_u / √( (1 − R²_j) Σ(x_ij − x̄_j)² ).    (48)
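A numerical sketch of (48) with simulated data: the standard error computed from R²_j agrees with the square root of the corresponding diagonal element of σ̂²_u (X′X)⁻¹:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 80
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(size=n)              # correlated regressors
X = np.column_stack([np.ones(n), x1, x2])
y = 1.0 + x1 - x2 + rng.normal(size=n)

b = np.linalg.lstsq(X, y, rcond=None)[0]
sigma_hat = np.sqrt(np.sum((y - X @ b) ** 2) / (n - 3))   # df = n - k - 1, k = 2

# Standard errors from the estimated covariance matrix
se_all = sigma_hat * np.sqrt(np.diag(np.linalg.inv(X.T @ X)))

# se(beta_1) via (48), using R_1^2 from regressing x1 on the constant and x2
Z = np.column_stack([np.ones(n), x2])
resid = x1 - Z @ np.linalg.lstsq(Z, x1, rcond=None)[0]
r2_1 = 1.0 - np.sum(resid ** 2) / np.sum((x1 - x1.mean()) ** 2)
se_b1 = sigma_hat / np.sqrt((1.0 - r2_1) * np.sum((x1 - x1.mean()) ** 2))
```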

The Gauss-Markov Theorem

Theorem 3.2: Under the classical Assumptions 1–5, β̂_0, β̂_1, …, β̂_k are the best linear unbiased estimators (BLUEs) of β_0, β_1, …, β_k, respectively.

BLUE:
  Best: the variance of the OLS estimator is smallest among all linear unbiased estimators of β_j.
  Linear: β̂_j = Σ_{i=1}^n w_ij y_i.
  Unbiased: E[β̂_j] = β_j.


Outline. 2. Logarithmic Functional Form and Units of Measurement. Functional Form. I. Functional Form: log II. Units of Measurement Outline 2. Logarithmic Functional Form and Units of Measurement I. Functional Form: log II. Units of Measurement Read Wooldridge (2013), Chapter 2.4, 6.1 and 6.2 2 Functional Form I. Functional Form: log

More information

Motivation for multiple regression

Motivation for multiple regression Motivation for multiple regression 1. Simple regression puts all factors other than X in u, and treats them as unobserved. Effectively the simple regression does not account for other factors. 2. The slope

More information

Sample Problems. Note: If you find the following statements true, you should briefly prove them. If you find them false, you should correct them.

Sample Problems. Note: If you find the following statements true, you should briefly prove them. If you find them false, you should correct them. Sample Problems 1. True or False Note: If you find the following statements true, you should briefly prove them. If you find them false, you should correct them. (a) The sample average of estimated residuals

More information

Applied Statistics and Econometrics

Applied Statistics and Econometrics Applied Statistics and Econometrics Lecture 6 Saul Lach September 2017 Saul Lach () Applied Statistics and Econometrics September 2017 1 / 53 Outline of Lecture 6 1 Omitted variable bias (SW 6.1) 2 Multiple

More information

Making sense of Econometrics: Basics

Making sense of Econometrics: Basics Making sense of Econometrics: Basics Lecture 2: Simple Regression Egypt Scholars Economic Society Happy Eid Eid present! enter classroom at http://b.socrative.com/login/student/ room name c28efb78 Outline

More information

Wooldridge, Introductory Econometrics, 4th ed. Chapter 15: Instrumental variables and two stage least squares

Wooldridge, Introductory Econometrics, 4th ed. Chapter 15: Instrumental variables and two stage least squares Wooldridge, Introductory Econometrics, 4th ed. Chapter 15: Instrumental variables and two stage least squares Many economic models involve endogeneity: that is, a theoretical relationship does not fit

More information

Multiple Regression Analysis

Multiple Regression Analysis Chapter 4 Multiple Regression Analysis The simple linear regression covered in Chapter 2 can be generalized to include more than one variable. Multiple regression analysis is an extension of the simple

More information

Least Squares Estimation-Finite-Sample Properties

Least Squares Estimation-Finite-Sample Properties Least Squares Estimation-Finite-Sample Properties Ping Yu School of Economics and Finance The University of Hong Kong Ping Yu (HKU) Finite-Sample 1 / 29 Terminology and Assumptions 1 Terminology and Assumptions

More information

Review of Econometrics

Review of Econometrics Review of Econometrics Zheng Tian June 5th, 2017 1 The Essence of the OLS Estimation Multiple regression model involves the models as follows Y i = β 0 + β 1 X 1i + β 2 X 2i + + β k X ki + u i, i = 1,...,

More information

Econometrics Summary Algebraic and Statistical Preliminaries

Econometrics Summary Algebraic and Statistical Preliminaries Econometrics Summary Algebraic and Statistical Preliminaries Elasticity: The point elasticity of Y with respect to L is given by α = ( Y/ L)/(Y/L). The arc elasticity is given by ( Y/ L)/(Y/L), when L

More information

Econometric Methods. Prediction / Violation of A-Assumptions. Burcu Erdogan. Universität Trier WS 2011/2012

Econometric Methods. Prediction / Violation of A-Assumptions. Burcu Erdogan. Universität Trier WS 2011/2012 Econometric Methods Prediction / Violation of A-Assumptions Burcu Erdogan Universität Trier WS 2011/2012 (Universität Trier) Econometric Methods 30.11.2011 1 / 42 Moving on to... 1 Prediction 2 Violation

More information

11. Simultaneous-Equation Models

11. Simultaneous-Equation Models 11. Simultaneous-Equation Models Up to now: Estimation and inference in single-equation models Now: Modeling and estimation of a system of equations 328 Example: [I] Analysis of the impact of advertisement

More information

ECO220Y Simple Regression: Testing the Slope

ECO220Y Simple Regression: Testing the Slope ECO220Y Simple Regression: Testing the Slope Readings: Chapter 18 (Sections 18.3-18.5) Winter 2012 Lecture 19 (Winter 2012) Simple Regression Lecture 19 1 / 32 Simple Regression Model y i = β 0 + β 1 x

More information

Lecture 8. Using the CLR Model. Relation between patent applications and R&D spending. Variables

Lecture 8. Using the CLR Model. Relation between patent applications and R&D spending. Variables Lecture 8. Using the CLR Model Relation between patent applications and R&D spending Variables PATENTS = No. of patents (in 000) filed RDEP = Expenditure on research&development (in billions of 99 $) The

More information

Multiple Regression Analysis: Estimation. Simple linear regression model: an intercept and one explanatory variable (regressor)

Multiple Regression Analysis: Estimation. Simple linear regression model: an intercept and one explanatory variable (regressor) 1 Multiple Regression Analysis: Estimation Simple linear regression model: an intercept and one explanatory variable (regressor) Y i = β 0 + β 1 X i + u i, i = 1,2,, n Multiple linear regression model:

More information

Economics 471: Econometrics Department of Economics, Finance and Legal Studies University of Alabama

Economics 471: Econometrics Department of Economics, Finance and Legal Studies University of Alabama Economics 471: Econometrics Department of Economics, Finance and Legal Studies University of Alabama Course Packet The purpose of this packet is to show you one particular dataset and how it is used in

More information

Applied Econometrics (QEM)

Applied Econometrics (QEM) Applied Econometrics (QEM) based on Prinicples of Econometrics Jakub Mućk Department of Quantitative Economics Jakub Mućk Applied Econometrics (QEM) Meeting #3 1 / 42 Outline 1 2 3 t-test P-value Linear

More information

Eastern Mediterranean University Department of Economics ECON 503: ECONOMETRICS I. M. Balcilar. Midterm Exam Fall 2007, 11 December 2007.

Eastern Mediterranean University Department of Economics ECON 503: ECONOMETRICS I. M. Balcilar. Midterm Exam Fall 2007, 11 December 2007. Eastern Mediterranean University Department of Economics ECON 503: ECONOMETRICS I M. Balcilar Midterm Exam Fall 2007, 11 December 2007 Duration: 120 minutes Questions Q1. In order to estimate the demand

More information

LECTURE 10. Introduction to Econometrics. Multicollinearity & Heteroskedasticity

LECTURE 10. Introduction to Econometrics. Multicollinearity & Heteroskedasticity LECTURE 10 Introduction to Econometrics Multicollinearity & Heteroskedasticity November 22, 2016 1 / 23 ON PREVIOUS LECTURES We discussed the specification of a regression equation Specification consists

More information

1 Linear Regression Analysis The Mincer Wage Equation Data Econometric Model Estimation... 11

1 Linear Regression Analysis The Mincer Wage Equation Data Econometric Model Estimation... 11 Econ 495 - Econometric Review 1 Contents 1 Linear Regression Analysis 4 1.1 The Mincer Wage Equation................. 4 1.2 Data............................. 6 1.3 Econometric Model.....................

More information

Brief Suggested Solutions

Brief Suggested Solutions DEPARTMENT OF ECONOMICS UNIVERSITY OF VICTORIA ECONOMICS 366: ECONOMETRICS II SPRING TERM 5: ASSIGNMENT TWO Brief Suggested Solutions Question One: Consider the classical T-observation, K-regressor linear

More information

4. Nonlinear regression functions

4. Nonlinear regression functions 4. Nonlinear regression functions Up to now: Population regression function was assumed to be linear The slope(s) of the population regression function is (are) constant The effect on Y of a unit-change

More information

10. Time series regression and forecasting

10. Time series regression and forecasting 10. Time series regression and forecasting Key feature of this section: Analysis of data on a single entity observed at multiple points in time (time series data) Typical research questions: What is the

More information

Introduction to Statistical modeling: handout for Math 489/583

Introduction to Statistical modeling: handout for Math 489/583 Introduction to Statistical modeling: handout for Math 489/583 Statistical modeling occurs when we are trying to model some data using statistical tools. From the start, we recognize that no model is perfect

More information

Making sense of Econometrics: Basics

Making sense of Econometrics: Basics Making sense of Econometrics: Basics Lecture 7: Multicollinearity Egypt Scholars Economic Society November 22, 2014 Assignment & feedback Multicollinearity enter classroom at room name c28efb78 http://b.socrative.com/login/student/

More information

Business Statistics. Tommaso Proietti. Linear Regression. DEF - Università di Roma 'Tor Vergata'

Business Statistics. Tommaso Proietti. Linear Regression. DEF - Università di Roma 'Tor Vergata' Business Statistics Tommaso Proietti DEF - Università di Roma 'Tor Vergata' Linear Regression Specication Let Y be a univariate quantitative response variable. We model Y as follows: Y = f(x) + ε where

More information

Regression Models - Introduction

Regression Models - Introduction Regression Models - Introduction In regression models there are two types of variables that are studied: A dependent variable, Y, also called response variable. It is modeled as random. An independent

More information

Outline. 11. Time Series Analysis. Basic Regression. Differences between Time Series and Cross Section

Outline. 11. Time Series Analysis. Basic Regression. Differences between Time Series and Cross Section Outline I. The Nature of Time Series Data 11. Time Series Analysis II. Examples of Time Series Models IV. Functional Form, Dummy Variables, and Index Basic Regression Numbers Read Wooldridge (2013), Chapter

More information

Lecture 4: Multivariate Regression, Part 2

Lecture 4: Multivariate Regression, Part 2 Lecture 4: Multivariate Regression, Part 2 Gauss-Markov Assumptions 1) Linear in Parameters: Y X X X i 0 1 1 2 2 k k 2) Random Sampling: we have a random sample from the population that follows the above

More information

Lecture 4: Multivariate Regression, Part 2

Lecture 4: Multivariate Regression, Part 2 Lecture 4: Multivariate Regression, Part 2 Gauss-Markov Assumptions 1) Linear in Parameters: Y X X X i 0 1 1 2 2 k k 2) Random Sampling: we have a random sample from the population that follows the above

More information

Applied Quantitative Methods II

Applied Quantitative Methods II Applied Quantitative Methods II Lecture 4: OLS and Statistics revision Klára Kaĺıšková Klára Kaĺıšková AQM II - Lecture 4 VŠE, SS 2016/17 1 / 68 Outline 1 Econometric analysis Properties of an estimator

More information

In Chapter 2, we learned how to use simple regression analysis to explain a dependent

In Chapter 2, we learned how to use simple regression analysis to explain a dependent C h a p t e r Three Multiple Regression Analysis: Estimation In Chapter 2, we learned how to use simple regression analysis to explain a dependent variable, y, as a function of a single independent variable,

More information

Lecture 5: Omitted Variables, Dummy Variables and Multicollinearity

Lecture 5: Omitted Variables, Dummy Variables and Multicollinearity Lecture 5: Omitted Variables, Dummy Variables and Multicollinearity R.G. Pierse 1 Omitted Variables Suppose that the true model is Y i β 1 + β X i + β 3 X 3i + u i, i 1,, n (1.1) where β 3 0 but that the

More information

Applied Regression Analysis

Applied Regression Analysis Applied Regression Analysis Chapter 3 Multiple Linear Regression Hongcheng Li April, 6, 2013 Recall simple linear regression 1 Recall simple linear regression 2 Parameter Estimation 3 Interpretations of

More information

Regression Analysis: Basic Concepts

Regression Analysis: Basic Concepts The simple linear model Regression Analysis: Basic Concepts Allin Cottrell Represents the dependent variable, y i, as a linear function of one independent variable, x i, subject to a random disturbance

More information

Linear models. Linear models are computationally convenient and remain widely used in. applied econometric research

Linear models. Linear models are computationally convenient and remain widely used in. applied econometric research Linear models Linear models are computationally convenient and remain widely used in applied econometric research Our main focus in these lectures will be on single equation linear models of the form y

More information

ECON2228 Notes 2. Christopher F Baum. Boston College Economics. cfb (BC Econ) ECON2228 Notes / 47

ECON2228 Notes 2. Christopher F Baum. Boston College Economics. cfb (BC Econ) ECON2228 Notes / 47 ECON2228 Notes 2 Christopher F Baum Boston College Economics 2014 2015 cfb (BC Econ) ECON2228 Notes 2 2014 2015 1 / 47 Chapter 2: The simple regression model Most of this course will be concerned with

More information

13. Time Series Analysis: Asymptotics Weakly Dependent and Random Walk Process. Strict Exogeneity

13. Time Series Analysis: Asymptotics Weakly Dependent and Random Walk Process. Strict Exogeneity Outline: Further Issues in Using OLS with Time Series Data 13. Time Series Analysis: Asymptotics Weakly Dependent and Random Walk Process I. Stationary and Weakly Dependent Time Series III. Highly Persistent

More information

ECON3150/4150 Spring 2015

ECON3150/4150 Spring 2015 ECON3150/4150 Spring 2015 Lecture 3&4 - The linear regression model Siv-Elisabeth Skjelbred University of Oslo January 29, 2015 1 / 67 Chapter 4 in S&W Section 17.1 in S&W (extended OLS assumptions) 2

More information

Correlation and Regression

Correlation and Regression Correlation and Regression October 25, 2017 STAT 151 Class 9 Slide 1 Outline of Topics 1 Associations 2 Scatter plot 3 Correlation 4 Regression 5 Testing and estimation 6 Goodness-of-fit STAT 151 Class

More information

Multiple Regression Analysis: Further Issues

Multiple Regression Analysis: Further Issues Multiple Regression Analysis: Further Issues Ping Yu School of Economics and Finance The University of Hong Kong Ping Yu (HKU) MLR: Further Issues 1 / 36 Effects of Data Scaling on OLS Statistics Effects

More information

Intermediate Econometrics

Intermediate Econometrics Intermediate Econometrics Markus Haas LMU München Summer term 2011 15. Mai 2011 The Simple Linear Regression Model Considering variables x and y in a specific population (e.g., years of education and wage

More information

Econometrics Review questions for exam

Econometrics Review questions for exam Econometrics Review questions for exam Nathaniel Higgins nhiggins@jhu.edu, 1. Suppose you have a model: y = β 0 x 1 + u You propose the model above and then estimate the model using OLS to obtain: ŷ =

More information

Inference in Regression Analysis

Inference in Regression Analysis ECNS 561 Inference Inference in Regression Analysis Up to this point 1.) OLS is unbiased 2.) OLS is BLUE (best linear unbiased estimator i.e., the variance is smallest among linear unbiased estimators)

More information

Multiple Regression Analysis

Multiple Regression Analysis Multiple Regression Analysis y = 0 + 1 x 1 + x +... k x k + u 6. Heteroskedasticity What is Heteroskedasticity?! Recall the assumption of homoskedasticity implied that conditional on the explanatory variables,

More information

Final Review. Yang Feng. Yang Feng (Columbia University) Final Review 1 / 58

Final Review. Yang Feng.   Yang Feng (Columbia University) Final Review 1 / 58 Final Review Yang Feng http://www.stat.columbia.edu/~yangfeng Yang Feng (Columbia University) Final Review 1 / 58 Outline 1 Multiple Linear Regression (Estimation, Inference) 2 Special Topics for Multiple

More information

Econ 510 B. Brown Spring 2014 Final Exam Answers

Econ 510 B. Brown Spring 2014 Final Exam Answers Econ 510 B. Brown Spring 2014 Final Exam Answers Answer five of the following questions. You must answer question 7. The question are weighted equally. You have 2.5 hours. You may use a calculator. Brevity

More information

Topic 4: Model Specifications

Topic 4: Model Specifications Topic 4: Model Specifications Advanced Econometrics (I) Dong Chen School of Economics, Peking University 1 Functional Forms 1.1 Redefining Variables Change the unit of measurement of the variables will

More information

In Chapter 2, we learned how to use simple regression analysis to explain a dependent

In Chapter 2, we learned how to use simple regression analysis to explain a dependent 3 Multiple Regression Analysis: Estimation In Chapter 2, we learned how to use simple regression analysis to explain a dependent variable, y, as a function of a single independent variable, x. The primary

More information

Basic econometrics. Tutorial 3. Dipl.Kfm. Johannes Metzler

Basic econometrics. Tutorial 3. Dipl.Kfm. Johannes Metzler Basic econometrics Tutorial 3 Dipl.Kfm. Introduction Some of you were asking about material to revise/prepare econometrics fundamentals. First of all, be aware that I will not be too technical, only as

More information

Økonomisk Kandidateksamen 2004 (I) Econometrics 2. Rettevejledning

Økonomisk Kandidateksamen 2004 (I) Econometrics 2. Rettevejledning Økonomisk Kandidateksamen 2004 (I) Econometrics 2 Rettevejledning This is a closed-book exam (uden hjælpemidler). Answer all questions! The group of questions 1 to 4 have equal weight. Within each group,

More information

Econometrics Multiple Regression Analysis: Heteroskedasticity

Econometrics Multiple Regression Analysis: Heteroskedasticity Econometrics Multiple Regression Analysis: João Valle e Azevedo Faculdade de Economia Universidade Nova de Lisboa Spring Semester João Valle e Azevedo (FEUNL) Econometrics Lisbon, April 2011 1 / 19 Properties

More information

ECON3150/4150 Spring 2016

ECON3150/4150 Spring 2016 ECON3150/4150 Spring 2016 Lecture 6 Multiple regression model Siv-Elisabeth Skjelbred University of Oslo February 5th Last updated: February 3, 2016 1 / 49 Outline Multiple linear regression model and

More information

ECON3150/4150 Spring 2016

ECON3150/4150 Spring 2016 ECON3150/4150 Spring 2016 Lecture 4 - The linear regression model Siv-Elisabeth Skjelbred University of Oslo Last updated: January 26, 2016 1 / 49 Overview These lecture slides covers: The linear regression

More information

7. Prediction. Outline: Read Section 6.4. Mean Prediction

7. Prediction. Outline: Read Section 6.4. Mean Prediction Outline: Read Section 6.4 II. Individual Prediction IV. Choose between y Model and log(y) Model 7. Prediction Read Wooldridge (2013), Chapter 6.4 2 Mean Prediction Predictions are useful But they are subject

More information

ECON Program Evaluation, Binary Dependent Variable, Misc.

ECON Program Evaluation, Binary Dependent Variable, Misc. ECON 351 - Program Evaluation, Binary Dependent Variable, Misc. Maggie Jones () 1 / 17 Readings Chapter 13: Section 13.2 on difference in differences Chapter 7: Section on binary dependent variables Chapter

More information

Wooldridge, Introductory Econometrics, 4th ed. Chapter 2: The simple regression model

Wooldridge, Introductory Econometrics, 4th ed. Chapter 2: The simple regression model Wooldridge, Introductory Econometrics, 4th ed. Chapter 2: The simple regression model Most of this course will be concerned with use of a regression model: a structure in which one or more explanatory

More information