Economics 471: Econometrics
Department of Economics, Finance and Legal Studies, University of Alabama
Course Packet

The purpose of this packet is to show you one particular dataset and how it is used in practice with the methods taught in lecture. The data used in the packet, along with a brief description of the variables, are available on the course website. You should be able to replicate all of the results using the econometric package; this should help bridge the material from class with the computer program and, at the same time, help you with the assigned problem sets. Brief explanations of how to obtain the various tables and figures within the window framework are given within the packet, and a more advanced, formal set of code that will run most of the results in one step is given at the end. The academic paper from which this subset of data was collected can be found at [link]. You are not required to read or understand the material in the paper, but you may be interested in the motivation for choosing this particular data. You must be on campus to access the information in the above link. A related paper of interest is here: [link]. This packet is designed to be a tool to help you understand the difficult material in the course. You are expected to bring it to each lecture, as I will often refer to the results given in the packet. Questions, comments and suggestions are always appreciated.
I. Descriptive Statistics

Sample average: x̄ = (1/n) Σ xᵢ. Sample standard deviation: s = √[ (1/(n−1)) Σ (xᵢ − x̄)² ].

[Descriptive statistics table for TESTSCORES, HOMEWORK, HRSOFCLASS, CLASSSIZE and PREVTESTSCORES, reporting the mean, median, maximum, minimum, std. dev., skewness, kurtosis, Jarque-Bera statistic and its probability, sum, sum sq. dev., and number of observations (3733).]

Quick → Group Statistics → Descriptive Statistics → Common Sample
Enter in the Series List dialog box: testscores homework classsize hrsofclass prevtestscores
Conditional Expectation

Testscores, if sex=0: […]. Testscores, if sex=1: […]. One may expect, on average, tenth grade boys to score nearly a point higher than tenth grade girls on math tests.

smpl if sex=0
scalar meantscond0
smpl if sex=1
scalar meantscond1

Note: all scalars appear at the bottom left-hand corner of the window.

Sample Covariance

cov(x, y) = (1/(n−1)) Σ (xᵢ − x̄)(yᵢ − ȳ), here @cov(testscores, homework).

Or: Quick → Group Statistics → Covariances, entering testscores homework in the dialog box. This shows a covariance matrix for TESTSCORES and HOMEWORK (note that the diagonals represent the variances).
Sample Correlation Coefficient

corr(x, y) = cov(x, y) / (sₓ s_y). Both the covariance and the correlation coefficient measure the linear dependence between the variables. Notes: a correlation coefficient of 0 does not imply independence, and the covariance and the correlation coefficient always have the same sign.

@cor(testscores, homework)

Or: Quick → Group Statistics → Correlations, entering testscores homework in the dialog box. This shows a correlation matrix for TESTSCORES and HOMEWORK (the diagonals are unity because each variable is perfectly correlated with itself).
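The covariance and correlation formulas above can be reproduced by hand; this is a minimal Python sketch, and the data below are made-up illustrations, not the packet's dataset:

```python
import math

def sample_cov(x, y):
    """Sample covariance with the unbiased n-1 denominator."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (n - 1)

def sample_corr(x, y):
    """Correlation = covariance scaled by both standard deviations,
    so it always carries the same sign as the covariance."""
    return sample_cov(x, y) / math.sqrt(sample_cov(x, x) * sample_cov(y, y))

# made-up test scores and daily homework hours for five students
scores = [62.0, 58.5, 71.0, 66.5, 60.0]
hours = [1.0, 0.5, 3.0, 2.0, 1.5]
print(sample_cov(scores, hours), sample_corr(scores, hours))
```

A correlation of exactly ±1 occurs only when the points lie on a straight line; a value of 0 rules out linear dependence but not other forms of dependence.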
II. Histograms, PDFs and CDFs

Histogram

[Histogram of TESTSCORES with frequency on the vertical axis. The accompanying statistics box reports the series name, the sample (3733 observations), mean, median, maximum, minimum, std. dev., skewness, kurtosis, and the Jarque-Bera statistic with its probability.]

Quick → Series Statistics → Histograms and Stats
Enter testscores in the dialog box.
Probability Density Function (pdf)

[Kernel density plots for TESTSCORES: a normal-kernel density estimate (with the bandwidth h reported by the software) of the unconditional distribution, and conditional densities for sex=0 and sex=1.]
Double click on the variable of interest.
View → Distribution → Kernel Density Graphs → Normal (Gaussian) → OK

For the conditional densities, restrict the sample:

smpl if sex=0
freeze(ts_pdf0) testscores.kdensity(k=n)
smpl if sex=1
freeze(ts_pdf1) testscores.kdensity(k=n)

Cumulative Distribution Function (cdf)

[Empirical CDF of TESTSCORES, with probability on the vertical axis running from 0 to 1.]

Double click on the variable of interest.
View → Distribution → CDF-Survivor-Quantile Graphs → OK
III. Simple Linear Regression Model (Regression of y on x)

Standard output:

[EViews least-squares output. Dependent Variable: TESTSCORES; Method: Least Squares; 3733 included observations; coefficients, standard errors, t-statistics and p-values for C and HOMEWORK, plus R-squared, adjusted R-squared, S.E. of regression, sum squared resid, log likelihood, F-statistic and Durbin-Watson statistic.]

The OLS slope estimate is β̂₁ = Σ (xᵢ − x̄)(yᵢ − ȳ) / SST_x, where SST_x = Σ (xᵢ − x̄)² is the total variation in x.
[Scatter plot of TEST SCORES against HOMEWORK with the fitted regression line TESTSCORES = β̂₀ + β̂₁·HOMEWORK.]

The model here is testscores = β₀ + β₁·homework + u, which relates tenth grade math test scores to hours of daily homework. From the estimated equation: a tenth grade student scores an average of β̂₀ points on the math test if they do not do any homework, holding all other factors fixed; and, holding all other factors fixed, an additional hour of daily math homework increases a tenth grader's predicted test score by β̂₁ points.

Quick → Estimate Equation
Enter into the Equation Specification dialog box: testscores c homework
Press OK.
Standard Error of β̂₁

The standard error of the slope parameter is se(β̂₁) = σ̂ / √SST_x, where σ̂ is the standard error of the regression.

Standard Error of the Regression (SER)

The SER estimates the standard deviation of y after the effect of x has been accounted for: σ̂ = √( SSR / (n − 2) ).

scalar sersimple=simplehw.@se

Sum of Squared Residuals (SSR)

The SSR measures the unexplained variation between the data and the predicted values: SSR = Σ ûᵢ².

scalar ssr=simplehw.@ssr
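The slope, intercept, SER and the slope's standard error can all be reproduced from first principles; this is a hypothetical hand-rolled Python sketch, not the packet's software output:

```python
def ols_simple(x, y):
    """OLS for y = b0 + b1*x.

    Slope: b1 = sum((x - xbar)(y - ybar)) / SST_x; intercept from the means.
    The SER uses n - 2 degrees of freedom; se(b1) = SER / sqrt(SST_x)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sst_x = sum((a - mx) ** 2 for a in x)
    b1 = sum((a - mx) * (b - my) for a, b in zip(x, y)) / sst_x
    b0 = my - b1 * mx
    resid = [b - (b0 + b1 * a) for a, b in zip(x, y)]
    ssr = sum(e * e for e in resid)        # sum of squared residuals
    ser = (ssr / (n - 2)) ** 0.5           # standard error of the regression
    se_b1 = ser / sst_x ** 0.5             # standard error of the slope
    return b0, b1, ser, se_b1
```

On data that lie exactly on a line the residuals, SER and se(β̂₁) are all zero; real data give positive values.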
Explained Sum of Squares (SSE)

The SSE measures the explained variation between the predicted values and the sample mean: SSE = Σ (ŷᵢ − ȳ)².

Computed manually:

genr y_hat = b0 + b1*homework   (replace b0 and b1 with the estimated intercept and slope)
genr sumofsquares = (y_hat - @mean(testscores))^2

Quick → Group Statistics → Descriptive Statistics → Common Sample; enter sumofsquares in the Series List dialog box. The SSE is in the row labeled Sum. Alternatively, calculate SST first and then subtract SSR from SST to get SSE.

Total Sum of Squares (SST)

The SST measures the total variation in y, SST = Σ (yᵢ − ȳ)², and it is the sum of the explained and unexplained variation: SST = SSE + SSR. Take the value of SSR (reported in the regression output) and the SSE (above) and add them. Or:

series sst=(testscores-@mean(testscores))^2
scalar tss=@sum(sst)
R-Squared

R² = SSE/SST = 1 − SSR/SST measures the proportion of sample variation in the dependent variable that is accounted for by the independent variable(s) in the model. Note: R² never decreases (and usually increases) when adding more regressors.

scalar rsqsimple=simplehw.@r2

Adjusted R-Squared

R̄² = 1 − [ SSR/(n − k − 1) ] / [ SST/(n − 1) ]. Note: the adjusted R-squared imposes a penalty for adding more independent variables.

scalar rbarsqsimple=simplehw.@rbar2
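Both measures follow directly from the sums of squares; a small sketch (the function name and inputs are mine):

```python
def r_squared(ssr, sst, n, k):
    """R2 = 1 - SSR/SST. Adjusted R2 divides each sum of squares by its
    degrees of freedom (k = number of slope regressors), which is why it
    can fall when an unhelpful regressor is added."""
    r2 = 1.0 - ssr / sst
    rbar2 = 1.0 - (ssr / (n - k - 1)) / (sst / (n - 1))
    return r2, rbar2
```

With any regressor included the adjusted R² is always below the plain R², and the gap widens as k grows relative to n.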
Regression Through the Origin

[EViews least-squares output. Dependent Variable: TESTSCORES; 3733 included observations; a single regressor, HOMEWORK, with no intercept; its coefficient, standard error, t-statistic and p-value, plus summary statistics.]

[Scatter plot of TEST SCORES against HOMEWORK with the fitted line y_hat = β̂₁·homework forced through the origin.]
We would use this model if there is evidence to suggest that β₀ = 0. The interpretation would be that if the student does no homework, the predicted test score would be 0, and that if the student increased his daily homework time by one hour, his predicted test score would increase by 60.3 points. Both seem unreasonable. Note that R-squared is meaningless when the intercept is excluded from the model.

Functional Forms with Logs

Model        Dependent variable   Independent variable   Interpretation of β₁
level-level  y                    x                      Δy = β₁ Δx
level-log    y                    log(x)                 Δy = (β₁/100) %Δx
log-level    log(y)               x                      %Δy = (100 β₁) Δx
log-log      log(y)               log(x)                 %Δy = β₁ %Δx

Level-Log: testscores = β₀ + β₁ ln(classsize) + u

[EViews output. Dependent Variable: TESTSCORES; regressors C and LNCLASSSIZE; 3733 included observations.]

A one percent increase in class size will increase test scores by β̂₁/100 points.

Generating the ln of a variable: in the command window enter

series lnclasssize=@log(classsize)
Log-Level: ln(testscores) = β₀ + β₁ classsize + u

[EViews output. Dependent Variable: LNTESTSCORES; regressors C and CLASSSIZE; 3733 included observations.]

The coefficient on classsize, multiplied by one hundred, can be interpreted as the semi-elasticity of test scores with respect to class size: if you add one student to the class, test scores change by (100·β̂₁)%.

Log-Log: ln(testscores) = β₀ + β₁ ln(classsize) + u

[EViews output. Dependent Variable: LNTESTSCORES; regressors C and LNCLASSSIZE; 3733 included observations.]
The coefficient on ln(classsize) can be interpreted as the elasticity of test scores with respect to class size: a one percent increase in class size will increase test scores by 0.06%. Some may have expected a negative relationship between these two variables. This result is common in empirical studies and may be attributed to the selection of high-quality students into larger classes.

IV. Multiple Regression

[EViews output. Dependent Variable: TESTSCORES; regressors C, HOMEWORK, CLASSSIZE and HRSOFCLASS; 3733 included observations.]

The interpretation of each coefficient remains the same, holding the other regressors fixed.

Standard Error of β̂ⱼ

se(β̂ⱼ) = σ̂ / √( SSTⱼ (1 − Rⱼ²) ) for j = 1, 2, …, k, where SSTⱼ is the total variation in xⱼ and Rⱼ² is the R-squared from the regression of xⱼ on the other x's.
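The multiple-regression standard-error formula can be sketched directly (the function name and inputs are hypothetical):

```python
def se_slope_multiple(ser, sst_j, r2_j):
    """se(b_j) = SER / sqrt(SST_j * (1 - R_j^2)), where R_j^2 is the
    R-squared from regressing x_j on the other regressors. A higher
    R_j^2 (more collinearity) inflates the standard error."""
    return ser / (sst_j * (1.0 - r2_j)) ** 0.5
```

With Rⱼ² = 0 this collapses to the simple-regression formula; as Rⱼ² approaches 1 the standard error blows up, which is the variance-inflation story behind multicollinearity.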
Quadratic Model: testscores = β₀ + β₁ homework + β₂ homework² + u

Estimated slope: β̂₁ + 2 β̂₂ · homework.

[EViews output. Dependent Variable: TESTSCORES; regressors C, HOMEWORK and HOMEWORK*HOMEWORK; 3733 included observations.]
[Plot of test scores against homework showing the fitted quadratic y_hat = β̂₀ + β̂₁·homework + β̂₂·homework², with the turning point marked at x*.]

Turning Point: x* = | β̂₁ / (2 β̂₂) |

After 2.8 hours of homework per day, the return to homework starts to diminish. When we restrict the sample to students who do more than 2.8 hours of homework a day, we find that there are only 17 sample points out of a possible 3733 (about 0.46%).

Standardized/Beta Coefficients

b̂ⱼ = (s₍ₓⱼ₎ / s_y) β̂ⱼ for all j = 1, 2, …, k, where s denotes a sample standard deviation.
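The turning point comes from setting the estimated slope β̂₁ + 2β̂₂·x to zero; the coefficients below are made-up values chosen only to reproduce a 2.8-hour turning point:

```python
def turning_point(b1, b2):
    """x* where the fitted quadratic y = b0 + b1*x + b2*x^2 peaks
    (b2 < 0) or bottoms out (b2 > 0): solve b1 + 2*b2*x = 0."""
    return -b1 / (2.0 * b2)

# hypothetical estimates with a negative coefficient on homework^2
print(turning_point(5.6, -1.0))
```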
[EViews output. Dependent Variable: ZTESTSCORES; regressors ZHOMEWORK, ZHRSOFCLASS and ZCLASSSIZE (no intercept); 3733 included observations; the mean of the dependent variable is essentially zero.]

The interpretation of this equation is that a one standard deviation change in homework will result in a b̂ standard deviation change in test scores. Homework has the largest impact on test scores of the three regressors.

To generate standardized variables, type into the command window:

series ztestscores=(testscores-@mean(testscores))/@stdev(testscores)

This creates a series called ztestscores that is the standardized form of testscores.
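The z-scoring done by the command above can be sketched by hand in Python:

```python
def standardize(series):
    """Subtract the sample mean and divide by the sample (n-1) standard
    deviation, so the result has mean 0 and standard deviation 1."""
    n = len(series)
    m = sum(series) / n
    s = (sum((v - m) ** 2 for v in series) / (n - 1)) ** 0.5
    return [(v - m) / s for v in series]
```

Because every standardized regressor is on the same scale, the estimated coefficients can be compared directly, which is the point of the beta-coefficient regression.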
V. Lagged Dependent Variable Model

[EViews output. Dependent Variable: TESTSCORES; regressors C, HOMEWORK and PREVTESTSCORES; 3733 included observations.]

We include eighth grade test scores because they help control for unobserved variables that may influence test scores in the tenth grade; the eighth grade score can be a proxy for ability or school quality. According to this model, holding all other factors fixed, increasing homework time by one hour would increase test scores by 0.88 points. The return to homework is lower than it was in the model that did not include previous test scores as a regressor.
V. Dummy Variable Models

[EViews output. Dependent Variable: TESTSCORES; regressors C, HOMEWORK and SEX; 3733 included observations.]
[Scatter plot of test scores against homework with two parallel fitted lines, one for each value of SEX.]

Here the base group is male tenth graders. Since sex is binary (0, 1), the only difference between the base group and the alternative is the intercept.
[EViews output. Dependent Variable: TESTSCORES; regressors C, HOMEWORK, SEX, six DMEDUC mother's-education dummies and DTEACHERSEX; 3733 included observations.]

This model contains three dummy variables: two binary, and one with seven groups. The base group for this model is a male tenth grader whose mother did not finish high school and who currently has a male teacher. Once again, the only difference between the base group and the alternatives is the intercept. The students with the highest predicted test scores are tenth grade boys whose mothers have a master's degree and who currently have a male teacher. The students with the lowest predicted test scores are tenth grade girls whose mothers dropped out of high school and who currently have a female teacher. The difference between the predicted test scores of these two groups, for any given level of homework, equals the difference in their estimated intercepts.

Generating separate dummy variables for mother's education: in the command window enter

=@expand(mothereduc)
Dummy Model with Interaction/Cross Terms

[EViews output. Dependent Variable: TESTSCORES; regressors C, HOMEWORK, SEX and HOMEWORK*SEX; 3733 included observations.]
[Scatter plot of test scores against homework with separate fitted lines for each sex: for sex=0, testscores_hat = β̂₀ + β̂₁·homework; for sex=1, testscores_hat = (β̂₀ + δ̂₀) + (β̂₁ + δ̂₁)·homework.]

There is a clear slope change and a relatively small change in the intercept. The return to homework is smaller for tenth grade girls than for boys at all levels of time spent on homework, except when hours spent on homework are very small.
VI. Confidence Intervals and Hypothesis Testing

Confidence Intervals

A 100(1−α)% confidence interval for βⱼ is β̂ⱼ ± c·se(β̂ⱼ), where the critical value c is taken from the t tables with n − k − 1 degrees of freedom. Using the simple regression model, the 95% confidence interval for β₁ is β̂₁ ± 1.96·se(β̂₁); we have 95% confidence that the true population parameter β₁ is in this interval.

t Test

t = (β̂ⱼ − aⱼ) / se(β̂ⱼ) ~ t(n − k − 1) under H₀: βⱼ = aⱼ. Note: the regression output automatically gives the t statistic for H₀: βⱼ = 0.

Model: testscores = β₀ + β₁ homework + u.
Hypothesis: H₀: β₁ = 0 against H₁: β₁ ≠ 0. The reported t statistic for the slope falls inside the rejection region even at the 99% confidence level. With this data set we can use the standard normal table, since n is large. We reject the null hypothesis and conclude that β₁ ≠ 0.

Hypothesis: H₀: β₀ = 0 against H₁: β₀ ≠ 0. We could also determine whether the properly specified model excludes the intercept (whether a regression through the origin is appropriate) using the same model. The critical value with 3731 (n − k − 1) degrees of freedom, at the 99% confidence level, is approximately 2.576; once again we could have used the standard normal table. We reject the null hypothesis and conclude that β₀ ≠ 0.
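The interval and test statistic are simple arithmetic once an estimate and its standard error are known; the numbers below are hypothetical, with 1.96 used as the large-sample 95% critical value:

```python
def t_stat(b_hat, b_null, se):
    """t statistic for H0: beta = b_null."""
    return (b_hat - b_null) / se

def conf_interval(b_hat, se, crit):
    """100(1-alpha)% interval: estimate +/- critical value * std. error."""
    return (b_hat - crit * se, b_hat + crit * se)

# hypothetical slope estimate and standard error
print(conf_interval(2.1, 0.35, 1.96))
print(t_stat(2.1, 0.0, 0.35))
```

Note the duality: 0 lies outside the 95% interval exactly when the t statistic for H₀: β = 0 exceeds 1.96 in absolute value.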
Model: testscores = β₀ + β₁ homework + β₂ homework² + u.
Hypothesis: H₀: β₂ = 0 against H₁: β₂ ≠ 0. Once again, even at the 99% confidence level, we reject the null hypothesis. This helps verify that homework has a decreasing return.

Testing a Linear Combination of Parameters: Single Restriction

H₀: β₁ = β₂ against H₁: β₁ > β₂, with t = (β̂₁ − β̂₂) / se(β̂₁ − β̂₂) and se(β̂₁ − β̂₂) = √( var(β̂₁) + var(β̂₂) − 2 cov(β̂₁, β̂₂) ).

Model: testscores = β₀ + β₁ homework + β₂ classsize + β₃ hrsofclass + u. In order to compute the test, define θ = β₁ − β₂ (with θ̂ its estimate) and restate the hypothesis as H₀: θ = 0 against H₁: θ > 0. When we rearrange the model using β₁ = θ + β₂, the model becomes testscores = β₀ + θ homework + β₂ (homework + classsize) + β₃ hrsofclass + u.
[EViews output. Dependent Variable: TESTSCORES; regressors C, HOMEWORK, HOMEWORK+CLASSSIZE and HRSOFCLASS; 3733 included observations.]

The standard error and the t statistic reported for homework are important because they are se(θ̂) and the t statistic for θ, respectively, under the restated null and alternative hypotheses. This method bypasses the need to compute the covariance between β̂₁ and β̂₂. We can now apply these values to the original hypothesis test. The critical value from the t distribution at the 99% confidence level in a one-tailed test is 2.326; with this specific test we reject the null hypothesis when the t statistic is greater than the critical value. Since the reported t statistic is greater than 2.326, we reject the null hypothesis.

F Test (Testing the Validity of the Regression)

F = [ (SSR_r − SSR_ur) / q ] / [ SSR_ur / (n − k − 1) ] ~ F(q, n − k − 1), where q is the number of restrictions, SSR_r is the SSR of the restricted model and SSR_ur that of the unrestricted model.
Restricted model: testscores = β₀ + u.
Unrestricted model: testscores = β₀ + β₁ homework + β₂ classsize + β₃ hrsofclass + u.
H₀: β₁ = β₂ = β₃ = 0 against H₁: H₀ is not true.

In the case where the restricted model does not have a slope parameter, the R² value is zero for the restricted model. We can also compute the F statistic using the following equation:

F = [ (R²_ur − R²_r) / q ] / [ (1 − R²_ur) / (n − k − 1) ],

where R²_ur is the R² from the estimation of the unrestricted model. At the 99% confidence level, the F statistic is greater than the critical value; therefore, we reject the null hypothesis. At least one of the parameters is different from zero. We can conclude that the inclusion of homework, class size and hours of class explains some of the variation in test scores.

Restricted model: testscores = β₀ + β₁ homework + u.
Unrestricted model: testscores = β₀ + β₁ homework + β₂ classsize + β₃ hrsofclass + u.
H₀: β₂ = β₃ = 0 against H₁: H₀ is not true, at the 99% confidence level.

In this test, we come to the same conclusion as before: either β₂ or β₃ is different from zero, or both are. Thus, the inclusion of class size and hours of class helps explain some of the variation in test scores.
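The R-squared form of the F statistic is easy to script; the inputs below are made up:

```python
def f_stat(r2_ur, r2_r, q, n, k):
    """F statistic for q exclusion restrictions, R-squared form:
    ((R2_ur - R2_r)/q) / ((1 - R2_ur)/(n - k - 1)),
    with k slope regressors in the unrestricted model."""
    return ((r2_ur - r2_r) / q) / ((1.0 - r2_ur) / (n - k - 1))
```

When the restricted model is intercept-only, R²_r = 0 and this is the overall F statistic reported in the regression output.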
Restricted model: testscores = β₀ + β₂ classsize + β₃ hrsofclass + u.
Unrestricted model: testscores = β₀ + β₁ homework + β₂ classsize + β₃ hrsofclass + u.
H₀: β₁ = 0 against H₁: β₁ ≠ 0, at the 99% confidence level.

Notes: when there is only one restriction, the F statistic is the square of the t statistic (F = t²). We reject the null hypothesis in this test as well. However, only this test tells us that homework is statistically significant: the inclusion of homework helps explain some variation in the test scores.

VII. Heteroskedasticity

Heteroskedasticity-Robust Standard Errors (White/Huber/Eicker Standard Errors)

var(β̂ⱼ) = Σ r̂ᵢⱼ² ûᵢ² / SSRⱼ², where r̂ᵢⱼ denotes the i-th residual from regressing xⱼ on all the other independent variables and SSRⱼ is the sum of squared residuals from that regression.
[EViews output. Dependent Variable: TESTSCORES; Method: Least Squares with White heteroskedasticity-consistent standard errors and covariance; regressors C and HOMEWORK; 3733 included observations.]

[Comparison table of the regular and robust standard errors for α̂ and β̂; the coefficient estimates are identical, only the standard errors differ.]

Since we are still using the OLS estimates, the t statistic is still constructed and interpreted as before; the only difference is that we use the robust standard errors. The conclusions from the t tests concerning α and β remain the same. Note: robust standard errors and robust t statistics are only justified when the sample size is large; otherwise, the inference from the t test may be wrong.

Quick → Estimate Equation
Enter into the Equation Specification dialog box: testscores c homework
Click on the Options tab and select Heteroskedasticity consistent coefficient covariance (White should be the default). Click OK.
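For the simple regression case, the robust variance formula above reduces to Σ (xᵢ − x̄)² ûᵢ² / SST_x². This is a hand-rolled sketch, not the software's implementation:

```python
def robust_se_slope(x, y, b0, b1):
    """Heteroskedasticity-robust standard error of the OLS slope in
    y = b0 + b1*x: each squared residual is weighted by its own squared
    deviation of x, instead of assuming a common error variance."""
    n = len(x)
    mx = sum(x) / n
    sst_x = sum((a - mx) ** 2 for a in x)
    resid = [b - (b0 + b1 * a) for a, b in zip(x, y)]
    var = sum(((a - mx) ** 2) * (e ** 2) for a, e in zip(x, resid)) / sst_x ** 2
    return var ** 0.5
```

Under homoskedasticity this estimate and the usual formula converge to the same value in large samples, which is why the robust version is the safer default with plenty of data.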
Heteroskedasticity-Robust F Statistics

The software computes the heteroskedasticity-robust Wald statistic, which is a transformation of the heteroskedasticity-robust F statistic; the interpretation of the Wald statistic is the same. In this case, the restricted model is testscores = β₀ + u and the unrestricted model is testscores = β₀ + β₁ homework + β₂ classsize + β₃ hrsofclass + u.

[Wald test output. H₀: C(2) = C(3) = C(4) = 0; reports an F statistic with (3, 3729) degrees of freedom and a chi-square statistic, along with the normalized restrictions C(2), C(3) and C(4), their values and standard errors. Restrictions are linear in coefficients.]

The regular F statistic for this test was computed above; our inference from the output is the same, because the F statistic is greater than the critical value. To compute the heteroskedasticity-robust Wald statistic, estimate the equation as before (with White standard errors) but include classsize and hrsofclass as independent variables.

View → Coefficient Tests → Wald Coefficient Restrictions
Enter into the Wald Test dialog box: c(2)=c(3)=c(4)=0
Hit OK.
Testing for Heteroskedasticity: the White Test

H₀: Var(u | homework, classsize, hrsofclass) = σ²
H₁: Var(u | homework, classsize, hrsofclass) ≠ σ²

If the original model contains three independent variables, the White test is based on the estimation of

û² = δ₀ + δ₁x₁ + δ₂x₂ + δ₃x₃ + δ₄x₁² + δ₅x₂² + δ₆x₃² + error,

where û is the series of residuals from the OLS estimation. The F statistic is computed from the R-squared of this regression. However, this approach uses up degrees of freedom quickly, which is important when your sample size is small. The F statistic is computed automatically by the software.

[White heteroskedasticity test output (no cross terms): F statistic with Prob. F(6, 3726), and Obs*R-squared with Prob. Chi-Square(6). Test equation: RESID^2 regressed on C, HOMEWORK, HOMEWORK^2, CLASSSIZE, CLASSSIZE^2, HRSOFCLASS and HRSOFCLASS^2; 3733 included observations.]
We can obtain the critical value from the F(6, ∞) distribution: F* ≈ 2.8 at the 99% confidence level. Since the F-statistic is greater than the critical value, we reject the null hypothesis that homoskedasticity holds.

Estimate the model, then: View → Residual Tests → White Heteroskedasticity (no cross terms)

The White test with cross terms requires the same approach: we estimate the model, and the auxiliary regression also includes the pairwise cross products of the regressors.

[White heteroskedasticity test output (cross terms): F statistic with Prob. F(9, 3723), and Obs*R-squared with Prob. Chi-Square(9). Test equation: RESID^2 regressed on C, HOMEWORK, HOMEWORK^2, HOMEWORK*CLASSSIZE, HOMEWORK*HRSOFCLASS, CLASSSIZE, CLASSSIZE^2, CLASSSIZE*HRSOFCLASS, HRSOFCLASS and HRSOFCLASS^2; 3733 included observations.]

Once again we can reject the null hypothesis, since the F-statistic is greater than 2.8.
Estimate the model, then: View → Residual Tests → White Heteroskedasticity (cross terms)

An alternative to this method is to run the regression of y on the x's, take the residuals and square them, and then estimate the following model:

û² = δ₀ + δ₁ ŷ + δ₂ ŷ² + error.

[EViews output. Dependent Variable: RESID_SQUARED; regressors C, Y_HAT and Y_HAT_SQUARED; 3733 included observations.]

Take the R-squared from this regression and compute the F statistic:

F = (R² / 2) / [ (1 − R²) / (n − 3) ].

We can obtain the critical value from the F(2, ∞) distribution. This approach clearly uses up fewer degrees of freedom, especially when more independent variables are added to the original model. Since the F statistic is greater than 4.61, we again reject the null hypothesis that homoskedasticity holds.
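The special-form test statistic can be scripted from the auxiliary R-squared alone; a sketch, where r2_aux would come from the RESID_SQUARED regression:

```python
def white_special_f(r2_aux, n):
    """F statistic for the special-form White test: regress squared
    residuals on y_hat and y_hat^2 (2 restrictions, n - 3 denominator
    degrees of freedom)."""
    return (r2_aux / 2.0) / ((1.0 - r2_aux) / (n - 3))
```

Even a tiny auxiliary R² can produce a large F statistic when n is in the thousands, which is why heteroskedasticity tests reject so readily in big samples.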
Feasible Generalized Least Squares (FGLS)

Assume Var(u | x) = σ² h(x), where h(x) is some function of the explanatory variables that determines the heteroskedasticity. Since it is hard to find such a function, we can model the function and then estimate the parameters in this model.

Procedure for FGLS:
1. Run the regression of y on the explanatory variables.
2. Take the residuals, û, and form ln(û²).
3. Regress ln(û²) on the explanatory variables.
4. Take the fitted values, ĝ, and form ĥ = exp(ĝ).
5. Estimate the original model again by WLS, using weights of 1/ĥ.

[EViews weighted least-squares output. Dependent Variable: TESTSCORES; weighting series H_HAT; regressors C, HOMEWORK, CLASSSIZE and HRSOFCLASS; weighted and unweighted summary statistics; 3733 included observations.]
We can also do the following:
1. Run the regression of y on the explanatory variables.
2. Take the residuals, û, and form ln(û²).
3. Regress ln(û²) on ŷ and ŷ².
4. Take the fitted values, ĝ, and form ĥ = exp(ĝ).
5. Estimate the original model again by WLS, using weights of 1/ĥ.

[EViews weighted least-squares output. Dependent Variable: TESTSCORES; weighting series H_HAT2; regressors C, HOMEWORK, CLASSSIZE and HRSOFCLASS; weighted and unweighted summary statistics; 3733 included observations.]

The only part of this process that may be unfamiliar to you is running the regression with WLS. In the Options tab of the Equation Estimation box, there is an option called Weighted LS/TSLS. Select that option and enter h_hat into the box labeled Weight.
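Step 5 of either procedure is ordinary WLS; for a single regressor it amounts to dividing everything by √h and running OLS on the transformed data. A minimal sketch with a hypothetical variance series h:

```python
import math

def wls_simple(x, y, h):
    """WLS for y = b0 + b1*x when Var(u|x) is proportional to h:
    divide y, the intercept column and x by sqrt(h), then solve the
    2x2 normal equations of the transformed regression."""
    w = [1.0 / math.sqrt(hi) for hi in h]
    x0 = w[:]                                 # transformed intercept column
    x1 = [a * wi for a, wi in zip(x, w)]
    ys = [b * wi for b, wi in zip(y, w)]
    a11 = sum(v * v for v in x0)
    a12 = sum(u * v for u, v in zip(x0, x1))
    a22 = sum(v * v for v in x1)
    c1 = sum(u * v for u, v in zip(x0, ys))
    c2 = sum(u * v for u, v in zip(x1, ys))
    det = a11 * a22 - a12 * a12
    return (c1 * a22 - c2 * a12) / det, (a11 * c2 - a12 * c1) / det
```

With hᵢ identical for every observation this reproduces OLS; otherwise observations with a larger estimated error variance get less weight.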
Ramsey's Regression Specification Error Test (RESET): Model Misspecification (Neglected Nonlinearities)

Original model: testscores = β₀ + β₁ homework + β₂ classsize + β₃ hrsofclass + u.
Model with fitted values: testscores = β₀ + β₁ homework + β₂ classsize + β₃ hrsofclass + δ₁ ŷ² + δ₂ ŷ³ + error.
H₀: δ₁ = δ₂ = 0 against H₁: H₀ is not true.

[Ramsey RESET test output: F statistic with Prob. F(2, 3727), and log likelihood ratio with Prob. Chi-Square(2). Test equation: TESTSCORES regressed on C, HOMEWORK, CLASSSIZE, HRSOFCLASS, FITTED^2 and FITTED^3; 3733 included observations.]
The F statistic is greater than the critical value at the 99% confidence level; therefore, we reject the null hypothesis. This is evidence that the functional form is misspecified.

Estimate the model as usual, then: View → Stability Tests → Ramsey's RESET Test. In the RESET Specification box enter: 2. Hit OK.

Test Against Nonnested Alternatives: the Mizon and Richard Test

Two nonnested models, the first of which contains ln(classsize) as a regressor. The comprehensive model contains the regressors of both models. The first null hypothesis is that the parameters on the terms the second model adds to the first are all zero. We can form the F statistic the same way as usual, and we reject this null hypothesis at the 99% confidence level: the added parameters are jointly statistically significant, so one or both of them does not equal zero. The second test, with the roles of the models reversed, shows that the term the first model adds is statistically significant at the 99% confidence level.
Davidson-MacKinnon Test

First estimate the two original models (one of which contains ln(classsize)). Now estimate each model again, including the predicted value from the other model as an additional regressor with coefficient θ₁.

The t statistics on θ₁ in the two estimated models both fall in the rejection region: in both instances, even at the 99% confidence level, we reject the null hypothesis that θ₁ = 0, so each model is rejected against the other. Comparing fit instead, the R-squared for the second model is higher than that of the first, so we can conclude that the second model is the better of the two. Note, however, that this need not be the best possible model; it is just the better of the two being considered.
More informationEastern Mediterranean University Department of Economics ECON 503: ECONOMETRICS I. M. Balcilar. Midterm Exam Fall 2007, 11 December 2007.
Eastern Mediterranean University Department of Economics ECON 503: ECONOMETRICS I M. Balcilar Midterm Exam Fall 2007, 11 December 2007 Duration: 120 minutes Questions Q1. In order to estimate the demand
More informationMultiple Regression Analysis
Multiple Regression Analysis y = 0 + 1 x 1 + x +... k x k + u 6. Heteroskedasticity What is Heteroskedasticity?! Recall the assumption of homoskedasticity implied that conditional on the explanatory variables,
More informationECON 366: ECONOMETRICS II SPRING TERM 2005: LAB EXERCISE #12 VAR Brief suggested solution
DEPARTMENT OF ECONOMICS UNIVERSITY OF VICTORIA ECON 366: ECONOMETRICS II SPRING TERM 2005: LAB EXERCISE #12 VAR Brief suggested solution Location: BEC Computing LAB 1) See the relevant parts in lab 11
More informationNonlinear Regression Functions
Nonlinear Regression Functions (SW Chapter 8) Outline 1. Nonlinear regression functions general comments 2. Nonlinear functions of one variable 3. Nonlinear functions of two variables: interactions 4.
More informationBrief Suggested Solutions
DEPARTMENT OF ECONOMICS UNIVERSITY OF VICTORIA ECONOMICS 366: ECONOMETRICS II SPRING TERM 5: ASSIGNMENT TWO Brief Suggested Solutions Question One: Consider the classical T-observation, K-regressor linear
More informationApplied Statistics and Econometrics
Applied Statistics and Econometrics Lecture 5 Saul Lach September 2017 Saul Lach () Applied Statistics and Econometrics September 2017 1 / 44 Outline of Lecture 5 Now that we know the sampling distribution
More informationECO220Y Simple Regression: Testing the Slope
ECO220Y Simple Regression: Testing the Slope Readings: Chapter 18 (Sections 18.3-18.5) Winter 2012 Lecture 19 (Winter 2012) Simple Regression Lecture 19 1 / 32 Simple Regression Model y i = β 0 + β 1 x
More informationRegression with a Single Regressor: Hypothesis Tests and Confidence Intervals
Regression with a Single Regressor: Hypothesis Tests and Confidence Intervals (SW Chapter 5) Outline. The standard error of ˆ. Hypothesis tests concerning β 3. Confidence intervals for β 4. Regression
More informationApplied Economics. Regression with a Binary Dependent Variable. Department of Economics Universidad Carlos III de Madrid
Applied Economics Regression with a Binary Dependent Variable Department of Economics Universidad Carlos III de Madrid See Stock and Watson (chapter 11) 1 / 28 Binary Dependent Variables: What is Different?
More informationBrief Sketch of Solutions: Tutorial 3. 3) unit root tests
Brief Sketch of Solutions: Tutorial 3 3) unit root tests.5.4.4.3.3.2.2.1.1.. -.1 -.1 -.2 -.2 -.3 -.3 -.4 -.4 21 22 23 24 25 26 -.5 21 22 23 24 25 26.8.2.4. -.4 - -.8 - - -.12 21 22 23 24 25 26 -.2 21 22
More informationHeteroscedasticity 1
Heteroscedasticity 1 Pierre Nguimkeu BUEC 333 Summer 2011 1 Based on P. Lavergne, Lectures notes Outline Pure Versus Impure Heteroscedasticity Consequences and Detection Remedies Pure Heteroscedasticity
More informationTopic 7: Heteroskedasticity
Topic 7: Heteroskedasticity Advanced Econometrics (I Dong Chen School of Economics, Peking University Introduction If the disturbance variance is not constant across observations, the regression is heteroskedastic
More informationChapter 11 Handout: Hypothesis Testing and the Wald Test
Chapter 11 Handout: Hypothesis Testing and the Wald Test Preview No Money Illusion Theory: Calculating True] o Clever Algebraic Manipulation o Wald Test Restricted Regression Reflects Unrestricted Regression
More informationApplied Econometrics (QEM)
Applied Econometrics (QEM) based on Prinicples of Econometrics Jakub Mućk Department of Quantitative Economics Jakub Mućk Applied Econometrics (QEM) Meeting #3 1 / 42 Outline 1 2 3 t-test P-value Linear
More informationPractical Econometrics. for. Finance and Economics. (Econometrics 2)
Practical Econometrics for Finance and Economics (Econometrics 2) Seppo Pynnönen and Bernd Pape Department of Mathematics and Statistics, University of Vaasa 1. Introduction 1.1 Econometrics Econometrics
More informationExercise Sheet 5: Solutions
Exercise Sheet 5: Solutions R.G. Pierse 2. Estimation of Model M1 yields the following results: Date: 10/24/02 Time: 18:06 C -1.448432 0.696587-2.079327 0.0395 LPC -0.306051 0.272836-1.121740 0.2640 LPF
More informationEcon 427, Spring Problem Set 3 suggested answers (with minor corrections) Ch 6. Problems and Complements:
Econ 427, Spring 2010 Problem Set 3 suggested answers (with minor corrections) Ch 6. Problems and Complements: 1. (page 132) In each case, the idea is to write these out in general form (without the lag
More informationLampiran. Lampiran 1 Data Penelitian TAHUN Y X1 X2 X3 X4_AS X4_JPG X4_INDH 1995
Lampiran Lampiran 1 Data Penelitian TAHUN Y X1 X2 X3 X4_AS X4_JPG X4_INDH 1995 1996 1997 1998 1999 2000 2001 2002 2003 2004 2005 2006 2007 2008 2009 2010 2011 2012 2013 2014 3,107,163 3,102,431 3,443,555
More informationOLSQ. Function: Usage:
OLSQ OLSQ (HCTYPE=robust SE type, HI, ROBUSTSE, SILENT, TERSE, UNNORM, WEIGHT=name of weighting variable, WTYPE=weight type) dependent variable list of independent variables ; Function: OLSQ is the basic
More informationMultiple Regression Analysis
Multiple Regression Analysis y = β 0 + β 1 x 1 + β 2 x 2 +... β k x k + u 2. Inference 0 Assumptions of the Classical Linear Model (CLM)! So far, we know: 1. The mean and variance of the OLS estimators
More informationLECTURE 6. Introduction to Econometrics. Hypothesis testing & Goodness of fit
LECTURE 6 Introduction to Econometrics Hypothesis testing & Goodness of fit October 25, 2016 1 / 23 ON TODAY S LECTURE We will explain how multiple hypotheses are tested in a regression model We will define
More informationARDL Cointegration Tests for Beginner
ARDL Cointegration Tests for Beginner Tuck Cheong TANG Department of Economics, Faculty of Economics & Administration University of Malaya Email: tangtuckcheong@um.edu.my DURATION: 3 HOURS On completing
More informationTesting and Model Selection
Testing and Model Selection This is another digression on general statistics: see PE App C.8.4. The EViews output for least squares, probit and logit includes some statistics relevant to testing hypotheses
More informationApplied Statistics and Econometrics
Applied Statistics and Econometrics Lecture 6 Saul Lach September 2017 Saul Lach () Applied Statistics and Econometrics September 2017 1 / 53 Outline of Lecture 6 1 Omitted variable bias (SW 6.1) 2 Multiple
More informationLeast Squares Estimation-Finite-Sample Properties
Least Squares Estimation-Finite-Sample Properties Ping Yu School of Economics and Finance The University of Hong Kong Ping Yu (HKU) Finite-Sample 1 / 29 Terminology and Assumptions 1 Terminology and Assumptions
More informationThe Simple Linear Regression Model
The Simple Linear Regression Model Lesson 3 Ryan Safner 1 1 Department of Economics Hood College ECON 480 - Econometrics Fall 2017 Ryan Safner (Hood College) ECON 480 - Lesson 3 Fall 2017 1 / 77 Bivariate
More informationRomanian Economic and Business Review Vol. 3, No. 3 THE EVOLUTION OF SNP PETROM STOCK LIST - STUDY THROUGH AUTOREGRESSIVE MODELS
THE EVOLUTION OF SNP PETROM STOCK LIST - STUDY THROUGH AUTOREGRESSIVE MODELS Marian Zaharia, Ioana Zaheu, and Elena Roxana Stan Abstract Stock exchange market is one of the most dynamic and unpredictable
More informationApplied Econometrics. Applied Econometrics Second edition. Dimitrios Asteriou and Stephen G. Hall
Applied Econometrics Second edition Dimitrios Asteriou and Stephen G. Hall MULTICOLLINEARITY 1. Perfect Multicollinearity 2. Consequences of Perfect Multicollinearity 3. Imperfect Multicollinearity 4.
More informationAnswer Key: Problem Set 6
: Problem Set 6 1. Consider a linear model to explain monthly beer consumption: beer = + inc + price + educ + female + u 0 1 3 4 E ( u inc, price, educ, female ) = 0 ( u inc price educ female) σ inc var,,,
More informationThe general linear regression with k explanatory variables is just an extension of the simple regression as follows
3. Multiple Regression Analysis The general linear regression with k explanatory variables is just an extension of the simple regression as follows (1) y i = β 0 + β 1 x i1 + + β k x ik + u i. Because
More informationLectures 5 & 6: Hypothesis Testing
Lectures 5 & 6: Hypothesis Testing in which you learn to apply the concept of statistical significance to OLS estimates, learn the concept of t values, how to use them in regression work and come across
More informationIntermediate Econometrics
Intermediate Econometrics Heteroskedasticity Text: Wooldridge, 8 July 17, 2011 Heteroskedasticity Assumption of homoskedasticity, Var(u i x i1,..., x ik ) = E(u 2 i x i1,..., x ik ) = σ 2. That is, the
More informationMultiple Regression Analysis: Heteroskedasticity
Multiple Regression Analysis: Heteroskedasticity y = β 0 + β 1 x 1 + β x +... β k x k + u Read chapter 8. EE45 -Chaiyuth Punyasavatsut 1 topics 8.1 Heteroskedasticity and OLS 8. Robust estimation 8.3 Testing
More informationMultiple Regression. Midterm results: AVG = 26.5 (88%) A = 27+ B = C =
Economics 130 Lecture 6 Midterm Review Next Steps for the Class Multiple Regression Review & Issues Model Specification Issues Launching the Projects!!!!! Midterm results: AVG = 26.5 (88%) A = 27+ B =
More informationRegression with Qualitative Information. Part VI. Regression with Qualitative Information
Part VI Regression with Qualitative Information As of Oct 17, 2017 1 Regression with Qualitative Information Single Dummy Independent Variable Multiple Categories Ordinal Information Interaction Involving
More informationIntroduction to Econometrics. Heteroskedasticity
Introduction to Econometrics Introduction Heteroskedasticity When the variance of the errors changes across segments of the population, where the segments are determined by different values for the explanatory
More informationContest Quiz 3. Question Sheet. In this quiz we will review concepts of linear regression covered in lecture 2.
Updated: November 17, 2011 Lecturer: Thilo Klein Contact: tk375@cam.ac.uk Contest Quiz 3 Question Sheet In this quiz we will review concepts of linear regression covered in lecture 2. NOTE: Please round
More informationUsing Tables and Graphing Calculators in Math 11
Using Tables and Graphing Calculators in Math 11 Graphing calculators are not required for Math 11, but they are likely to be helpful, primarily because they allow you to avoid the use of tables in some
More information7. Prediction. Outline: Read Section 6.4. Mean Prediction
Outline: Read Section 6.4 II. Individual Prediction IV. Choose between y Model and log(y) Model 7. Prediction Read Wooldridge (2013), Chapter 6.4 2 Mean Prediction Predictions are useful But they are subject
More information4. Examples. Results: Example 4.1 Implementation of the Example 3.1 in SAS. In SAS we can use the Proc Model procedure.
4. Examples Example 4.1 Implementation of the Example 3.1 in SAS. In SAS we can use the Proc Model procedure. Simulate data from t-distribution with ν = 6. SAS: data tdist; do i = 1 to 500; y = tinv(ranuni(158),6);
More informationUsing EViews Vox Principles of Econometrics, Third Edition
Using EViews Vox Principles of Econometrics, Third Edition WILLIAM E. GRIFFITHS University of Melbourne R. CARTER HILL Louisiana State University GUAY С LIM University of Melbourne JOHN WILEY & SONS, INC
More informationNovember 9th-12th 2010, Port of Spain, Trinidad
By: Damie Sinanan and Dr. Roger Hosein 42nd Annual Monetary Studies Conference Financial Stability, Crisis Preparedness and Risk Management in the Caribbean November 9th-12th 2010, Port of Spain, Trinidad
More informationEconometrics Review questions for exam
Econometrics Review questions for exam Nathaniel Higgins nhiggins@jhu.edu, 1. Suppose you have a model: y = β 0 x 1 + u You propose the model above and then estimate the model using OLS to obtain: ŷ =
More informationEconometrics Multiple Regression Analysis: Heteroskedasticity
Econometrics Multiple Regression Analysis: João Valle e Azevedo Faculdade de Economia Universidade Nova de Lisboa Spring Semester João Valle e Azevedo (FEUNL) Econometrics Lisbon, April 2011 1 / 19 Properties
More informationECON Introductory Econometrics. Lecture 7: OLS with Multiple Regressors Hypotheses tests
ECON4150 - Introductory Econometrics Lecture 7: OLS with Multiple Regressors Hypotheses tests Monique de Haan (moniqued@econ.uio.no) Stock and Watson Chapter 7 Lecture outline 2 Hypothesis test for single
More informationAnalysis of Cross-Sectional Data
Analysis of Cross-Sectional Data Kevin Sheppard http://www.kevinsheppard.com Oxford MFE This version: November 8, 2017 November 13 14, 2017 Outline Econometric models Specification that can be analyzed
More informationRecent Advances in the Field of Trade Theory and Policy Analysis Using Micro-Level Data
Recent Advances in the Field of Trade Theory and Policy Analysis Using Micro-Level Data July 2012 Bangkok, Thailand Cosimo Beverelli (World Trade Organization) 1 Content a) Classical regression model b)
More informationStatistics and Quantitative Analysis U4320
Statistics and Quantitative Analysis U3 Lecture 13: Explaining Variation Prof. Sharyn O Halloran Explaining Variation: Adjusted R (cont) Definition of Adjusted R So we'd like a measure like R, but one
More informationFinal Exam - Solutions
Ecn 102 - Analysis of Economic Data University of California - Davis March 19, 2010 Instructor: John Parman Final Exam - Solutions You have until 5:30pm to complete this exam. Please remember to put your
More informationUnivariate linear models
Univariate linear models The specification process of an univariate ARIMA model is based on the theoretical properties of the different processes and it is also important the observation and interpretation
More informationG. S. Maddala Kajal Lahiri. WILEY A John Wiley and Sons, Ltd., Publication
G. S. Maddala Kajal Lahiri WILEY A John Wiley and Sons, Ltd., Publication TEMT Foreword Preface to the Fourth Edition xvii xix Part I Introduction and the Linear Regression Model 1 CHAPTER 1 What is Econometrics?
More informationProblem Set 2: Box-Jenkins methodology
Problem Set : Box-Jenkins methodology 1) For an AR1) process we have: γ0) = σ ε 1 φ σ ε γ0) = 1 φ Hence, For a MA1) process, p lim R = φ γ0) = 1 + θ )σ ε σ ε 1 = γ0) 1 + θ Therefore, p lim R = 1 1 1 +
More informationEconometrics -- Final Exam (Sample)
Econometrics -- Final Exam (Sample) 1) The sample regression line estimated by OLS A) has an intercept that is equal to zero. B) is the same as the population regression line. C) cannot have negative and
More informationPractice Questions for the Final Exam. Theoretical Part
Brooklyn College Econometrics 7020X Spring 2016 Instructor: G. Koimisis Name: Date: Practice Questions for the Final Exam Theoretical Part 1. Define dummy variable and give two examples. 2. Analyze the
More information1. You have data on years of work experience, EXPER, its square, EXPER2, years of education, EDUC, and the log of hourly wages, LWAGE
1. You have data on years of work experience, EXPER, its square, EXPER, years of education, EDUC, and the log of hourly wages, LWAGE You estimate the following regressions: (1) LWAGE =.00 + 0.05*EDUC +
More information10. Time series regression and forecasting
10. Time series regression and forecasting Key feature of this section: Analysis of data on a single entity observed at multiple points in time (time series data) Typical research questions: What is the
More informationApplied Statistics and Econometrics
Applied Statistics and Econometrics Lecture 7 Saul Lach September 2017 Saul Lach () Applied Statistics and Econometrics September 2017 1 / 68 Outline of Lecture 7 1 Empirical example: Italian labor force
More informationEmpirical Economic Research, Part II
Based on the text book by Ramanathan: Introductory Econometrics Robert M. Kunst robert.kunst@univie.ac.at University of Vienna and Institute for Advanced Studies Vienna December 7, 2011 Outline Introduction
More information1 Quantitative Techniques in Practice
1 Quantitative Techniques in Practice 1.1 Lecture 2: Stationarity, spurious regression, etc. 1.1.1 Overview In the rst part we shall look at some issues in time series economics. In the second part we
More informationThe Chi-Square Distributions
MATH 03 The Chi-Square Distributions Dr. Neal, Spring 009 The chi-square distributions can be used in statistics to analyze the standard deviation of a normally distributed measurement and to test the
More informationLecture 3: Multiple Regression
Lecture 3: Multiple Regression R.G. Pierse 1 The General Linear Model Suppose that we have k explanatory variables Y i = β 1 + β X i + β 3 X 3i + + β k X ki + u i, i = 1,, n (1.1) or Y i = β j X ji + u
More informationThe Chi-Square Distributions
MATH 183 The Chi-Square Distributions Dr. Neal, WKU The chi-square distributions can be used in statistics to analyze the standard deviation σ of a normally distributed measurement and to test the goodness
More informationECON 4230 Intermediate Econometric Theory Exam
ECON 4230 Intermediate Econometric Theory Exam Multiple Choice (20 pts). Circle the best answer. 1. The Classical assumption of mean zero errors is satisfied if the regression model a) is linear in the
More informationChapte The McGraw-Hill Companies, Inc. All rights reserved.
12er12 Chapte Bivariate i Regression (Part 1) Bivariate Regression Visual Displays Begin the analysis of bivariate data (i.e., two variables) with a scatter plot. A scatter plot - displays each observed
More information3. Linear Regression With a Single Regressor
3. Linear Regression With a Single Regressor Econometrics: (I) Application of statistical methods in empirical research Testing economic theory with real-world data (data analysis) 56 Econometrics: (II)
More information13. Time Series Analysis: Asymptotics Weakly Dependent and Random Walk Process. Strict Exogeneity
Outline: Further Issues in Using OLS with Time Series Data 13. Time Series Analysis: Asymptotics Weakly Dependent and Random Walk Process I. Stationary and Weakly Dependent Time Series III. Highly Persistent
More informationLinear Regression with 1 Regressor. Introduction to Econometrics Spring 2012 Ken Simons
Linear Regression with 1 Regressor Introduction to Econometrics Spring 2012 Ken Simons Linear Regression with 1 Regressor 1. The regression equation 2. Estimating the equation 3. Assumptions required for
More information5. Let W follow a normal distribution with mean of μ and the variance of 1. Then, the pdf of W is
Practice Final Exam Last Name:, First Name:. Please write LEGIBLY. Answer all questions on this exam in the space provided (you may use the back of any page if you need more space). Show all work but do
More informationLinear Regression. In this lecture we will study a particular type of regression model: the linear regression model
1 Linear Regression 2 Linear Regression In this lecture we will study a particular type of regression model: the linear regression model We will first consider the case of the model with one predictor
More informationIntroduction to Econometrics. Multiple Regression (2016/2017)
Introduction to Econometrics STAT-S-301 Multiple Regression (016/017) Lecturer: Yves Dominicy Teaching Assistant: Elise Petit 1 OLS estimate of the TS/STR relation: OLS estimate of the Test Score/STR relation:
More informationApplied Econometrics. Applied Econometrics. Applied Econometrics. Applied Econometrics. What is Autocorrelation. Applied Econometrics
Autocorrelation 1. What is 2. What causes 3. First and higher orders 4. Consequences of 5. Detecting 6. Resolving Learning Objectives 1. Understand meaning of in the CLRM 2. What causes 3. Distinguish
More informationLecture 11: Simple Linear Regression
Lecture 11: Simple Linear Regression Readings: Sections 3.1-3.3, 11.1-11.3 Apr 17, 2009 In linear regression, we examine the association between two quantitative variables. Number of beers that you drink
More informationChapter 1 Statistical Inference
Chapter 1 Statistical Inference causal inference To infer causality, you need a randomized experiment (or a huge observational study and lots of outside information). inference to populations Generalizations
More informationEconometrics Summary Algebraic and Statistical Preliminaries
Econometrics Summary Algebraic and Statistical Preliminaries Elasticity: The point elasticity of Y with respect to L is given by α = ( Y/ L)/(Y/L). The arc elasticity is given by ( Y/ L)/(Y/L), when L
More informationApplied Statistics and Econometrics
Applied Statistics and Econometrics Lecture 13 Nonlinearities Saul Lach October 2018 Saul Lach () Applied Statistics and Econometrics October 2018 1 / 91 Outline of Lecture 13 1 Nonlinear regression functions
More information5. Erroneous Selection of Exogenous Variables (Violation of Assumption #A1)
5. Erroneous Selection of Exogenous Variables (Violation of Assumption #A1) Assumption #A1: Our regression model does not lack of any further relevant exogenous variables beyond x 1i, x 2i,..., x Ki and
More informationSummary of OLS Results - Model Variables
Summary of OLS Results - Model Variables Variable Coefficient [a] StdError t-statistic Probability [b] Robust_SE Robust_t Robust_Pr [b] VIF [c] Intercept 12.722048 1.710679 7.436839 0.000000* 2.159436
More informationFinQuiz Notes
Reading 10 Multiple Regression and Issues in Regression Analysis 2. MULTIPLE LINEAR REGRESSION Multiple linear regression is a method used to model the linear relationship between a dependent variable
More informationWISE MA/PhD Programs Econometrics Instructor: Brett Graham Spring Semester, Academic Year Exam Version: A
WISE MA/PhD Programs Econometrics Instructor: Brett Graham Spring Semester, 2015-16 Academic Year Exam Version: A INSTRUCTIONS TO STUDENTS 1 The time allowed for this examination paper is 2 hours. 2 This
More informationLab 11 - Heteroskedasticity
Lab 11 - Heteroskedasticity Spring 2017 Contents 1 Introduction 2 2 Heteroskedasticity 2 3 Addressing heteroskedasticity in Stata 3 4 Testing for heteroskedasticity 4 5 A simple example 5 1 1 Introduction
More informationOne-Way ANOVA. Some examples of when ANOVA would be appropriate include:
One-Way ANOVA 1. Purpose Analysis of variance (ANOVA) is used when one wishes to determine whether two or more groups (e.g., classes A, B, and C) differ on some outcome of interest (e.g., an achievement
More information