Econometrics: Review Questions for the Exam

Nathaniel Higgins, nhiggins@jhu.edu

1. Suppose you have the model
   y = β₀x₁ + u.
   You propose the model above and then estimate it by OLS to obtain
   ŷ = β̂₀x₁.
   Label each of the items in the preceding two equations as either random, potentially random, or definitely deterministic.

2. Your computer is broken. You need to determine the OLS estimator for a simple model: y = β₀ + β₁x₁ + u. Luckily, the data set is small; it is displayed below. To the nearest whole digit (no need for decimals), find the OLS estimates β̂₀ and β̂₁. (A quick numerical check is sketched below, after question 3.)

       y   x
       2   1
       5   2
       6   3
       7   3
       3   1

3. A Type II error refers to which of the following cases?
   (a) rejecting the null when the null is false
   (b) rejecting the null when the null is true
   (c) failing to reject the null when the null is false
   (d) failing to reject the null when the null is true
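If you want to double-check your hand calculation for question 2, here is a minimal numerical sketch (Python with numpy is assumed; the data are just the five (y, x) pairs from the table above):

```python
import numpy as np

# The five (y, x) pairs from question 2.
y = np.array([2.0, 5.0, 6.0, 7.0, 3.0])
x = np.array([1.0, 2.0, 3.0, 3.0, 1.0])

# Simple-regression OLS formulas:
#   slope     = sum((x - xbar)(y - ybar)) / sum((x - xbar)^2)
#   intercept = ybar - slope * xbar
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
print(b0, b1)
```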

4. Which of the following statements about the OLS residuals is not true? (A numerical check is sketched below, after question 9.)
   (a) The sample covariance between the residuals and the regressors equals zero.
   (b) The sum of the residuals equals zero.
   (c) The sample covariance between the residuals and the dependent variable equals zero.
   (d) The sample average of the residuals equals zero.

5. Which expression correctly describes the predicted value (ŷ) from a multivariate regression model?
   (a) β₀ + β₁x₁ + β₂x₂ + u
   (b) β̂₀ + β̂₁x₁ + β̂₂x₂ + û
   (c) β₀ + β₁x₁ + β₂x₂
   (d) β̂₀ + β̂₁x₁ + β̂₂x₂

6. True or false: In the multiple regression model, the goodness-of-fit measure R² always increases or remains the same when an additional explanatory variable is added.

7. True or false: If the error term in a linear regression model is normally distributed, then the OLS estimator is the best linear unbiased estimator.

8. True or false, and explain: Suppose you are interested in estimating the relationship between birth weight and prenatal visits. You have data on birth weight (in pounds), the number of prenatal visits, and the age and race of the mother. However, you do not observe family income. As a result, the coefficient on prenatal visits will have a positive bias.

9. True or false, and explain: If there is a lot of variation in the data for a regressor, then the variance of the coefficient on that regressor will be smaller than if there is little variation in the data for that regressor.
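For question 4, the algebraic properties of OLS residuals can also be checked numerically. The sketch below uses simulated data (the data-generating coefficients are arbitrary assumptions, not taken from any question) and prints the relevant sample moments; inspect which of them come out numerically zero.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 0.5 * x2 + rng.normal(size=n)

# OLS with an intercept, via least squares.
X = np.column_stack([np.ones(n), x1, x2])
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta_hat

print(resid.mean())              # sample average of the residuals
print(np.cov(resid, x1)[0, 1])   # sample covariance with a regressor
print(np.cov(resid, y)[0, 1])    # sample covariance with the dependent variable
```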

10. Suppose we are interested in estimating a simple model that relates the number of crimes on college campuses (crime) to student enrollment (enroll).
    (a) Set up a constant-elasticity (log-log) model that explains the number of crimes on a college campus using student enrollment.
    (b) Suppose that after running the model from part (a) on a random sample of 500 colleges, you find a coefficient of 1.27 (standard error 0.11) on the variable associated with enrollment. Interpret this coefficient.
    (c) Next you would like to know whether crime is more of a problem on larger campuses than on smaller campuses. Write down a hypothesis test that examines whether a one percent increase in student enrollment is associated with an increase of more than one percent in the number of crimes.
    (d) Larger schools may be located in areas with higher crime rates, e.g., urban centers. Discuss the potential for omitted variable bias in the model from part (a). Derive the sign of the bias; be explicit about your assumptions.

11. Suppose that a regression of college GPA (in points) on high school GPA (in points), a dummy for male, and a dummy for computer ownership yields the following 95% confidence interval (reported by Stata) for the estimated slope coefficient on high school GPA: β̂₁ ∈ (0.292, 0.667). What would be the result of a test of the null hypothesis that β₁ = 0 against a two-sided alternative?

12. Two explanatory variables are included in a multivariate model, and they are positively correlated. You run an OLS regression and obtain
    ŷ = β̂₀ + β̂₁x₁ + β̂₂x₂.
    You then run a regression of the same dependent variable on each of the independent variables individually. Call the coefficients on x₁ and x₂ in these individual regressions δ̂₁ and δ̂₂, respectively. The mean and standard deviation of each variable are shown in the table below.

        variable   mean   std. dev.
        x₁         5      2
        x₂         3      6

    Which of the coefficients changes more? I.e., which is bigger in absolute value: |β̂₁ − δ̂₁| or |β̂₂ − δ̂₂|?
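Question 12 can be explored by simulation. The sketch below draws positively correlated regressors with roughly the means and standard deviations from the table (the true coefficients and the correlation of 0.6 are arbitrary assumptions), then compares each multivariate coefficient with its counterpart from the one-regressor models.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000
# Regressors with roughly the means / std. devs. from the table in question 12,
# positively correlated with each other (corr ~ 0.6, an arbitrary choice).
x1 = 5 + 2 * rng.normal(size=n)
x2 = 3 + 6 * (0.6 * (x1 - 5) / 2 + 0.8 * rng.normal(size=n))
# Arbitrary "true" coefficients for the simulated outcome.
y = 1.0 + 1.5 * x1 + 0.5 * x2 + rng.normal(size=n)

def ols(cols, y):
    """OLS with an intercept; returns the estimated coefficient vector."""
    X = np.column_stack([np.ones(len(y))] + list(cols))
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    return b

b = ols([x1, x2], y)      # beta_hat_1 = b[1], beta_hat_2 = b[2]
d1 = ols([x1], y)[1]      # delta_hat_1 from the x1-only regression
d2 = ols([x2], y)[1]      # delta_hat_2 from the x2-only regression
print(abs(b[1] - d1), abs(b[2] - d2))
```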

13. Suppose your R² is 0.30 and the explained sum of squares is 16. What is the unexplained sum of squares? (A small helper for checking the arithmetic is sketched below, after question 18.)

14. Consider two estimated slope coefficients (β̂₁ and δ̂₁) that result from the following regression models:
    y = β₀ + β₁x₁ + u
    y = δ₀ + δ₁x₁ + δ₂x₂ + v
    Explain the circumstances under which β̂₁ = δ̂₁.

15. E(GPA | SAT) = 0.70 + 0.002·SAT. The mean and standard deviation of GPA and SAT are:

        variable   mean   std. dev.
        GPA        2.9    0.45
        SAT        1050   300

    What is the unconditional expectation of GPA?

16. Consider two estimators of y: z₁ and z₂. z₁ is unbiased; z₂ is biased. The variance of z₁ is 3; the variance of z₂ is 1. Which estimator do you prefer? Why?

17. When building models like y = β₀ + β₁x₁ + u, we usually assume that E(u) = 0. When we estimate the model above by OLS, is it important that E(u) = 0? What about E(u | x₁)? Can E(u | x₁) be equal to something other than 0?

18. What is multicollinearity? Why is it scary? Why is it not that scary?
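For the arithmetic in question 13, it may help to write out the sum-of-squares identities in Wooldridge's notation (SSE = explained, SSR = residual or "unexplained", SST = SSE + SSR, and R² = SSE/SST). A tiny helper for sanity-checking that kind of calculation, assuming R² and the explained sum of squares are as given, is:

```python
def unexplained_ss(r_squared: float, explained_ss: float) -> float:
    """Return the residual (unexplained) sum of squares from R^2 and the
    explained sum of squares, using SST = SSE / R^2 and SSR = SST - SSE."""
    total_ss = explained_ss / r_squared
    return total_ss - explained_ss

# Example: plug in the numbers from question 13 to check your answer.
print(unexplained_ss(0.30, 16.0))
```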

19. If Stata reports that the p-value associated with a variable in an OLS regression is 0.06, would you reject the null hypothesis that the coefficient associated with this variable is equal to 0 at the 10% level?

20. Recall that the variance of the OLS estimator is given by
    Var(β̂ⱼ) = σ² / [SSTⱼ (1 − Rⱼ²)].
    Suppose the std. dev. of xⱼ, an independent variable in an OLS regression model, was 3.45. You then went and got more data (you increased N). After adding more observations, the std. dev. of xⱼ in your sample was 3.0. What happens to the variance of β̂ⱼ?

21. Suppose you estimated a model by OLS and obtained the equation ŷ = 10 + 4x₁ − 6x₂. What happens to β̂₁ if you multiply x₁ by 10? (A quick simulation is sketched below, after question 22.)

22. Suppose that you have estimated a model by OLS and you obtain residuals (predictions of the unobservable component of the model, usually denoted by u). The table of residuals, with one value missing, is:

        −0.6
         0.4
        −0.6
         0.4
          ?

    What is the value of the missing residual?
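For question 21, the effect of rescaling a regressor can be confirmed quickly by simulation. The sketch below treats the estimated equation from the question as the true data-generating process (an assumption made only so the numbers look familiar) and re-runs OLS after multiplying x₁ by 10.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
# Simulated data using the coefficients from the estimated equation in question 21.
y = 10 + 4 * x1 - 6 * x2 + rng.normal(size=n)

def fit(x1_col):
    """OLS of y on an intercept, the (possibly rescaled) first regressor, and x2."""
    X = np.column_stack([np.ones(n), x1_col, x2])
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    return b

print(fit(x1))       # intercept, coefficient on x1, coefficient on x2
print(fit(10 * x1))  # same regression with x1 multiplied by 10
```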