Problem 1 (21 points)

An economist runs the regression

    y_i = β_0 + x_{1i} β_1 + x_{2i} β_2 + x_{3i} β_3 + ε_i        (1)

The results are summarized in the following table:

    Equation 1
    Variable                  Coefficient    Std. Error
    β_0                            0.82          1.04
    β_1                           -1.16          5.74
    β_2                            7.63          3.70
    β_3                           -2.57          0.94
    R-squared                      0.102
    Sum squared resid          10297
    Number of observations       100

(a) (3 points) Construct a 95% confidence interval for β_2 in Equation 1.
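Purely as an illustrative numerical sketch (not part of the exam paper), the interval in (a) can be formed from the reported coefficient and standard error. The snippet below assumes the usual estimate plus/minus critical value times standard error construction and uses a t(n − k) critical value, which is close to the normal 1.96 here.

    # Illustrative sketch only: 95% CI for beta_2 from the reported estimate and SE.
    # Assumes the standard "estimate +/- critical value * SE" construction.
    from scipy import stats

    b2, se2 = 7.63, 3.70                  # coefficient and std. error from the Equation 1 table
    n, k = 100, 4                         # observations and estimated coefficients (incl. intercept)
    crit = stats.t.ppf(0.975, df=n - k)   # t critical value; roughly 1.96 under a normal approximation
    print((b2 - crit * se2, b2 + crit * se2))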

(b) (3 points) Test whether β_1 in Equation 1 equals 0 (against the alternative that it differs from 0). Test at a 5% level of significance.

(c) (6 points) How would you test whether β_1 = β_2 = β_3 = 0 in Equation 1? How does your answer depend on the assumptions made on ε_i?
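Likewise, a hedged sketch for (b) and one common homoskedasticity-based route for (c): a t-ratio for β_1 and an R²-based F statistic for the joint restriction. Under heteroskedastic ε_i a robust Wald statistic would be used instead, which is part of what (c) asks about.

    # Illustrative sketch: t-ratio for beta_1 and an R^2-based F statistic for beta_1 = beta_2 = beta_3 = 0.
    # Assumes homoskedastic errors; with heteroskedasticity one would use robust (Wald) versions instead.
    from scipy import stats

    n, k, q = 100, 4, 3                  # observations, coefficients, number of restrictions
    t_stat = -1.16 / 5.74                # coefficient / std. error for beta_1
    r2 = 0.102
    f_stat = (r2 / q) / ((1 - r2) / (n - k))
    print(t_stat, abs(t_stat) > stats.t.ppf(0.975, n - k))   # reject if |t| exceeds the critical value
    print(f_stat, f_stat > stats.f.ppf(0.95, q, n - k))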

Using the same data, the economist also runs the regression

    y_i = β_0 + x_{2i} β_2 + ν_i        (2)

Some of the results are summarized in the following table:

    Equation 2
    Variable                  Coefficient    Std. Error
    β_0                            0.83          1.07
    β_2                            6.86          3.81
    R-squared                      ?
    Sum squared resid          11095
    Number of observations       100

(d) (9 points) What is R² in Equation 2?
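One possible route to (d), sketched under two assumptions: that R² = 1 − SSR/SST, and that Equations 1 and 2 share the same dependent variable, so the total sum of squares can be backed out of the Equation 1 output and reused.

    # Illustrative sketch: recover SST from Equation 1 (R^2 = 1 - SSR/SST), then reuse it for Equation 2.
    # Assumes both regressions have the same dependent variable, so SST is common to (1) and (2).
    sst = 10297 / (1 - 0.102)      # total sum of squares implied by the Equation 1 output
    r2_eq2 = 1 - 11095 / sst
    print(sst, r2_eq2)             # roughly 11467 and 0.03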

Problem 2 (14 points)

Suppose that the true relationship between four variables y, x, z and w is given by

    y_i = δ_0 + δ_1 z_i + δ_2 x_i + u_i        (3)
    z_i = α_0 + α_1 x_i + α_2 w_i + v_i        (4)

where the unobserved error terms u_i and v_i both have mean 0 and are uncorrelated with x_i and w_i. It is not assumed that u_i and v_i are uncorrelated. Answer the questions below under the assumption of random sampling of (y_i, x_i, z_i, w_i) (i.e., the observations are i.i.d.).

(a) (7 points) Suppose that you regress y_i on a constant, z_i and x_i. Are the regression coefficients on z_i and x_i consistent estimators of δ_1 and δ_2? Explain. If not, how would you estimate δ_1 and δ_2? (State exactly what additional assumptions, if any, you need to make.)
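If one concludes in (a) that z_i may be endogenous in (3) (it depends on v_i, which is allowed to be correlated with u_i), a natural estimator to consider is 2SLS using w_i as an instrument for z_i, which additionally needs the relevance condition α_2 ≠ 0. The sketch below is illustrative only; the array names y, x, z, w are hypothetical stand-ins for the sample.

    # Illustrative 2SLS sketch (not an official solution): instrument z with w, treating x as exogenous.
    # Assumes arrays y, x, z, w of length n are available; requires the relevance condition alpha_2 != 0.
    import numpy as np

    def tsls(y, x, z, w):
        n = len(y)
        X = np.column_stack([np.ones(n), z, x])          # regressors in (3): constant, z, x
        Z = np.column_stack([np.ones(n), w, x])          # instruments: constant, w, x
        Xhat = Z @ np.linalg.lstsq(Z, X, rcond=None)[0]  # first stage: project the regressors on the instruments
        return np.linalg.lstsq(Xhat, y, rcond=None)[0]   # second stage: estimates of (delta_0, delta_1, delta_2)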

(b) (7 points) Assume that w_i and x_i are independent, and consider the regression

    y_i = β_0 + β_1 x_i + error.

Interpret the coefficients β_0 and β_1 in terms of the coefficients in (3) and (4).
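A hedged algebraic sketch of one way to approach (b), not an official solution: substitute (4) into (3) and read off the implied projection of y_i on (1, x_i).

    y_i = δ_0 + δ_1(α_0 + α_1 x_i + α_2 w_i + v_i) + δ_2 x_i + u_i
        = (δ_0 + δ_1 α_0) + (δ_1 α_1 + δ_2) x_i + (δ_1 α_2 w_i + δ_1 v_i + u_i).

Because w_i is independent of x_i and u_i, v_i are uncorrelated with x_i, the composite error term is uncorrelated with x_i, so β_1 corresponds to δ_1 α_1 + δ_2, while β_0 absorbs δ_0 + δ_1 α_0 together with the mean of δ_1 α_2 w_i.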

Problem 3 (15 points)

Consider the model

    y_i = z_i δ + ε_i,    with E[ε_i z_i] = 0.

Moreover, assume that there is another variable x_i such that E[ε_i x_i] = 0. Both z_i and x_i are one-dimensional. Assume that assumptions (3.1)-(3.5) of Hayashi are satisfied and that

    E[ (x_i, z_i)' (x_i, z_i) ] = [ 3  1 ]
                                  [ 1  3 ]

and E[ε_i² | x_i, z_i] = 1.

(a) (5 points) Find the asymptotic distribution of the OLS estimator in a regression of y_i on z_i.
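For reference, an illustrative formula sketch rather than an official answer: under i.i.d. sampling and the stated moment conditions, the usual sandwich expression for OLS on z_i alone is

    √n (δ̂_OLS − δ) →_d N( 0, E[z_i²]^{-1} E[z_i² ε_i²] E[z_i²]^{-1} ),

and conditional homoskedasticity E[ε_i² | x_i, z_i] = 1 collapses the middle term to E[z_i²], so the asymptotic variance reduces to 1/E[z_i²], which can be read off the matrix above.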

(b) (5 points) Find the asymptotic distribution of the 2SLS estimator of δ that uses x_i as instrument.

(c) (5 points) Find the asymptotic distribution of the OLS estimator in a regression of y_i on x_i.

Problem 4 (10 points)

Consider two linear regression models

    y_t = x_t'β + ε_t    and    w_t = z_t'δ + ν_t.

Assume that the standard assumptions for asymptotics in regression models (Hayashi assumptions 2.1-2.5) are satisfied for each of the models. Suppose that β̂ and δ̂ are consistent estimators of β and δ (but they are not necessarily the OLS estimators). Is

    T^{-1} Σ_{t=1}^{T} (y_t − x_t'β̂)(w_t − z_t'δ̂)

necessarily a consistent estimator of the covariance between ε_t and ν_t? Explain.

Problem 5 (15 points)

Suppose Y_i | µ ~ iid N(µ, 1), i = 1, ..., n, with µ ~ N(2, 1) and L(µ̂, µ) = (µ̂ − µ)². Suppose n = 1.

(a) (3 points) What is the MLE of µ? (You do not need to derive the answer; you may just state it.)

(b) (3 points) Show the Bayes risk of the MLE. (You do not need to derive the answer; just state it.)

(c) (3 points) Show the posterior distribution for µ. (You do not need to derive the answer; you may just state it.)

(d) (3 points) Show the Bayes estimator of µ. (You do not need to derive the answer; you may just state it.)

(e) (3 points) Show the Bayes risk of the Bayes estimator. (You do not need to derive the answer; you may just state it.)
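As background for (c)-(e), a reminder of the standard normal-normal conjugate update, stated generally rather than as the intended answer: with prior µ ~ N(µ_0, τ²) and Y_1, ..., Y_n | µ ~ iid N(µ, σ²),

    µ | Y_1, ..., Y_n ~ N( (µ_0/τ² + nȲ/σ²) / (1/τ² + n/σ²),  1 / (1/τ² + n/σ²) ).

Under squared-error loss the Bayes estimator is the posterior mean, and its Bayes risk equals the expected posterior variance.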

Problem 6 (5 points)

X, Y, U are independent random variables: X ~ N(5, 4), Y ~ χ²_1, and U is distributed Bernoulli with P(U = 1) = 0.3. Let W = UX + (1 − U)Y. Find the mean and variance of W.
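A small Monte Carlo sketch that can be used to sanity-check a closed-form answer obtained via the law of total expectation and total variance; it assumes the second parameter of N(5, 4) is the variance, and the number of draws is arbitrary.

    # Illustrative Monte Carlo check for E[W] and Var(W), with W = U*X + (1-U)*Y.
    # The exam answer would come from the law of total expectation/variance; this only simulates it.
    import numpy as np

    rng = np.random.default_rng(0)
    m = 1_000_000                    # arbitrary number of draws
    X = rng.normal(5, 2, m)          # N(5, 4), taking 4 as the variance (standard deviation 2)
    Y = rng.chisquare(1, m)          # chi-squared with 1 degree of freedom
    U = rng.binomial(1, 0.3, m)      # Bernoulli(0.3)
    W = U * X + (1 - U) * Y
    print(W.mean(), W.var())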

Problem 7 (10 points)

Y_i are iid Bernoulli random variables with P(Y_i = 1) = p. I am interested in H_0: p = 1/2 versus H_a: p > 1/2. The sample size is n = 100, and I decide to reject the null hypothesis if Σ_{i=1}^{n} Y_i > 55.

(a) (5 points) Use the central limit theorem to derive the approximate size of the test.
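An illustrative numerical check of the normal approximation for (a); whether to use a continuity correction is left open, and the sketch omits it.

    # Illustrative sketch: approximate size via the CLT, P(sum Y_i > 55) under p = 1/2.
    # Under H_0, sum Y_i has mean 50 and standard deviation sqrt(100 * 0.5 * 0.5) = 5; no continuity correction.
    from scipy import stats
    print(stats.norm.sf((55 - 50) / 5))     # upper-tail normal probability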

(b) (5 points) Suppose p = 0.60. Use the central limit theorem to derive the power of the test.
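The analogous sketch for (b), again without a continuity correction.

    # Illustrative sketch: approximate power at p = 0.60, where sum Y_i has mean 60 and sd sqrt(100 * 0.6 * 0.4).
    from scipy import stats
    sd = (100 * 0.6 * 0.4) ** 0.5
    print(stats.norm.sf((55 - 60) / sd))    # probability of rejecting when p = 0.60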

Problem 8 (30 points)

Suppose Y_i | µ ~ iid N(µ, 1), i = 1, ..., n. Suppose that you know µ ≤ 5. Let µ̂_MLE denote the value of µ that maximizes the likelihood, subject to the constraint µ ≤ 5, and let Ȳ denote the sample mean of the Y_i.

(a) (12 points) Show that µ̂_MLE = a(Ȳ)·5 + (1 − a(Ȳ))·Ȳ, where a(Ȳ) = 1 for Ȳ ≥ 5, and a(Ȳ) = 0 otherwise. (Hint: you might find it useful to write (y_i − µ) = (y_i − Ȳ) + (Ȳ − µ).)
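A hedged sketch of how the hint is typically used (only the expansion step, not the full argument required in (a)):

    Σ_i (y_i − µ)² = Σ_i (y_i − Ȳ)² + n(Ȳ − µ)²,

since the cross term Σ_i (y_i − Ȳ)(Ȳ − µ) vanishes. Maximizing the likelihood over µ ≤ 5 therefore amounts to minimizing (Ȳ − µ)² subject to µ ≤ 5.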

(b) (14 points) Suppose that µ = 3.

(b.i) (6 points) Show that a(Ȳ) →_p 0. (Hint: What is P(a(Ȳ) = 0)?)

(b.ii) (4 points) Show that µ̂_MLE →_p µ.

(b.iii) (4 points) Show that √n (µ̂_MLE − µ) →_d N(0, 1).

(c) (4 points) Suppose that µ = 5. Does µ̂_MLE →_p µ? Explain.