Econometrics II - EXAM. Answer each question on a separate sheet. Time allowed: three hours.


1. Let $u_1$ and $u_2$ be jointly Gaussian and independent of $z$ in all the equations.

(a) Investigate the identification of the following system of simultaneous equations,
$$y_1 + \gamma_1 y_2 + \delta_{11} + \delta_{12} z = u_1,$$
$$y_2 + \delta_{21} + \delta_{22} z = u_2.$$
How does your answer change if $\delta_{12} = 0$ but this restriction is ignored? If some of the equations are not identified, propose a minimal set of additional restrictions that would exactly identify them.

(b) Obtain the reduced form for the following system with nonlinear endogenous variables,
$$y_1 + \gamma_1 y_2 + \delta_{11} + \delta_{12} z = u_1, \qquad \log y_2 + \delta_{21} + \delta_{22} z = u_2,$$
and check that it produces the following conditional expectations,
$$E[y_1 \mid z] = -\gamma_1 \exp\{\tfrac{1}{2}\sigma_2^2 - \delta_{21} - \delta_{22} z\} - \delta_{11} - \delta_{12} z, \qquad E[y_2 \mid z] = \exp\{\tfrac{1}{2}\sigma_2^2 - \delta_{21} - \delta_{22} z\},$$
where $\sigma_2^2 = \mathrm{Var}[u_2]$, by using that for a Gaussian r.v. $X \sim N(0, \sigma^2)$ we have $E[e^X] = \exp\{\tfrac{1}{2}\sigma^2\}$.

(c) Analyze the identification of the two equations of this nonlinear system, proposing appropriate instrumental variables (IV) and studying the corresponding moment conditions needed.

(d) If $\delta_{22}$ were known, which one would be your optimal choice of IVs for the first equation? Provide the asymptotic variance of the 2SLS estimates for this equation.

(e) Analyze the identification when $\delta_{22} = 0$.

2. Consider the time series model
$$y_t = \beta y_{t-1} + v_t, \qquad v_t = \theta e_{t-1} + e_t,$$
wherein $v_t$ is a sequence of disturbances generated by a first-order moving-average process driven by a white-noise sequence $e_t$ of independently and identically distributed random variables with zero mean and variance $\sigma_e^2$.

(a) Assuming that $|\beta| < 1$, obtain the first three elements of the autocovariance sequence of $y_t$, namely $\gamma_y(0)$, $\gamma_y(1)$ and $\gamma_y(2)$. For which values of $\theta$ is $y_t$ stationary?

(b) Show that the estimate of $\beta$ obtained by applying the ordinary least-squares procedure to the first equation would tend to the value
$$\beta + \frac{\theta\,(1-\beta^2)}{1 + \theta^2 + 2\beta\theta}$$
as the size of the sample increases.

(c) Propose a test of $H_0: \theta = 0$.

(d) Imagine that the model $y_t = \beta_1 y_{t-1} + \beta_2 y_{t-2} + \mathrm{error}_t$ is fitted to the data by OLS. What values would you expect to obtain for $\beta_1$ and $\beta_2$ in the limit, as the size of the sample increases indefinitely, when $\beta = 0$?

(e) If $\theta \neq 0$ were known, propose a consistent estimate of $\beta$ and find its asymptotic distribution.

3. Consider the following nonlinear regression model,
$$y = h(z; \theta_0) + v,$$
with $E[y \mid z] = h(z; \theta_0)$, where the usual identification conditions hold.

(a) Consider the estimation by OLS of the misspecified linear regression model $y = z'\beta + u$, and find the solution for $\beta$ that minimizes the variance of $u$. Would the OLS estimate of $\beta$ be consistent for it? And for $\theta_0$?

(b) Consider the following linearization around an arbitrary value $\bar\theta$,
$$y = h(z; \theta_0) + v \approx h(z; \bar\theta) + \frac{\partial h(z; \bar\theta)}{\partial \theta'}(\theta_0 - \bar\theta) + v, \qquad (3)$$
and set the following linear regression model,
$$y^* = z^{*\prime}\theta_0 + \mathrm{error}, \qquad (4)$$
where
$$y^* = y - h(z; \bar\theta) + \frac{\partial h(z; \bar\theta)}{\partial \theta'}\,\bar\theta, \qquad z^* = \frac{\partial h(z; \bar\theta)}{\partial \theta}.$$
Assuming that the approximation in (3) is perfect if we include the further term $a = (\theta_0 - \bar\theta)'\,\frac{\partial^2 h(z; \dot\theta)}{\partial\theta\,\partial\theta'}\,(\theta_0 - \bar\theta)$, for some intermediate value $\dot\theta$, study the probability limit of the OLS estimate $\tilde\theta_n$ of $\theta_0$ in (4).

(c) Show that if $\bar\theta \to_p \theta_0$, then $\tilde\theta_n \to_p \theta_0$ under standard identification assumptions, and that $\tilde\theta_n$ corresponds to a Gauss-Newton iteration of Nonlinear Least Squares (NLS) starting at $\theta = \bar\theta$.

(d) Assume that $h(z; \theta) = h(z'\theta)$ and that $z'\theta = z_1'\theta_1 + z_2'\theta_2$. Consider testing the hypothesis $H_0: \theta_{10} = 0$, under the condition that $E[v^2 \mid z] = \sigma^2$, by means of a Wald test based on NLS estimates. Provide a test statistic and its asymptotic distribution under the null.

(e) Find the distribution of the restricted estimate of $\theta_0$ under $H_0$ and compare it with that of the unrestricted one. Propose now a Lagrange Multiplier test for $H_0$ and show that this amounts to a linear regression of restricted NLS residuals on an appropriate set of regressors.

Econometrics II - EXAM SOLUTIONS

1. Let $u_1$ and $u_2$ be jointly Gaussian and independent of $z$ in all the equations.

(a) Investigate the identification of the following system of simultaneous equations,
$$y_1 + \gamma_1 y_2 + \delta_{11} + \delta_{12} z = u_1,$$
$$y_2 + \delta_{21} + \delta_{22} z = u_2.$$
Second equation: it is a reduced form, so it is always identified. First equation: the order condition is not satisfied (there are no omitted exogenous variables), so it is not identified in any case.

How does your answer change if $\delta_{12} = 0$ but this is ignored? If the restriction is ignored and not used, it does not help. If it is known and imposed, then the first equation is identified provided $\delta_{22} \neq 0$.

If some of the equations are not identified, propose a minimal set of additional restrictions that would exactly identify them. Alternative conditions would be that it is known that $\delta_{12} = 0$, or that $\delta_{11} = 0$, or that $\mathrm{Cov}(u_1, u_2) = 0$.

(b) Obtain the reduced form for the following system with nonlinear endogenous variables,
$$y_1 + \gamma_1 y_2 + \delta_{11} + \delta_{12} z = u_1, \qquad \log y_2 + \delta_{21} + \delta_{22} z = u_2,$$
and check that it produces the conditional expectations
$$E[y_1 \mid z] = -\gamma_1 \exp\{\tfrac{1}{2}\sigma_2^2 - \delta_{21} - \delta_{22} z\} - \delta_{11} - \delta_{12} z, \qquad E[y_2 \mid z] = \exp\{\tfrac{1}{2}\sigma_2^2 - \delta_{21} - \delta_{22} z\},$$
where $\sigma_2^2 = \mathrm{Var}[u_2]$, by using that for a Gaussian r.v. $X \sim N(0, \sigma^2)$ we have $E[e^X] = \exp\{\tfrac{1}{2}\sigma^2\}$.

Using that $y_2 = \exp\{u_2 - \delta_{21} - \delta_{22} z\}$, the reduced form is given by
$$\begin{pmatrix} y_1 \\ y_2 \end{pmatrix} = \begin{pmatrix} u_1 - \gamma_1 \exp\{u_2 - \delta_{21} - \delta_{22} z\} - \delta_{11} - \delta_{12} z \\ \exp\{u_2 - \delta_{21} - \delta_{22} z\} \end{pmatrix},$$
so that
$$E[y_2 \mid z] = \exp\{-\delta_{21} - \delta_{22} z\}\, E[\exp\{u_2\} \mid z] = \exp\{\tfrac{1}{2}\sigma_2^2 - \delta_{21} - \delta_{22} z\},$$
and $E[y_1 \mid z]$ follows by taking expectations in the first row.
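As a quick numerical check of the reduced form just derived (not part of the original exam), the sketch below simulates the nonlinear system and compares simulated conditional means with the closed-form expressions. All parameter values are illustrative assumptions, and $u_1$, $u_2$ are drawn independently since the conditional-mean formulas do not involve their covariance.

```python
# Monte Carlo check of E[y2|z] = exp(0.5*sigma2^2 - d21 - d22*z) and the
# corresponding formula for E[y1|z]; parameter values are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
gamma1, d11, d12, d21, d22 = 0.5, 1.0, -0.3, 0.2, 0.8
sigma1, sigma2 = 1.0, 0.6
n = 1_000_000

for z in (-1.0, 0.0, 1.0):                    # condition on a few fixed values of z
    u1 = sigma1 * rng.standard_normal(n)
    u2 = sigma2 * rng.standard_normal(n)
    y2 = np.exp(u2 - d21 - d22 * z)           # reduced form of the second equation
    y1 = u1 - gamma1 * y2 - d11 - d12 * z     # reduced form of the first equation
    Ey2 = np.exp(0.5 * sigma2**2 - d21 - d22 * z)
    Ey1 = -gamma1 * Ey2 - d11 - d12 * z
    print(f"z={z:+.1f}  E[y2|z]: sim={y2.mean():.4f} theory={Ey2:.4f}   "
          f"E[y1|z]: sim={y1.mean():.4f} theory={Ey1:.4f}")
```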

(c) Analyze the identification of the two equations of the nonlinear system in (b), proposing appropriate instrumental variables (IV) and studying the corresponding moment conditions needed.

The second equation is still identified, simply defining $y_2^* = \log y_2$. To identify the first equation we can now use $E[y_2 \mid z]$ and expand $\exp\{-\delta_{22} z\}$ in a power series, so that any higher power of $z$ can serve as an IV for $y_2$ in the first equation. Set $Z = (1, z, z^2)'$ and $X = (1, z, y_2)'$. Then we need $\mathrm{rank}\{E[ZX']\} = 3$, where, with $a = \exp\{\tfrac{1}{2}\sigma_2^2 - \delta_{21}\}$,
$$E[ZX'] = E\begin{pmatrix} 1 & z & y_2 \\ z & z^2 & z y_2 \\ z^2 & z^3 & z^2 y_2 \end{pmatrix} = E\begin{pmatrix} 1 & z & a\, e^{-\delta_{22} z} \\ z & z^2 & a\, z\, e^{-\delta_{22} z} \\ z^2 & z^3 & a\, z^2 e^{-\delta_{22} z} \end{pmatrix}.$$

(d) If $\delta_{22}$ were known, which one would be your optimal choice of IVs for the first equation? Provide the asymptotic variance of the 2SLS estimates for this equation.

In general the optimal IVs are $Z^* = \Omega(z)^{-1} E[X \mid z] \propto E[X \mid z] = (1, z, E[y_2 \mid z])'$, since $\Omega(z) = E[u_1^2 \mid z] = \sigma_1^2$ is constant here. Therefore in this case the optimal choice would be to take as extra IV $\exp\{-\delta_{22} z\}$ itself, so $Z^* = (1, z, e^{-\delta_{22} z})'$ and
$$E[Z^* X'] = E\begin{pmatrix} 1 & z & y_2 \\ z & z^2 & z y_2 \\ e^{-\delta_{22} z} & z\, e^{-\delta_{22} z} & e^{-\delta_{22} z}\, y_2 \end{pmatrix} = E\begin{pmatrix} 1 & z & a\, e^{-\delta_{22} z} \\ z & z^2 & a\, z\, e^{-\delta_{22} z} \\ e^{-\delta_{22} z} & z\, e^{-\delta_{22} z} & a\, e^{-2\delta_{22} z} \end{pmatrix},$$
which produces almost the same result as OLS. The corresponding 2SLS (optimal IV) estimates of the first equation have asymptotic variance $\sigma_1^2 \{E[\,E(X \mid z)\, E(X \mid z)'\,]\}^{-1}$, the analogue of the OLS formula with $X$ replaced by its conditional mean.

(e) Analyze the identification when $\delta_{22} = 0$. In this case
$$E[ZX'] = E\begin{pmatrix} 1 & z & a \\ z & z^2 & a z \\ z^2 & z^3 & a z^2 \end{pmatrix},$$
and clearly the first and third columns are proportional, so the rank is at most 2. Identification relies on $z$ actually appearing in the second equation.
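The identification argument in (c)-(d) can be illustrated by simulation (this is an added sketch, not part of the exam). Below, the first equation is estimated by just-identified IV, once with the polynomial instruments $(1, z, z^2)$ and once with the extra instrument $e^{-\delta_{22} z}$ treated as known. The functional form is the one in the exam, but all parameter values, and the correlation between $u_1$ and $u_2$ that creates the endogeneity, are hypothetical.

```python
# IV estimation of the first equation y1 = X'b + u1, X = (1, z, y2)', with true
# coefficients b = (-d11, -d12, -gamma1); compare OLS, polynomial IV and the
# "optimal" IV that uses knowledge of d22. Parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(1)
gamma1, d11, d12, d21, d22 = 0.5, 1.0, -0.3, 0.2, 0.8
sigma1, sigma2, rho = 1.0, 0.6, 0.4        # rho = Corr(u1, u2) makes y2 endogenous

n = 200_000
z = rng.standard_normal(n)
u2 = sigma2 * rng.standard_normal(n)
u1 = sigma1 * (rho * u2 / sigma2 + np.sqrt(1 - rho**2) * rng.standard_normal(n))
y2 = np.exp(u2 - d21 - d22 * z)
y1 = u1 - gamma1 * y2 - d11 - d12 * z

X = np.column_stack([np.ones(n), z, y2])

def iv(Z):
    # just-identified IV: solve the sample moment conditions E_n[Z (y1 - X'b)] = 0
    return np.linalg.solve(Z.T @ X, Z.T @ y1)

Z_poly = np.column_stack([np.ones(n), z, z**2])
Z_opt  = np.column_stack([np.ones(n), z, np.exp(-d22 * z)])
print("true         :", np.array([-d11, -d12, -gamma1]))
print("OLS          :", np.linalg.solve(X.T @ X, X.T @ y1))
print("IV (1,z,z^2) :", iv(Z_poly))
print("IV optimal   :", iv(Z_opt))
```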

2. Consider the time series model
$$y_t = \beta y_{t-1} + v_t, \qquad v_t = \theta e_{t-1} + e_t,$$
wherein $v_t$ is a sequence of disturbances generated by a first-order moving-average process driven by a white-noise sequence $e_t$ of i.i.d. random variables with zero mean and variance $\sigma_e^2$.

(a) Assuming that $|\beta| < 1$, obtain the first three elements of the autocovariance sequence of $y_t$, $\gamma_y(0)$, $\gamma_y(1)$, $\gamma_y(2)$. For which values of $\theta$ is $y_t$ stationary?

We can write, since $|\beta| < 1$,
$$y_t = \sum_{j=0}^{\infty} \beta^j v_{t-j},$$
so that if $v_t$ is covariance/strictly stationary, so is $y_t$. But $v_t$ is an MA(1) driven by i.i.d. white noise, so it is always strictly and covariance stationary; hence $y_t$ is stationary for every value of $\theta$. Then
$$\gamma_y(0) = \mathrm{Var}[y_t] = \beta^2\,\mathrm{Var}[y_{t-1}] + \mathrm{Var}[v_t] + 2\beta\,\mathrm{Cov}[y_{t-1}, v_t],$$
and since
$$\mathrm{Cov}[y_{t-1}, v_t] = \mathrm{Cov}\Big[\sum_{j=0}^{\infty}\beta^j v_{t-1-j},\, v_t\Big] = \mathrm{Cov}[v_{t-1}, v_t] = \theta\sigma_e^2,$$
solving for $\gamma_y(0)$ gives
$$\gamma_y(0) = \frac{\sigma_e^2(1+\theta^2) + 2\beta\theta\sigma_e^2}{1-\beta^2} = \sigma_e^2\,\frac{1 + \theta^2 + 2\beta\theta}{1 - \beta^2}.$$
Next,
$$\gamma_y(1) = \mathrm{Cov}[y_t, y_{t-1}] = \beta\,\mathrm{Var}[y_{t-1}] + \mathrm{Cov}[v_t, y_{t-1}] = \beta\gamma_y(0) + \theta\sigma_e^2 = \sigma_e^2\left\{\beta\,\frac{1+\theta^2+2\beta\theta}{1-\beta^2} + \theta\right\},$$
and
$$\gamma_y(2) = \mathrm{Cov}[y_t, y_{t-2}] = \beta\,\mathrm{Cov}[y_{t-1}, y_{t-2}] + \mathrm{Cov}[v_t, y_{t-2}] = \beta\gamma_y(1) + 0 = \beta\gamma_y(1).$$

(b) Show that the estimate of $\beta$ obtained by applying the ordinary least-squares procedure to the first equation would tend to the value $\beta + \theta(1-\beta^2)/(1 + \theta^2 + 2\beta\theta)$ as the size of the sample increases.

We find that
$$\mathrm{plim}\;\hat\beta = \frac{\mathrm{Cov}[y_t, y_{t-1}]}{\mathrm{Var}[y_{t-1}]} = \frac{\beta\gamma_y(0) + \theta\sigma_e^2}{\gamma_y(0)} = \beta + \frac{\theta\sigma_e^2}{\gamma_y(0)} = \beta + \frac{\theta(1-\beta^2)}{1 + \theta^2 + 2\beta\theta}.$$
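A quick simulation (with arbitrary illustrative values of $\beta$, $\theta$ and $\sigma_e$, not part of the exam) confirms that the OLS estimate of $\beta$ converges to the expression derived in (b) rather than to $\beta$ itself.

```python
# Simulation check of the OLS probability limit in part (b):
# plim beta_hat = beta + theta*(1 - beta^2) / (1 + theta^2 + 2*beta*theta).
# beta, theta and sigma_e below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)
beta, theta, sigma_e = 0.5, 0.4, 1.0
T = 500_000

e = sigma_e * rng.standard_normal(T + 1)
v = e[1:] + theta * e[:-1]            # MA(1) disturbance v_t = e_t + theta*e_{t-1}
y = np.empty(T)
y[0] = v[0]
for t in range(1, T):                 # y_t = beta*y_{t-1} + v_t
    y[t] = beta * y[t - 1] + v[t]

beta_ols = (y[1:] @ y[:-1]) / (y[:-1] @ y[:-1])
plim = beta + theta * (1 - beta**2) / (1 + theta**2 + 2 * beta * theta)
print(f"OLS estimate {beta_ols:.4f}  vs  theoretical plim {plim:.4f}  (true beta {beta})")
```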

(c) Propose a test of $H_0: \theta = 0$.

If $\theta = 0$, then the OLS estimate of $\beta$ is consistent, so the residuals $\hat v_t = y_t - \hat\beta y_{t-1}$ should be approximately uncorrelated. This can be tested with tests for no serial correlation based on the residual autocorrelations $\hat\rho_{\hat v}(j)$, such as the Box-Pierce statistic. Alternatively, an LM test could be derived against an MA(1) alternative, which amounts to testing $\rho_v(1) = 0$.

(d) Imagine that the model $y_t = \beta_1 y_{t-1} + \beta_2 y_{t-2} + \mathrm{error}_t$ is fitted to the data by OLS. What values would you expect to obtain for $\beta_1$ and $\beta_2$ in the limit, as the size of the sample increases indefinitely, when $\beta = 0$?

We have that in this case $y_t$ is just an MA(1), with
$$\gamma_y(0) = \sigma_e^2(1+\theta^2), \qquad \gamma_y(1) = \sigma_e^2\theta, \qquad \gamma_y(2) = 0,$$
so the OLS estimates converge to the corresponding population projection coefficients,
$$\begin{pmatrix} \beta_1 \\ \beta_2 \end{pmatrix} = \begin{pmatrix} \gamma_y(0) & \gamma_y(1) \\ \gamma_y(1) & \gamma_y(0) \end{pmatrix}^{-1} \begin{pmatrix} \gamma_y(1) \\ \gamma_y(2) \end{pmatrix} = \frac{1}{1+\theta^2+\theta^4}\begin{pmatrix} \theta(1+\theta^2) \\ -\theta^2 \end{pmatrix},$$
and in particular $\beta_1\beta_2 \neq 0$ whenever $\theta \neq 0$, even though $\beta = 0$.

(e) If $\theta \neq 0$ were known, propose a consistent estimate of $\beta$ and find its asymptotic distribution.

The idea is to implement a GLS estimate, transforming the model into $y_t^* = \beta y_{t-1}^* + v_t^*$ with uncorrelated disturbances $v_t^*$, $t = 1, \dots, T$. Note that $v_t$ is an invertible MA(1) as far as $|\theta| < 1$, with $e_t = \sum_{j\ge 0}(-\theta)^j v_{t-j}$; the truncated filtered disturbance $v_t^*$ depends only on $v_j$, $j = 1, \dots, t$, and $v_t^* \to e_t$ as $t \to \infty$. This regression is therefore asymptotically equivalent to the regression
$$y_t^* = \beta\, y_{t-1}^* + e_t, \qquad y_t^* = \sum_{j=0}^{\infty} (-\theta)^j y_{t-j},$$
whose OLS estimate $\tilde\beta$ is consistent and satisfies $\sqrt{T}(\tilde\beta - \beta) \to_d N(0,\, 1-\beta^2)$, since $e_t$ is i.i.d., uncorrelated with $y_{t-1}^*$, and $\mathrm{Var}(y_t^*) = \sigma_e^2/(1-\beta^2)$.
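The filtering idea in (e) is easy to check by simulation. The sketch below (an added illustration with assumed parameter values; the filter is implemented with scipy.signal.lfilter) compares the naive OLS estimate, which converges to the biased limit from part (b), with the estimate based on the filtered series $y_t^*$.

```python
# Sketch of the GLS idea in part (e): with theta known, filter
# y_t* = sum_j (-theta)^j y_{t-j}, so that y_t* = beta*y_{t-1}* + e_t has
# serially uncorrelated errors, and run OLS on the filtered series.
# Parameter values are illustrative assumptions.
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(3)
beta, theta, sigma_e = 0.5, 0.4, 1.0
T = 500_000

e = sigma_e * rng.standard_normal(T + 1)
v = e[1:] + theta * e[:-1]
y = lfilter([1.0], [1.0, -beta], v)         # y_t = beta*y_{t-1} + v_t
y_star = lfilter([1.0], [1.0, theta], y)    # y_t* = y_t - theta*y_{t-1}*  (AR(inf) filter)

beta_naive = (y[1:] @ y[:-1]) / (y[:-1] @ y[:-1])
beta_gls = (y_star[1:] @ y_star[:-1]) / (y_star[:-1] @ y_star[:-1])
print(f"naive OLS {beta_naive:.4f}   filtered (GLS) {beta_gls:.4f}   true beta {beta}")
# Asymptotic distribution: sqrt(T)*(beta_gls - beta) ~ N(0, 1 - beta^2).
```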

3. Consider the following nonlinear regression model,
$$y = h(z; \theta_0) + v,$$
with $E[y \mid z] = h(z; \theta_0)$, where the usual identification conditions hold.

(a) Consider the estimation by OLS of the misspecified linear regression model $y = z'\beta + u$, and find the solution for $\beta$ that minimizes the variance of $u$. Would the OLS estimate of $\beta$ be consistent for it? And for $\theta_0$?

As usual,
$$\beta_0 = \arg\min_{\beta}\; E\big[(y - z'\beta)^2\big] = E[zz']^{-1}E[zy] = E[zz']^{-1}E[z\, h(z; \theta_0)],$$
which can be estimated consistently by OLS, while
$$\theta_0 = \arg\min_{\theta}\; E\big[(y - h(z; \theta))^2\big].$$
Of course $\beta_0 \neq \theta_0$ in general, unless $h(z; \theta) = z'\theta$.
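To illustrate (a) numerically, the sketch below uses a hypothetical specification $h(z; \theta) = \exp(\theta z)$ with scalar $z$ and no intercept (my choice, not the exam's): OLS converges to the linear-projection coefficient $\beta_0 = E[z\, h(z;\theta_0)]/E[z^2]$, not to $\theta_0$.

```python
# With the hypothetical h(z;theta) = exp(theta*z), OLS on the misspecified
# linear model y = beta*z + u converges to the projection coefficient
# beta0 = E[z*h(z;theta0)]/E[z^2], which differs from theta0.
import numpy as np

rng = np.random.default_rng(4)
theta0, sigma_v = 0.7, 1.0
n = 1_000_000
z = rng.standard_normal(n)
y = np.exp(theta0 * z) + sigma_v * rng.standard_normal(n)

beta_ols = (z @ y) / (z @ z)
beta0 = np.mean(z * np.exp(theta0 * z)) / np.mean(z**2)   # sample analogue of the projection
print(f"OLS slope {beta_ols:.4f}   projection beta0 {beta0:.4f}   theta0 {theta0}")
```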

(b) Consider the linearization around an arbitrary value $\bar\theta$,
$$y = h(z; \theta_0) + v \approx h(z; \bar\theta) + \frac{\partial h(z; \bar\theta)}{\partial \theta'}(\theta_0 - \bar\theta) + v, \qquad (5)$$
and set the linear regression model
$$y^* = z^{*\prime}\theta_0 + \mathrm{error}, \qquad (6)$$
where
$$y^* = y - h(z; \bar\theta) + \frac{\partial h(z; \bar\theta)}{\partial \theta'}\,\bar\theta, \qquad z^* = \frac{\partial h(z; \bar\theta)}{\partial \theta}.$$
Assuming that the approximation in (5) is perfect if we include the further term $a = (\theta_0 - \bar\theta)'\,\frac{\partial^2 h(z; \dot\theta)}{\partial\theta\,\partial\theta'}\,(\theta_0 - \bar\theta)$, study the probability limit of the OLS estimate $\tilde\theta_n$ of $\theta_0$ in (6).

We have that
$$\tilde\theta_n = E_n[z^*z^{*\prime}]^{-1}E_n[z^*y^*] = E_n[z^*z^{*\prime}]^{-1}E_n\Big[z^*\Big(y - h(z;\bar\theta) + \frac{\partial h(z;\bar\theta)}{\partial\theta'}\bar\theta\Big)\Big] = E_n[z^*z^{*\prime}]^{-1}E_n\big[z^*\big(z^{*\prime}\theta_0 + a + v\big)\big]$$
$$= \theta_0 + E_n[z^*z^{*\prime}]^{-1}E_n[z^*a] + E_n[z^*z^{*\prime}]^{-1}E_n[z^*v] = \theta_0 + E[z^*z^{*\prime}]^{-1}E[z^*a] + E[z^*z^{*\prime}]^{-1}E[z^*v] + o_p(1) = \theta_0 + b + o_p(1),$$
where $b = E[z^*z^{*\prime}]^{-1}E[z^*a]$ and we use that $E[z^*v] = E[z^*E(v \mid z)] = 0$. The term $b$ is not necessarily negligible, since $\bar\theta$ could be far away from $\theta_0$.

(c) Show that if $\bar\theta \to_p \theta_0$, then $\tilde\theta_n \to_p \theta_0$ under standard identification assumptions, and that $\tilde\theta_n$ corresponds to a Gauss-Newton iteration of Nonlinear Least Squares (NLS) starting at $\theta = \bar\theta$.

In this case we can show that
$$E_n[z^*a] = O_p\big(\|\bar\theta - \theta_0\|^2\big)\; E_n\Big[\Big\|z^*\,\frac{\partial^2 h(z;\dot\theta)}{\partial\theta\,\partial\theta'}\Big\|\Big] = o_p(1)\, O_p(1) = o_p(1)$$
under standard moment conditions, so $\tilde\theta_n = \theta_0 + o_p(1)$. Then, for $Q_n(\theta) = \tfrac{1}{2}E_n[(y - h(z;\theta))^2]$,
$$\frac{\partial}{\partial\theta} Q_n(\bar\theta) = -E_n\Big[(y - h(z;\bar\theta))\,\frac{\partial h(z;\bar\theta)}{\partial\theta}\Big] = -E_n[z^*(y^* - z^{*\prime}\bar\theta)] = -E_n[z^*y^*] + E_n[z^*z^{*\prime}]\,\bar\theta,$$
while
$$\frac{\partial^2}{\partial\theta\,\partial\theta'} Q_n(\bar\theta) = E_n\Big[\frac{\partial h(z;\bar\theta)}{\partial\theta}\frac{\partial h(z;\bar\theta)}{\partial\theta'}\Big] - E_n\Big[(y - h(z;\bar\theta))\,\frac{\partial^2 h(z;\bar\theta)}{\partial\theta\,\partial\theta'}\Big] \approx E_n[z^*z^{*\prime}] =: G_n(\bar\theta),$$
so that the Gauss-Newton iteration is
$$\tilde\theta_n = \bar\theta - G_n(\bar\theta)^{-1}\frac{\partial}{\partial\theta}Q_n(\bar\theta) = \bar\theta - E_n[z^*z^{*\prime}]^{-1}\big\{-E_n[z^*y^*] + E_n[z^*z^{*\prime}]\,\bar\theta\big\} = E_n[z^*z^{*\prime}]^{-1}E_n[z^*y^*].$$

(d) Assume that $h(z;\theta) = h(z'\theta)$ and that $z'\theta = z_1'\theta_1 + z_2'\theta_2$. Consider testing the hypothesis $H_0: \theta_{10} = 0$, under the condition that $E[v^2 \mid z] = \sigma^2$, by means of a Wald test based on NLS estimates. Provide a test statistic and its asymptotic distribution under the null.

Wald test. We assume that the unrestricted NLS estimate satisfies
$$n^{1/2}(\hat\theta_n - \theta_0) \to_d N\big(0,\; E^{-1} D E^{-1}\big) = N\big(0,\; \sigma^2 E^{-1}\big),$$
where
$$E = \mathrm{plim}\; \frac{\partial^2}{\partial\theta\,\partial\theta'} Q_n(\theta_0) = E[z^*z^{*\prime}], \qquad D = \mathrm{Var}\Big[n^{1/2}\frac{\partial}{\partial\theta}Q_n(\theta_0)\Big] = E[v^2 z^*z^{*\prime}] = \sigma^2 E[z^*z^{*\prime}] = \sigma^2 E.$$
Note that $\partial h(z'\theta)/\partial\theta = \dot h(z'\theta)\, z$, where $\dot h(x) = \partial h(x)/\partial x$, so that $z^* = \dot h(z'\theta_0)\, z$. Then the test statistic is
$$W_n = \frac{n}{\hat\sigma^2}\; \hat\theta_{1n}'\,\{\hat E^{11}\}^{-1}\,\hat\theta_{1n} \to_d \chi^2_{\dim\theta_1},$$
where $\hat E := E_n[\hat z^* \hat z^{*\prime}]$ with $\hat z^* = \dot h(z'\hat\theta_n)\,z$, $\hat E^{11}$ is the $(1,1)$ block of $\hat E^{-1}$, and $\hat\sigma^2$ is the NLS residual variance.
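A compact sketch (added for illustration) of the Gauss-Newton iteration in (c) and of the Wald statistic in (d), again using the hypothetical $h(z;\theta) = \exp(\theta z)$ with a single scalar parameter, so that the block notation reduces to scalars; the starting value and parameter values are assumptions.

```python
# Gauss-Newton iterations for NLS with the hypothetical h(z;theta) = exp(theta*z):
# each step regresses y* = y - h(z;th) + z*'th on z* = dh(z;th)/dtheta,
# i.e. theta_new = En[z* z*']^{-1} En[z* y*]; then a scalar Wald statistic for theta = 0.
import numpy as np

rng = np.random.default_rng(5)
theta0, sigma_v, n = 0.7, 1.0, 100_000
z = rng.standard_normal(n)
y = np.exp(theta0 * z) + sigma_v * rng.standard_normal(n)

th = 0.2                                  # arbitrary starting value theta_bar
for it in range(10):
    h = np.exp(th * z)
    zstar = z * h                         # dh/dtheta = z*exp(theta*z)
    ystar = y - h + zstar * th
    th_new = (zstar @ ystar) / (zstar @ zstar)
    print(f"iteration {it}: theta = {th_new:.5f}")
    if abs(th_new - th) < 1e-8:
        break
    th = th_new

# Scalar Wald statistic for H0: theta = 0 (here theta0 = 0.7, so H0 should be rejected):
v_hat = y - np.exp(th * z)
sigma2_hat = np.mean(v_hat**2)
E_hat = np.mean((z * np.exp(th * z))**2)
W = n * th**2 * E_hat / sigma2_hat
print(f"Wald statistic {W:.1f} (chi^2_1 under H0)")
```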

(e) Find the distribution of the restricted estimate of $\theta_0$ under $H_0$ and compare it with that of the unrestricted one. Propose now a Lagrange Multiplier test for $H_0$ and show that this amounts to a linear regression of restricted NLS residuals on an appropriate set of regressors.

LM test. In this case we need to study the restricted estimate of $\theta_0$,
$$\bar\theta_n = \begin{pmatrix} 0 \\ \tilde\theta_{2n} \end{pmatrix}, \qquad \tilde\theta_{2n} = \arg\min_{\theta_2}\; Q_n(0, \theta_2),$$
and we set $L_n(\theta) = Q_n(\theta) - \lambda'\theta_1$. The first-order conditions are
$$\frac{\partial}{\partial\theta_1} Q_n(\bar\theta_n) = \lambda, \qquad \frac{\partial}{\partial\theta_2} Q_n(\bar\theta_n) = 0.$$
Expanding the second condition around $\theta_0$ (which satisfies $H_0$),
$$0 = \frac{\partial}{\partial\theta_2} Q_n(\theta_0) + \frac{\partial^2}{\partial\theta_2\,\partial\theta_2'} Q_n(\theta_0)\,(\tilde\theta_{2n} - \theta_{20}) + o_p(n^{-1/2}),$$
so that
$$n^{1/2}(\tilde\theta_{2n} - \theta_{20}) = -\Big[\frac{\partial^2}{\partial\theta_2\,\partial\theta_2'} Q_n(\theta_0)\Big]^{-1} n^{1/2}\frac{\partial}{\partial\theta_2} Q_n(\theta_0) + o_p(1) \to_d N\big(0,\; E_{22}^{-1} D_{22} E_{22}^{-1}\big) = N\big(0,\; \sigma^2 E_{22}^{-1}\big),$$
while for the unrestricted estimate $n^{1/2}(\hat\theta_{2n} - \theta_{20}) \to_d N(0, \sigma^2 E^{22})$, with $E^{22} = \{E_{22} - E_{21}E_{11}^{-1}E_{12}\}^{-1} \neq E_{22}^{-1}$ unless $E_{12} = E_{21} = 0$. Also,
$$n^{1/2}\lambda = n^{1/2}\frac{\partial}{\partial\theta_1} Q_n(\bar\theta_n) = n^{1/2}\frac{\partial}{\partial\theta_1} Q_n(\theta_0) + E_{12}\, n^{1/2}(\tilde\theta_{2n} - \theta_{20}) + o_p(1)$$
$$= n^{1/2}\frac{\partial}{\partial\theta_1} Q_n(\theta_0) - E_{12}E_{22}^{-1}\, n^{1/2}\frac{\partial}{\partial\theta_2} Q_n(\theta_0) + o_p(1) = \big[\,I \;\; -E_{12}E_{22}^{-1}\,\big]\; n^{1/2}\frac{\partial}{\partial\theta} Q_n(\theta_0) + o_p(1)$$
$$\to_d N\Big(0,\; \sigma^2\big[\,I \;\; -E_{12}E_{22}^{-1}\,\big]\, E\, \big[\,I \;\; -E_{12}E_{22}^{-1}\,\big]'\Big) = N\big(0,\; \sigma^2\{E_{11} - E_{12}E_{22}^{-1}E_{21}\}\big) = N\big(0,\; \sigma^2\{E^{11}\}^{-1}\big),$$
and the test statistic is
$$LM_n = \frac{n}{\hat\sigma^2}\; \lambda'\,\hat E^{11}\,\lambda \to_d \chi^2_{\dim\theta_1}.$$
Now note that
$$\lambda = \frac{\partial}{\partial\theta_1} Q_n(\bar\theta_n) = -E_n\big[(y - h(z'\bar\theta_n))\,\dot h(z'\bar\theta_n)\, z_1\big] = -E_n\big[\bar v_n\, \dot h(z'\bar\theta_n)\, z_1\big],$$
where $\bar v_n = y - h(z'\bar\theta_n)$ are the restricted NLS residuals, so $\lambda$ can be obtained, up to scale, from running a linear regression of the residuals $\bar v_n$ on the additional linearized regressors $\dot h(z'\bar\theta_n)\, z_1$.
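Finally, an added sketch of the LM idea in (e) under a hypothetical partition $h(z'\theta) = \exp(z_1\theta_1 + z_2\theta_2)$ with scalar blocks and $H_0: \theta_{10} = 0$. It uses the standard operational variant of the LM statistic, $n$ times the uncentered $R^2$ from regressing the restricted residuals on the full set of linearized regressors, which matches $LM_n$ above with $\hat\sigma^2$ estimated from the restricted residuals.

```python
# LM test sketch for the hypothetical model y = exp(z1*th1 + z2*th2) + v, H0: th1 = 0.
# Steps: (1) restricted NLS imposing th1 = 0; (2) restricted residuals v_bar;
# (3) auxiliary OLS of v_bar on the linearized regressors dh/dtheta evaluated at the
# restricted estimate; n * (uncentered R^2) is the LM statistic, chi^2_1 under H0.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(6)
n = 50_000
z1 = rng.standard_normal(n)
z2 = rng.standard_normal(n)
th1_true, th2_true, sigma_v = 0.0, 0.5, 1.0          # data generated under H0
y = np.exp(th1_true * z1 + th2_true * z2) + sigma_v * rng.standard_normal(n)

# (1) restricted NLS: minimize the sum of squares over th2 only, with th1 = 0
res = minimize_scalar(lambda t2: np.sum((y - np.exp(t2 * z2))**2),
                      bounds=(-2.0, 2.0), method="bounded")
th2_r = res.x

# (2)-(3) restricted residuals and auxiliary regression on all linearized regressors
h_r = np.exp(th2_r * z2)
v_bar = y - h_r
R = np.column_stack([z1 * h_r, z2 * h_r])            # dh/dth1, dh/dth2 at the restricted estimate
coef, *_ = np.linalg.lstsq(R, v_bar, rcond=None)
explained = v_bar @ (R @ coef)
LM = n * explained / (v_bar @ v_bar)                 # n * uncentered R^2
print(f"restricted th2 = {th2_r:.4f},  LM = {LM:.2f}  (chi^2_1 5% critical value: 3.84)")
```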