Solutions for Econometrics I Homework No.3

Solutions for Econometrics I, Homework No. 3 (due 6-3-15)
Feldkircher, Forstner, Ghoddusi, Pichler, Reiss, Yan, Zeugner
April 7, 2006

Exercise 3.1

We have the following model:
\[ y_{(TN\times 1)} = X_{(TN\times Nk)}\,\beta_{(Nk\times 1)} + u_{(TN\times 1)} \]
In matrix notation this looks like:
\[
\begin{pmatrix} y_{11} \\ \vdots \\ y_{1T} \\ \vdots \\ y_{N1} \\ \vdots \\ y_{NT} \end{pmatrix}
=
\begin{pmatrix} X_{1\,(T\times k)} & & \\ & \ddots & \\ & & X_{N\,(T\times k)} \end{pmatrix}
\begin{pmatrix} \beta_{1\,(k\times 1)} \\ \vdots \\ \beta_{N\,(k\times 1)} \end{pmatrix}
+
\begin{pmatrix} u_{11} \\ \vdots \\ u_{1T} \\ \vdots \\ u_{N1} \\ \vdots \\ u_{NT} \end{pmatrix}
\]
The contemporaneous correlation among the observations, $E(u_{i,t}u_{j,t}) = \sigma_{ij}$ and $E(u_{it}^2) = \sigma_i^2$, results in a VCV of the following form:
\[
E(uu')_{(TN\times TN)} =
\begin{pmatrix}
\sigma_1^2 I_T & \sigma_{1,2} I_T & \cdots & \sigma_{1N} I_T \\
\vdots & \sigma_2^2 I_T & & \vdots \\
\sigma_{N,1} I_T & \cdots & & \sigma_N^2 I_T
\end{pmatrix}
\]

To write the VCV in more compact form we introduce $\Sigma_{(N\times N)}$, given by:
\[
\Sigma_{(N\times N)} =
\begin{pmatrix}
\sigma_1^2 & \sigma_{1,2} & \cdots & \sigma_{1N} \\
\vdots & \sigma_2^2 & & \vdots \\
\sigma_{N,1} & \cdots & & \sigma_N^2
\end{pmatrix}
\]
We can now write $E(uu')_{(TN\times TN)} = \Sigma_{(N\times N)} \otimes I_T$. The generalized least squares estimator is given by:
\[ \hat\beta_{GLS} = \big[X'(\Sigma^{-1}\otimes I_T)X\big]^{-1}\big[X'(\Sigma^{-1}\otimes I_T)y\big] \]
Show that if $X_1 = X_2 = X_3 = \dots = X_N = X$, then $\hat\beta_{GLS}$ coincides with the OLS estimator for $N$ separate OLS regressions.

The $X$-matrix $X_{(TN\times Nk)}$ now looks like
\[
X = \begin{pmatrix} X & & \\ & \ddots & \\ & & X \end{pmatrix}
\]

which can be rewritten as $X = I_N \otimes X_{(T\times k)}$. The GLS estimator then becomes:
\begin{align}
\hat\beta_{GLS} &= \big[X'(\Sigma^{-1}\otimes I_T)X\big]^{-1}\big[X'(\Sigma^{-1}\otimes I_T)y\big] \tag{1}\\
&= \big[(I_N\otimes X')(\Sigma^{-1}_{(N\times N)}\otimes I_T)(I_N\otimes X_{(T\times k)})\big]^{-1}\big[(I_N\otimes X')(\Sigma^{-1}\otimes I_T)\,y_{(TN\times 1)}\big] \tag{2}\\
&= \big[(\Sigma^{-1}_{(N\times N)}\otimes X'_{(k\times T)})(I_N\otimes X_{(T\times k)})\big]^{-1}\big[(\Sigma^{-1}\otimes X')y\big] \tag{3}\\
&= \big[\Sigma^{-1}_{(N\times N)}\otimes (X'X)_{(k\times k)}\big]^{-1}\big[(\Sigma^{-1}\otimes X')y\big] \tag{4}\\
&= \big[\Sigma_{(N\times N)}\otimes (X'X)^{-1}_{(k\times k)}\big]\big[(\Sigma^{-1}\otimes X')y\big] \tag{5}\\
&= \big[\underbrace{\Sigma_{(N\times N)}\Sigma^{-1}_{(N\times N)}}_{I_N}\otimes\,(X'X)^{-1}X'\big]\,y \tag{6}\\
&= \big[I_N\otimes (X'X)^{-1}X'\big]\,y \tag{7}\\
&= \begin{pmatrix} (X'X)^{-1}X'y_1 \\ \vdots \\ (X'X)^{-1}X'y_N \end{pmatrix} \tag{8}
\end{align}
where $y_i$ denotes the $T\times 1$ vector of observations for equation $i$; this stacks the $N$ separate OLS estimators. We have used the following rules concerning the Kronecker product: for $A$, $B$ nonsingular it holds that $(A\otimes B)^{-1} = A^{-1}\otimes B^{-1}$; for $A$, $B$, $C$, $D$ with matching dimensions, $(A\otimes B)(C\otimes D) = AC\otimes BD$.
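This equivalence is easy to check numerically. The following is a minimal Python/NumPy sketch (our own toy example: the dimensions, seed and the particular $\Sigma$ are arbitrary, and the stacking follows the equation-by-equation order used above):

import numpy as np

rng = np.random.default_rng(0)
N, T, k = 3, 50, 2
Xblk = rng.normal(size=(T, k))                           # common regressor block X
Sigma = np.array([[1.0, 0.5, 0.2],
                  [0.5, 1.0, 0.3],
                  [0.2, 0.3, 1.0]])                      # contemporaneous VCV
beta = rng.normal(size=(N * k, 1))
u = rng.multivariate_normal(np.zeros(N), Sigma, size=T)  # T x N error draws
Xbig = np.kron(np.eye(N), Xblk)                          # X = I_N kron X
y = Xbig @ beta + u.T.reshape(-1, 1)                     # stack (u_1', ..., u_N')'

Om_inv = np.kron(np.linalg.inv(Sigma), np.eye(T))        # (Sigma kron I_T)^{-1}
b_gls = np.linalg.solve(Xbig.T @ Om_inv @ Xbig, Xbig.T @ Om_inv @ y)
b_ols = np.vstack([np.linalg.solve(Xblk.T @ Xblk, Xblk.T @ y[i*T:(i+1)*T])
                   for i in range(N)])                   # N separate OLS fits
assert np.allclose(b_gls, b_ols)                         # GLS == equationwise OLS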

Exercise 3.2

(i) Show that the function $f(y) = \frac{1}{\sqrt{2\pi}}e^{-\frac{y^2}{2}}$ is a density function, i.e. show that it integrates to one over $\mathbb{R}$. Show that
\[ \int_{-\infty}^{\infty} e^{-\frac{y^2}{2}}\,dy = \sqrt{2\pi}. \]
One may split up the integral on the left-hand side (using the symmetry of the integrand):
\[ \int_{-\infty}^{\infty} e^{-\frac{y^2}{2}}\,dy = \int_{-\infty}^{0} e^{-\frac{y^2}{2}}\,dy + \int_{0}^{\infty} e^{-\frac{y^2}{2}}\,dy = 2\int_{0}^{\infty} e^{-\frac{y^2}{2}}\,dy \tag{9} \]
Now take the square of it and rename $y$ into $t$ in one factor. Since the integral with respect to $t$ can be considered constant with respect to $y$, it may be put into the integral with respect to $y$:
\[ \left(\int_0^\infty e^{-\frac{y^2}{2}}\,dy\right)^2 = \int_0^\infty e^{-\frac{t^2}{2}}\,dt \int_0^\infty e^{-\frac{y^2}{2}}\,dy = \int_0^\infty\left(\int_0^\infty e^{-\frac{t^2}{2}}\,dt\right)e^{-\frac{y^2}{2}}\,dy \tag{10} \]
Now substitute in (10) for $t = xy$, which implies $dt = y\,dx$. (Note: we substitute for $t$ since it is in the inner integral, and thus $y$ can be regarded as constant there.)
\[ = \int_0^\infty\left(\int_0^\infty e^{-\frac{x^2y^2}{2}}\,y\,dx\right)e^{-\frac{y^2}{2}}\,dy = \int_0^\infty\int_0^\infty e^{-\frac{(1+x^2)y^2}{2}}\,y\,dx\,dy \]
By Fubini's Theorem*, we may interchange the $\int$-signs in (10). Now set $z := \frac{(1+x^2)y^2}{2}$, i.e. $dz = (1+x^2)\,y\,dy$ and thus $y\,dy = \frac{dz}{1+x^2}$:
\[ = \int_0^\infty \frac{1}{1+x^2}\int_0^\infty e^{-z}\,dz\,dx \]
We know the antiderivative of the inner integral:
\[ \int_0^\infty e^{-z}\,dz = \big[-e^{-z}\big]_0^\infty = 1 \]

Substituting into the entire expression, and remembering that the resulting formula is the definition of $\arctan$, yields the following expression for (10):
\[ \int_0^\infty \frac{1}{1+x^2}\,dx = \big[\arctan x\big]_0^\infty = \frac{\pi}{2} \]
So we know that
\[ \left(\int_0^\infty e^{-\frac{y^2}{2}}\,dy\right)^2 = \frac{\pi}{2} \]
Since the expression $e^{-\frac{y^2}{2}}$ is non-negative over the interval $[0,\infty)$, the whole expression (9) and its integral are non-negative, and therefore the root is non-negative as well:
\[ \int_0^\infty e^{-\frac{y^2}{2}}\,dy = \sqrt{\frac{\pi}{2}}, \qquad \int_{-\infty}^\infty e^{-\frac{y^2}{2}}\,dy = 2\sqrt{\frac{\pi}{2}} = \sqrt{2\pi}, \]
which is what we aimed to show. Any non-negative Lebesgue-integrable function $f(x)$ on $\mathbb{R}$ with total integral $\int_{\mathbb{R}} f(x)\,dx = 1$ is a density function for some probability distribution, and vice versa.

Note: we could equally have used Fubini's Theorem in order to transform the squared integral $\int\!\int e^{-\frac{x^2+y^2}{2}}\,dx\,dy$ into polar coordinates, see http://en.wikipedia.org/wiki/Gaussian_integral

* Outline of the use of Fubini's Theorem in this exercise: Let $\mu$ and $\nu$ be σ-finite measures on $(\Omega_1,\mathcal{B}_1)$ and $(\Omega_2,\mathcal{B}_2)$ respectively. Further let $f(x,y) \in L^+(\Omega_1\times\Omega_2,\ \mathcal{B}_1\otimes\mathcal{B}_2,\ \mu\times\nu)$. Then
\[ \int f\,d(\mu\times\nu) = \int\!\!\int f(x,y)\,d\nu(y)\,d\mu(x) = \int\!\!\int f(x,y)\,d\mu(x)\,d\nu(y) \]
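As a quick numerical sanity check, quadrature reproduces the closed form (a minimal sketch, assuming SciPy is available):

import numpy as np
from scipy.integrate import quad

# integrate exp(-y^2/2) over the whole real line
val, err = quad(lambda y: np.exp(-y**2 / 2), -np.inf, np.inf)
print(val, np.sqrt(2 * np.pi))   # both ~2.50662827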

(ii) Show that a random variable with density function $f(y)$ has mean $0$ and variance $1$.
\[ E(y) = \int_{-\infty}^{\infty} y\,f(y)\,dy = \frac{1}{\sqrt{2\pi}}\int_{-\infty}^{\infty} y\,e^{-\frac{y^2}{2}}\,dy \]
The antiderivative of $y\,e^{-\frac{y^2}{2}}$ is easily recognized:
\[ \frac{d}{dy}\left(-e^{-\frac{y^2}{2}}\right) = y\,e^{-\frac{y^2}{2}} \]
Therefore
\[ \int_{-\infty}^{\infty} y\,e^{-\frac{y^2}{2}}\,dy = \left[-e^{-\frac{y^2}{2}}\right]_{-\infty}^{\infty} = 0, \]
i.e. $E(y) = 0$. Since $E(y) = 0$, $\mathrm{Var}(y) = E(y^2) - (E(y))^2 = E(y^2)$:
\[ \mathrm{Var}(y) = \int_{-\infty}^{\infty} y^2 f(y)\,dy = \frac{1}{\sqrt{2\pi}}\int_{-\infty}^{\infty} y\cdot y\,e^{-\frac{y^2}{2}}\,dy \]
For this we use partial integration, $\int_a^b uv' = [uv]_a^b - \int_a^b u'v$, where we set $u = y$ and $v' = y\,e^{-\frac{y^2}{2}}$:
\[ \mathrm{Var}(y) = \frac{1}{\sqrt{2\pi}}\Big(\underbrace{\big[-y\,e^{-\frac{y^2}{2}}\big]_{-\infty}^{\infty}}_{=0} + \underbrace{\int_{-\infty}^{\infty} e^{-\frac{y^2}{2}}\,dy}_{=\sqrt{2\pi}\ \text{by (i)}}\Big) \]
The first summand in the above expression is zero since $e^{-\frac{y^2}{2}}$ vanishes faster than any polynomial as $y\to\pm\infty$:
\[ \lim_{n\to\infty} n\,e^{-\frac{n^2}{2}} = 0 \]
Hence we have $\mathrm{Var}(y) = \frac{1}{\sqrt{2\pi}}\sqrt{2\pi} = 1$.
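The two moments can be checked the same way (again a sketch assuming SciPy; the integrands below are ours):

import numpy as np
from scipy.integrate import quad

f = lambda y: np.exp(-y**2 / 2) / np.sqrt(2 * np.pi)
mean, _ = quad(lambda y: y * f(y), -np.inf, np.inf)
var, _ = quad(lambda y: y**2 * f(y), -np.inf, np.inf)
print(mean, var)   # ~0.0 and ~1.0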

Exercise 3.3

Let $z_i$ for $i = 1,\dots,T$ be independently distributed $N(0,\sigma^2)$. Denote with $\bar z = \frac{1}{T}\sum_{t=1}^T z_t$ the empirical mean and with $s^2 = \frac{1}{T}\sum_{t=1}^T (z_t-\bar z)^2$ the empirical variance. Show that
\[ \frac{Ts^2}{\sigma^2} \]
is $\chi^2_{T-1}$ distributed.

From our Q1 Theorem we know that the quadratic form $Q = \frac{z'Az}{\sigma^2}$ is $\chi^2_{r,\lambda}$ distributed, with $\lambda = \frac{\mu'A\mu}{\sigma^2}$ and $A$ being a projector with $\mathrm{rk}(A) = r$. Since in our example $\mu = 0$, it follows that $\lambda = 0$. Now consider the quadratic form
\[ \frac{Ts^2}{\sigma^2} = \frac{1}{\sigma^2}\sum_{t=1}^T (z_t-\bar z)^2 = \frac{1}{\sigma^2}\,z'\Big(I_T-\frac{11'}{T}\Big)'\Big(I_T-\frac{11'}{T}\Big)z = \frac{z'\big(I_T-\frac{11'}{T}\big)z}{\sigma^2} \]
We know from a previous exercise that $I_T-\frac{11'}{T}$ is a projector, projecting on the orthocomplement of the space spanned by $1$. Hence its rank is equal to $T-1$. So we can apply the Q1 Theorem, with $A = I_T-\frac{11'}{T}$ and $\lambda = 0$, to get that
\[ \frac{Ts^2}{\sigma^2} = \frac{z'Az}{\sigma^2} \]
is $\chi^2_{T-1}$ distributed.
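A small Monte Carlo experiment illustrates the result ($T$, $\sigma$ and the number of replications are our own choices):

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
T, sigma, reps = 10, 2.0, 100_000
z = rng.normal(0.0, sigma, size=(reps, T))
q = T * z.var(axis=1) / sigma**2       # np.var uses the 1/T convention, matching s^2
print(q.mean(), T - 1)                 # chi^2_{T-1} has mean T-1
print(stats.kstest(q, stats.chi2(T - 1).cdf).pvalue)   # typically far above 0.05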

Exercise 3.4

Show that the expected value under the true probability measure of the score is equal to $0$, i.e. $E\,s(\theta|y) = 0$, and that the variance of the score is equal to the information matrix $I(\theta|y)$. Remark: you can assume throughout that you can interchange differentiation and integration when necessary or helpful.

The random variable is $y$, which is a function from a σ-field in a probability space to $\mathbb{R}$. Moreover, any transformation $\mathbb{R}\to\mathbb{R}$ applied on $y$ is still a random variable. The probability measure of $(-\infty,x]$ is thus the function $\int_{-\infty}^x f(y)\,dy$, which satisfies the requirements for a probability measure.

We know
\[ s(\theta|y) := \frac{\partial}{\partial\theta}\,l(\theta|y) = \frac{\partial}{\partial\theta}\ln f(y|\theta) = \frac{\frac{\partial}{\partial\theta}f(y|\theta)}{f(y|\theta)} \]
Therefore:
\[ E_\theta\,s(\theta|y) = \int \frac{\partial}{\partial\theta}l(\theta|y)\,f(y|\theta)\,dy = \int \frac{\frac{\partial}{\partial\theta}f(y|\theta)}{f(y|\theta)}\,f(y|\theta)\,dy = \int \frac{\partial}{\partial\theta}f(y|\theta)\,dy \]
Here we may interchange integration and differentiation signs if there exists a function $g(y,\theta)$ such that $\big|\frac{f(y,\theta+\delta)-f(y,\theta)}{\delta}\big| \le g(y,\theta)$ and $\int_{\mathbb{R}} g(y,\theta)\,dy < \infty$:¹
\[ E_\theta\,s(\theta|y) = \int \frac{\partial}{\partial\theta}f(y|\theta)\,dy = \frac{d}{d\theta}\int f(y|\theta)\,dy = \frac{d}{d\theta}\,1 = 0 \]
The derivative of the score is:
\[ \frac{\partial}{\partial\theta'}\,s(\theta|y) = \frac{\partial}{\partial\theta'}\frac{\partial}{\partial\theta}\ln f(y|\theta) \]
The score $\frac{\partial}{\partial\theta}\ln f(y|\theta)$ is a vector whose $i$-th element is $\frac{\partial f(y|\theta)/\partial\theta_i}{f(y|\theta)}$. Differentiating this expression with respect to the vector $\theta$ is equivalent to differentiating each element of the score. This results in a matrix whose $(i,j)$-th entry is the following:
\[ \frac{\partial}{\partial\theta_j}\left[\frac{\partial f(y|\theta)/\partial\theta_i}{f(y|\theta)}\right] = \frac{\frac{\partial^2 f(y|\theta)}{\partial\theta_j\partial\theta_i}}{f(y|\theta)} - \underbrace{\frac{\partial f(y|\theta)/\partial\theta_j}{f(y|\theta)}}_{j\text{-th element of the score}}\cdot\frac{\partial f(y|\theta)/\partial\theta_i}{f(y|\theta)} \]
Since the latter two terms are equal to the $j$-th and the $i$-th element of the score, respectively, while the very first second-derivative term is equal to the $(i,j)$-th entry of the Hessian of $f$, we may write the matrix whose $(i,j)$-th element is given above as:
\[ \frac{\partial}{\partial\theta'}\,s(\theta|y) = \frac{1}{f(y|\theta)}H_f - s(\theta|y)\,s(\theta|y)' \]

¹ Compare Casella/Berger (1990): Statistical Inference; Duxbury Press, Belmont, CA.

Taking the expected value of the $(i,j)$-th element yields the following:
\[ E\left[\frac{\partial}{\partial\theta_j}s_i(\theta|y)\right] = \int \frac{\frac{\partial^2 f(y|\theta)}{\partial\theta_j\partial\theta_i}}{f(y|\theta)}\,f(y|\theta)\,dy - E\big[s_i(\theta|y)\,s_j(\theta|y)\big] = \int \frac{\partial^2 f(y|\theta)}{\partial\theta_j\partial\theta_i}\,dy - E\big[s_i(\theta|y)\,s_j(\theta|y)\big] \]
The first expression (the second-derivative term) is zero if we may interchange integration and differentiation signs:
\[ \int \frac{\partial^2 f(y|\theta)}{\partial\theta_j\partial\theta_i}\,dy = \frac{\partial}{\partial\theta_j}\underbrace{\left(\int \frac{\partial}{\partial\theta_i} f(y|\theta)\,dy\right)}_{=0\ (*)} = 0, \]
where we know that the expression $(*)$ is zero from the first part of Exercise 3.4. Knowing that $E\big[\frac{\partial}{\partial\theta'}s(\theta|y)\big] = E\,H_l(\theta|y)$, with $H_l$ the Hessian of $l$, we have:
\[ -E\,H_l(\theta|y) = E\big[s(\theta|y)\,s(\theta|y)'\big] \]
Therefore the two definitions of the information matrix $I(\theta|y)$ yield the same result.
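Both identities are easy to illustrate by simulation for a concrete model of our own choosing, e.g. $y \sim$ Exponential with rate $\theta$, where $l(\theta|y) = \ln\theta - \theta y$, $s(\theta|y) = 1/\theta - y$ and $I(\theta) = 1/\theta^2$:

import numpy as np

rng = np.random.default_rng(2)
theta, reps = 1.5, 1_000_000
y = rng.exponential(scale=1 / theta, size=reps)   # rate theta = 1/scale
s = 1 / theta - y                                 # score of l = log(theta) - theta*y
print(s.mean())                 # ~0: E[score] = 0
print(s.var(), 1 / theta**2)    # Var[score] ~ I(theta) = 1/theta^2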

Exercise 3.5 (proof of the Lemma):
\[ F = \frac{(\hat\beta-\tilde\beta)'X'X(\hat\beta-\tilde\beta)}{\hat u'\hat u}\cdot\frac{T-k}{m} \quad\text{is}\quad F_{m,T-k}. \]
The denominator: $\frac{\hat u'\hat u}{\sigma^2}$ is $\chi^2_{T-k}$; we know that by Theorem Q4. Furthermore,
\[ \hat\beta-\tilde\beta = (X'X)^{-1}R'\big[R(X'X)^{-1}R'\big]^{-1}(R\hat\beta-r), \]
so we can rearrange:
\begin{align*}
(\hat\beta-\tilde\beta)'X'X(\hat\beta-\tilde\beta) &= (R\hat\beta-r)'\big[R(X'X)^{-1}R'\big]^{-1}R(X'X)^{-1}\,X'X\,(X'X)^{-1}R'\big[R(X'X)^{-1}R'\big]^{-1}(R\hat\beta-r)\\
&= (R\hat\beta-r)'\big[R(X'X)^{-1}R'\big]^{-1}(R\hat\beta-r)
\end{align*}
The null hypothesis holds: $R\beta = r$. We know that
\[ \hat\beta = (X'X)^{-1}X'y = (X'X)^{-1}X'(X\beta+u) = \beta + (X'X)^{-1}X'u \]
\[ R\hat\beta-r = R\beta-r + R(X'X)^{-1}X'u = R(X'X)^{-1}X'u \]
as $R\beta = r$ by assumption. Plugging in above yields:
\[ \big[R\hat\beta-r\big]'\big[R(X'X)^{-1}R'\big]^{-1}\big[R\hat\beta-r\big] = u'X(X'X)^{-1}R'\big[R(X'X)^{-1}R'\big]^{-1}R(X'X)^{-1}X'u \]
Note that the matrix between the $u$'s is symmetric and idempotent and has rank $m$ (the number of restrictions, i.e. the rank of $R$). So by Theorem Q1, this quantity divided by $\sigma^2$ is $\chi^2_m$, as the $u$ are centered. As $\hat u = (I-P)y = (I-P)u$, where $P = X(X'X)^{-1}X'$, we can write:
\[ F = \frac{(\hat\beta-\tilde\beta)'X'X(\hat\beta-\tilde\beta)}{\hat u'\hat u}\cdot\frac{T-k}{m} = \frac{u'X(X'X)^{-1}R'\big[R(X'X)^{-1}R'\big]^{-1}R(X'X)^{-1}X'u}{u'(I-P)u}\cdot\frac{T-k}{m} \]
What is left to check, for proving that the whole quantity is $F_{m,T-k}$, is the independence of the denominator and the numerator. Note that
\[ (I-P)\,X(X'X)^{-1} = X(X'X)^{-1} - X(X'X)^{-1}X'X(X'X)^{-1} = 0 \]
By Theorem Q3, this establishes independence.
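Before turning to the case where the null hypothesis does not hold, the null distribution can be verified by simulation (a sketch with our own toy design and a single restriction, $m = 1$):

import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
T, k, reps = 40, 3, 20_000
X = rng.normal(size=(T, k))
beta = np.array([1.0, 2.0, 0.0])                 # third coefficient is truly zero
R, r, m = np.array([[0.0, 0.0, 1.0]]), np.array([0.0]), 1
XtXinv = np.linalg.inv(X.T @ X)
F = np.empty(reps)
for i in range(reps):
    y = X @ beta + rng.normal(size=T)
    b = XtXinv @ X.T @ y
    u = y - X @ b
    d = R @ b - r
    F[i] = (d @ np.linalg.solve(R @ XtXinv @ R.T, d)) / (u @ u) * (T - k) / m
print(stats.kstest(F, stats.f(m, T - k).cdf).pvalue)   # should not reject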

Now suppose the true value is $\beta \ne \beta_0$, where $R\beta_0 = r$. If this is the case, we know from class (Theorem Q5 and the results before) that
\[ \hat\beta - \beta_0 = (X'X)^{-1}X'\big(u + X(\beta-\beta_0)\big), \qquad u + X(\beta-\beta_0)\ \text{is}\ N\big(X(\beta-\beta_0),\,\sigma^2 I_T\big). \]
(Caution: the conclusion drawn below is incorrect; we did the correct thing in class.)
\begin{align*}
F &= \frac{(\hat\beta-\beta_0)'X'X(\hat\beta-\beta_0)}{\hat u'\hat u}\cdot\frac{T-k}{m}\\
&= \frac{\big(u+X(\beta-\beta_0)\big)'X(X'X)^{-1}\,X'X\,(X'X)^{-1}X'\big(u+X(\beta-\beta_0)\big)}{u'(I-P)u}\cdot\frac{T-k}{m}\\
&= \frac{\big(u+X(\beta-\beta_0)\big)'X(X'X)^{-1}X'\big(u+X(\beta-\beta_0)\big)}{u'(I-P)u}\cdot\frac{T-k}{m}
\end{align*}
By Theorem Q1, the numerator divided by $\sigma^2$ is $\chi^2_{k,\lambda}$, as $X(X'X)^{-1}X'$ is symmetric and idempotent and has rank $k$, with
\[ \lambda = \frac{(\beta-\beta_0)'X'X(\beta-\beta_0)}{\sigma^2}. \]
Note that $(I-P)\,X(X'X)^{-1}X' = 0$. So by Theorem Q3, denominator and numerator are independent.

Exercise 3.6

Leaving out $(T-k)$ and $m$:
\[ \frac{\tilde u'\tilde u - \hat u'\hat u}{\hat u'\hat u} = \frac{\tilde u'\tilde u}{\hat u'\hat u} - 1 = \frac{\tilde u'\tilde u/\mathrm{TSS}}{\hat u'\hat u/\mathrm{TSS}} - 1 = \frac{1-\tilde R^2}{1-R^2} - 1 = \frac{(1-\tilde R^2)-(1-R^2)}{1-R^2} = \frac{R^2-\tilde R^2}{1-R^2}, \]
where $\mathrm{TSS}$ denotes the total sum of squares.
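The Exercise 3.6 identity can be confirmed on simulated data (our own toy regression: the restricted model drops the last $m$ regressors, and both models contain an intercept so that the $R^2$ definitions match):

import numpy as np

rng = np.random.default_rng(4)
T, k, m = 60, 4, 2
X = np.column_stack([np.ones(T), rng.normal(size=(T, k - 1))])
y = X @ np.array([1.0, 0.5, 0.0, 0.0]) + rng.normal(size=T)

def ssr(Z, y):
    b = np.linalg.lstsq(Z, y, rcond=None)[0]
    e = y - Z @ b
    return e @ e

tss = ((y - y.mean()) ** 2).sum()
ssr_u, ssr_r = ssr(X, y), ssr(X[:, :k - m], y)   # restriction: last m slopes = 0
R2u, R2r = 1 - ssr_u / tss, 1 - ssr_r / tss
assert np.isclose((ssr_r - ssr_u) / ssr_u, (R2u - R2r) / (1 - R2u))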

Exercise 3.7

Exercise 3.7 is very similar to Exercise 3.5.

Exercise 3.8

Denote with $\hat\beta$ the unrestricted OLS estimator and with $\tilde\beta$ the restricted OLS estimator under the restriction $R\beta = r$. Then the quantities
\[ (R\hat\beta-r)'\big[R(X'X)^{-1}R'\big]^{-1}(R\hat\beta-r) \quad\text{and}\quad (\hat\beta-\tilde\beta)'X'X(\hat\beta-\tilde\beta) \]
are equal and can be written, by Lemma 3, as $\tilde u'\tilde u - \hat u'\hat u$.
\[ \hat\beta = (X'X)^{-1}X'y, \qquad \tilde\beta = \hat\beta - Q(R\hat\beta-r) \quad\text{with}\quad Q = (X'X)^{-1}R'\big[R(X'X)^{-1}R'\big]^{-1} \]
Thus $\hat\beta-\tilde\beta = Q(R\hat\beta-r)$ and
\[ (\hat\beta-\tilde\beta)'X'X(\hat\beta-\tilde\beta) = (R\hat\beta-r)'\,Q'X'XQ\,(R\hat\beta-r) \]
Now
\[ Q' = \Big[(X'X)^{-1}R'\big[R(X'X)^{-1}R'\big]^{-1}\Big]' = \Big[\big[R(X'X)^{-1}R'\big]^{-1}\Big]'\big[(X'X)^{-1}R'\big]' = \big[R(X'X)^{-1}R'\big]^{-1}R(X'X)^{-1} \]
Thus
\begin{align*}
(\hat\beta-\tilde\beta)'X'X(\hat\beta-\tilde\beta) &= (R\hat\beta-r)'\big[R(X'X)^{-1}R'\big]^{-1}R(X'X)^{-1}\,X'X\,(X'X)^{-1}R'\big[R(X'X)^{-1}R'\big]^{-1}(R\hat\beta-r)\\
&= (R\hat\beta-r)'\big[R(X'X)^{-1}R'\big]^{-1}\big[R(X'X)^{-1}R'\big]\big[R(X'X)^{-1}R'\big]^{-1}(R\hat\beta-r)\\
&= (R\hat\beta-r)'\big[R(X'X)^{-1}R'\big]^{-1}(R\hat\beta-r)
\end{align*}
QED
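A numerical check of this equality, using the closed form $\tilde\beta = \hat\beta - Q(R\hat\beta - r)$ from above (the design, restrictions and seed are our own choices):

import numpy as np

rng = np.random.default_rng(5)
T, k = 50, 4
X, y = rng.normal(size=(T, k)), rng.normal(size=T)
R = np.array([[1.0, -1.0, 0.0, 0.0], [0.0, 0.0, 1.0, 0.0]])
r = np.array([0.0, 0.5])

XtXinv = np.linalg.inv(X.T @ X)
b_hat = XtXinv @ X.T @ y
A = np.linalg.inv(R @ XtXinv @ R.T)              # [R(X'X)^{-1}R']^{-1}
b_til = b_hat - XtXinv @ R.T @ A @ (R @ b_hat - r)
d = R @ b_hat - r
assert np.isclose(d @ A @ d, (b_hat - b_til) @ (X.T @ X) @ (b_hat - b_til))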

Exercise 3.9

Show Lemma 7:
\[ (y-X\tilde\beta)'M_1(y-X\tilde\beta) = (y-X\hat\beta)'(y-X\hat\beta) + (\hat\beta_2-\tilde\beta_2)'H(\hat\beta_2-\tilde\beta_2) \]
Proof: Let
\[ y-X\tilde\beta = (y-X\hat\beta) + (X\hat\beta-X\tilde\beta) = (y-X\hat\beta) + X_1(\hat\beta_1-\tilde\beta_1) + X_2(\hat\beta_2-\tilde\beta_2), \]
then we have
\begin{align*}
(y-X\tilde\beta)'M_1(y-X\tilde\beta) ={}& \underbrace{(y-X\hat\beta)'M_1(y-X\hat\beta)}_{\hat u'\hat u} + 2\,(y-X\hat\beta)'\underbrace{M_1X_1}_{0}(\hat\beta_1-\tilde\beta_1) + 2\,\underbrace{(y-X\hat\beta)'M_1X_2}_{\hat u'X_2 = 0}(\hat\beta_2-\tilde\beta_2)\\
&+ (\hat\beta_1-\tilde\beta_1)'\underbrace{X_1'M_1}_{0}\big[X_1(\hat\beta_1-\tilde\beta_1) + 2X_2(\hat\beta_2-\tilde\beta_2)\big] + (\hat\beta_2-\tilde\beta_2)'\underbrace{X_2'M_1X_2}_{H}(\hat\beta_2-\tilde\beta_2)
\end{align*}
We know that $M_1$ is orthogonal to the space spanned by $\mathrm{col}(X_1)$, thus $M_1X_1 = 0$, and $\hat u \perp [\mathrm{col}(X_1,X_2)] \supseteq [\mathrm{col}(X_1)]$, so $M_1\hat u = \hat u$. Similarly, $\hat u \perp [\mathrm{col}(X_2)]$, so $\hat u'X_2 = 0$. Hence of the right side of the equation above only
\[ (y-X\tilde\beta)'M_1(y-X\tilde\beta) = (y-X\hat\beta)'(y-X\hat\beta) + (\hat\beta_2-\tilde\beta_2)'H(\hat\beta_2-\tilde\beta_2) \]
remains, which is just what we had to show.
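Lemma 7 can also be verified numerically. In the sketch below the restricted estimator fixes $\beta_2$ at $\beta_2^0 = 0$ and re-estimates $\beta_1$ (our own construction; the algebra above in fact holds for any $\tilde\beta$):

import numpy as np

rng = np.random.default_rng(6)
T, k1, k2 = 80, 2, 2
X1, X2 = rng.normal(size=(T, k1)), rng.normal(size=(T, k2))
X = np.hstack([X1, X2])
y = X @ rng.normal(size=k1 + k2) + rng.normal(size=T)

b_hat = np.linalg.lstsq(X, y, rcond=None)[0]
b2_0 = np.zeros(k2)                                        # restriction: beta_2 = 0
b1_til = np.linalg.lstsq(X1, y - X2 @ b2_0, rcond=None)[0]
b_til = np.concatenate([b1_til, b2_0])

M1 = np.eye(T) - X1 @ np.linalg.inv(X1.T @ X1) @ X1.T
H = X2.T @ M1 @ X2
d2 = b_hat[k1:] - b2_0
lhs = (y - X @ b_til) @ M1 @ (y - X @ b_til)
rhs = (y - X @ b_hat) @ (y - X @ b_hat) + d2 @ H @ d2
assert np.isclose(lhs, rhs)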

Exercise 3.10, first part

(i) Show Lemma 8 of the document "Some Basics on Testing": Under the null hypothesis that $\beta_2 = \beta_2^0$ with $\beta_2^0 \in \mathbb{R}^{k_2}$, the quantity
\[ F = \frac{(\hat\beta_2-\beta_2^0)'H(\hat\beta_2-\beta_2^0)}{\hat u'\hat u}\cdot\frac{T-k}{k_2} \]
is $F_{k_2,T-k}$ distributed.

By the Frisch-Waugh Theorem, $\hat\beta_2 = (X_2'M_1X_2)^{-1}X_2'M_1y$. Under the null hypothesis we hence receive
\[ y = X_1\beta_1 + X_2\beta_2^0 + u \]
Inserting this into $\hat\beta_2$ yields (using $M_1X_1 = 0$):
\[ \hat\beta_2 = (X_2'M_1X_2)^{-1}X_2'M_1X_2\,\beta_2^0 + (X_2'M_1X_2)^{-1}X_2'M_1u = \beta_2^0 + (X_2'M_1X_2)^{-1}X_2'M_1u \]
We also know that $H = X_2'M_1X_2$. Inserting these into the numerator above yields
\[ u'M_1X_2(X_2'M_1X_2)^{-1}\,X_2'M_1X_2\,(X_2'M_1X_2)^{-1}X_2'M_1u = u'\underbrace{M_1X_2(X_2'M_1X_2)^{-1}X_2'M_1}_{\text{symmetric + idempotent}}u \]
The rank of this projector is $k_2$. By Theorem Q1 and Exercise 3.5 we hence know that this, if divided by $\sigma^2$, is distributed as $\chi^2_{k_2}$. We also again know, by Theorem Q4, that the denominator is, if divided by $\sigma^2$, distributed as $\chi^2_{T-k}$. So what remains is to show independence, by multiplying both projectors:
\[ (I-P)\,M_1X_2(X_2'M_1X_2)^{-1}X_2'M_1 = 0, \]
since $M_1X_2 = X_2 - X_1(X_1'X_1)^{-1}X_1'X_2 \in \mathrm{col}(X)$ and hence $(I-P)M_1X_2 = 0$. Again, by Theorem Q3 this establishes independence.
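A Monte Carlo check of Lemma 8 (with our own toy design): computing $\hat\beta_2$ via the Frisch-Waugh form and comparing the resulting F statistics with the $F_{k_2,T-k}$ distribution:

import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
T, k1, k2 = 30, 2, 2
k = k1 + k2
X1, X2 = rng.normal(size=(T, k1)), rng.normal(size=(T, k2))
X = np.hstack([X1, X2])
M1 = np.eye(T) - X1 @ np.linalg.inv(X1.T @ X1) @ X1.T
H = X2.T @ M1 @ X2
beta1, beta2_0 = np.array([1.0, -1.0]), np.zeros(k2)   # the null is true
F = np.empty(10_000)
for i in range(F.size):
    y = X1 @ beta1 + X2 @ beta2_0 + rng.normal(size=T)
    b2 = np.linalg.solve(H, X2.T @ M1 @ y)             # Frisch-Waugh estimator
    u = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
    d = b2 - beta2_0
    F[i] = (d @ H @ d) / (u @ u) * (T - k) / k2
print(stats.kstest(F, stats.f(k2, T - k).cdf).pvalue)  # should not reject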

(ii) Derive the distribution of the quantity $F$ when the true value of $\beta_2$ is equal to some arbitrary, fixed $\beta_2$.

Exercise 3.11

Exercise 3.11 is concerned with MATLAB programming in groups of two.