MEI Exam Review. June 7, 2002

1 Final Exam Revision Notes

1.1 Random Rules and Formulas

Linear transformations of random variables: for $Y = g(X)$,
$$ f_Y(y) = f_X(x)\left|\frac{dx}{dy}\right|, \quad \text{evaluated at } x = g^{-1}(y). $$

Inverse proof. $(AB)(AB)^{-1} = I$, so $(B^{-1}A^{-1})(AB)(AB)^{-1} = B^{-1}A^{-1}$, and therefore $(AB)^{-1} = B^{-1}A^{-1}$.

Simple linear regression coefficient with only one independent variable and an intercept:
$$ \hat{\beta} = (X'X)^{-1}X'y = \begin{bmatrix} T & \sum x_t \\ \sum x_t & \sum x_t^2 \end{bmatrix}^{-1} \begin{bmatrix} \sum y_t \\ \sum x_t y_t \end{bmatrix}. \tag{1} $$

Proof about projections:
$$ \hat{\epsilon} = M_X \epsilon = (I - X(X'X)^{-1}X')(y - X\beta) = y - X(X'X)^{-1}X'y - X\beta + X(X'X)^{-1}X'X\beta $$
$$ = y - X(X'X)^{-1}X'y = M_X y = y - X\hat{\beta} = \hat{\epsilon}. $$

Simple linear regression with a constant: $y_t = \beta_1 + \beta_2 x_t + \epsilon_t$, with
$$ \hat{\beta}_2 = \frac{\sum_t (x_t - \bar{x}) y_t}{\sum_t (x_t - \bar{x})^2}. $$

R squared:
$$ R^2 = 1 - \frac{RSS}{TSS} = \frac{ESS}{TSS}. $$

Inverse rule: $A^{-1} = \operatorname{adj}(A)/|A|$. If $A$ is a 2x2, switch the main diagonal, negate the off diagonal.
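As a quick numerical companion, here is a minimal NumPy sketch (data and variable names are illustrative, not from the notes) that computes $\hat{\beta}$, the residual maker $M_X$, and $R^2$ on simulated data:

```python
import numpy as np

rng = np.random.default_rng(0)
T = 200
x = rng.normal(size=T)
y = 1.0 + 2.0 * x + rng.normal(size=T)          # true beta = (1, 2)

X = np.column_stack([np.ones(T), x])            # regressor matrix with intercept
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)    # (X'X)^{-1} X'y

M = np.eye(T) - X @ np.linalg.solve(X.T @ X, X.T)   # residual maker M_X
resid = M @ y                                   # equals y - X @ beta_hat

rss = resid @ resid
tss = ((y - y.mean()) ** 2).sum()
r2 = 1 - rss / tss                              # R^2 = 1 - RSS/TSS
print(beta_hat, r2)
```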

F test of a restriction:
$$ F = \frac{(RSS_R - RSS_U)/q}{RSS_U/(N-K)} \sim F(q, N-K), $$
with $q$ restrictions and $K$ parameters. Just algebra.

Variance of the OLS estimator:
$$ \widehat{Var}(\hat{\beta}) = s^2 (X'X)^{-1}, \qquad s^2 = \frac{RSS}{N-K} = \frac{\hat{\epsilon}'\hat{\epsilon}}{N-K} = \frac{(y - X\hat{\beta})'(y - X\hat{\beta})}{N-K}, $$
where $X'X = \sum_{t=1}^T x_t x_t'$ and $X'\epsilon = \sum_{t=1}^T x_t \epsilon_t$.

Multiplying out $RSS$:
$$ RSS = (y - X\hat{\beta})'(y - X\hat{\beta}) = y'y - \hat{\beta}'X'y - y'X\hat{\beta} + \hat{\beta}'X'X\hat{\beta} $$
$$ = y'y - \hat{\beta}'X'y - y'X(X'X)^{-1}X'y + \underbrace{y'X(X'X)^{-1}X'X(X'X)^{-1}X'y}_{(X'X)^{-1}X'X = I} $$
$$ = y'y - \hat{\beta}'X'y \underbrace{- \; y'X(X'X)^{-1}X'y + y'X(X'X)^{-1}X'y}_{0} = y'y - \hat{\beta}'X'y = y'y - y'X\hat{\beta}. $$
Thus,
$$ s^2 = \frac{RSS}{N-K} = \frac{y'y - \hat{\beta}'X'y}{N-K}. $$
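The restricted/unrestricted F statistic is easy to verify numerically. Below is a small sketch under the assumptions above (simulated data, a hypothetical single restriction that drops the last regressor):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
N, K, q = 200, 3, 1                     # K parameters, q = 1 restriction
X = np.column_stack([np.ones(N), rng.normal(size=(N, 2))])
y = X @ np.array([1.0, 0.5, 0.0]) + rng.normal(size=N)

def rss(Xm, ym):
    """Residual sum of squares from an OLS fit of ym on Xm."""
    b = np.linalg.solve(Xm.T @ Xm, Xm.T @ ym)
    e = ym - Xm @ b
    return e @ e

rss_u = rss(X, y)                       # unrestricted model
rss_r = rss(X[:, :2], y)                # restricted: last coefficient set to zero
F = ((rss_r - rss_u) / q) / (rss_u / (N - K))
p_value = stats.f.sf(F, q, N - K)       # compare with F(q, N-K)
print(F, p_value)
```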

Mean lag (for a general lag polynomial):
$$ \delta(L) = \sum_{j=0}^{\infty} \delta_j L^j, \qquad \delta'(L) = \sum_{j=0}^{\infty} j \delta_j L^{j-1}. $$
Then
$$ \text{Mean Lag} = \frac{\delta'(1)}{\delta(1)}. $$

Lagged dependent variables PLUS AR(1) errors yield inconsistent estimators!

Autoregressive final form. Write the system as $A(L)y = B(L)X + \epsilon$. Then
$$ y = A^{-1}BX + A^{-1}\epsilon = \frac{A^*}{|A|}BX + \frac{A^*}{|A|}\epsilon, \qquad |A|\,y = A^* B X + A^* \epsilon, $$
where $A^*$ is the adjoint of $A$.

1.2 Characteristic Polynomials

Consider an AR(2) process:
$$ y_t = \phi_1 y_{t-1} + \phi_2 y_{t-2} + \epsilon_t. $$
Rewriting with the lag operator:
$$ y_t - \phi_1 L y_t - \phi_2 L^2 y_t = \epsilon_t, \qquad (1 - \phi_1 L - \phi_2 L^2) y_t = \epsilon_t. $$
Factorizing the left-hand side:
$$ 1 - \phi_1 L - \phi_2 L^2 = (1 - Z_1 L)(1 - Z_2 L). $$
Divide through by $L^2$:
$$ L^{-2} - \phi_1 L^{-1} - \phi_2 = L^{-2}(1 - Z_1 L)(1 - Z_2 L). $$
Multiply out the right-hand side:
$$ L^{-2} - \phi_1 L^{-1} - \phi_2 = L^{-2}(1 - Z_1 L - Z_2 L + Z_1 Z_2 L^2) = L^{-2} - (Z_1 + Z_2)L^{-1} + Z_1 Z_2 = (L^{-1} - Z_1)(L^{-1} - Z_2). $$
Let $Z = L^{-1}$:
$$ Z^2 - \phi_1 Z - \phi_2 = (Z - Z_1)(Z - Z_2). $$
Thus the roots of the characteristic polynomial are $Z_1$ and $Z_2$, and explicitly:
$$ Z_{1,2} = \frac{\phi_1 \pm \sqrt{\phi_1^2 + 4\phi_2}}{2}. $$
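A quick numerical check of the root formula (the values of $\phi_1$, $\phi_2$ are illustrative); `np.roots` is applied to the polynomial written in $Z$:

```python
import numpy as np

phi1, phi2 = 0.5, 0.3                       # illustrative AR(2) coefficients

# Roots of Z^2 - phi1*Z - phi2 = 0 (characteristic polynomial in Z = 1/L)
roots = np.roots([1.0, -phi1, -phi2])

# Closed form from the notes: Z = (phi1 +/- sqrt(phi1^2 + 4*phi2)) / 2
closed = (phi1 + np.array([1.0, -1.0]) * np.sqrt(phi1**2 + 4 * phi2)) / 2

print(sorted(roots), sorted(closed))        # the two should agree
print(np.all(np.abs(roots) < 1))            # stationary iff both |Z_i| < 1
```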

1.3 Instrumental Variables

Consider a regression equation $y = X\beta + \epsilon$. If $E[X'\epsilon] \neq 0$, use instrumental variables (IV). Suppose there exists a set of instruments $Z$ for the problematic variables in $X$. We use the two-stage least squares (2SLS) technique to derive the IV estimator.

Step 1: run the regression $X = Z\gamma + u$. The estimator has the form $\hat{\gamma} = (Z'Z)^{-1}Z'X$.

Step 2: compute the fitted values $\hat{X} = Z\hat{\gamma} = Z(Z'Z)^{-1}Z'X$.

Step 3: run the regression $y = \hat{X}\beta + \epsilon$. The estimator is $\hat{\beta} = (\hat{X}'\hat{X})^{-1}\hat{X}'y$.

Substitute in for $\hat{X}$:
$$ \hat{\beta} = \big((Z(Z'Z)^{-1}Z'X)'\, Z(Z'Z)^{-1}Z'X\big)^{-1} (Z(Z'Z)^{-1}Z'X)'\, y = \big(X'Z(Z'Z)^{-1}Z'Z(Z'Z)^{-1}Z'X\big)^{-1} X'Z(Z'Z)^{-1}Z'y. $$
Simplifying,
$$ \hat{\beta}_{IV} = \big(X'Z(Z'Z)^{-1}Z'X\big)^{-1} X'Z(Z'Z)^{-1}Z'y, $$
which is the IV estimator.

An important note: if $X$ is exactly identified, in that $Z$ has the same dimension as $X$, then we can carry the analysis further, since each factor is now square and can be inverted separately:
$$ \hat{\beta}_{IV} = (Z'X)^{-1}(Z'Z)\underbrace{(X'Z)^{-1}\, X'Z}_{I}\,(Z'Z)^{-1}Z'y = (Z'X)^{-1}\underbrace{(Z'Z)(Z'Z)^{-1}}_{I}Z'y = (Z'X)^{-1}Z'y. $$
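A minimal simulation sketch of the 2SLS recipe (the data-generating process and names are illustrative): an endogenous regressor correlated with the structural error, a valid instrument, and the exactly identified closed form $(Z'X)^{-1}Z'y$ above.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5000
z = rng.normal(size=n)                      # instrument
e = rng.normal(size=n)                      # structural error
x = 0.8 * z + 0.5 * e + rng.normal(size=n)  # endogenous: corr(x, e) != 0
y = 1.0 + 2.0 * x + e                       # true beta = (1, 2)

X = np.column_stack([np.ones(n), x])
Z = np.column_stack([np.ones(n), z])        # exactly identified: dim(Z) = dim(X)

beta_ols = np.linalg.solve(X.T @ X, X.T @ y)  # biased under endogeneity
beta_iv = np.linalg.solve(Z.T @ X, Z.T @ y)   # (Z'X)^{-1} Z'y
print(beta_ols, beta_iv)                    # IV estimate should be near (1, 2)
```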

1.4 Dickey-Fuller Tests

Use the Dickey-Fuller (DF) tables for tests of stationarity with:
H0: $\phi = 1$ (nonstationary);
H1: $\phi < 1$ (stationary).

A guide to the Dickey-Fuller tables:
First panel (nothing): $y_t = \phi y_{t-1} + \epsilon_t$.
Second panel (constant): $y_t = \alpha + \phi y_{t-1} + \epsilon_t$. (2)
Third panel (constant and time trend): $y_t = \alpha + \beta t + \phi y_{t-1} + \epsilon_t$.

Example. Consider the model $y_t = \alpha + \phi y_{t-1} + \epsilon_t$. To test
H0: $\phi = 1$, $\alpha = 0$ (random walk), against
H1: $\phi < 1$, $\alpha \lessgtr 0$,
use
$$ \tau_\mu = \frac{\hat{\phi} - 1}{SE(\hat{\phi})} $$
and check against the middle panel. If the test is anything other than H0: $\phi = 1$, use a t test!
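A hand-rolled sketch of the $\tau_\mu$ statistic for the constant-only case (simulated random walk). The critical value in the comment is the commonly quoted 5% asymptotic value for the constant case, cited from memory; in practice check it against the DF tables rather than the normal or t tables.

```python
import numpy as np

rng = np.random.default_rng(3)
T = 500
y = np.cumsum(rng.normal(size=T))          # random walk: phi = 1, alpha = 0

# Regress y_t on (1, y_{t-1}) and form tau = (phi_hat - 1) / SE(phi_hat)
Y = y[1:]
X = np.column_stack([np.ones(T - 1), y[:-1]])
b = np.linalg.solve(X.T @ X, X.T @ Y)
e = Y - X @ b
s2 = e @ e / (len(Y) - 2)
se = np.sqrt(s2 * np.linalg.inv(X.T @ X)[1, 1])
tau = (b[1] - 1) / se

print(tau)    # compare with the DF table (second panel), roughly -2.86 at 5%
```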

1.5 Lagrange Multiplier Test

We'll use the LM test to test for serially correlated errors. Consider the model:
$$ y_t = x_t'\beta + u_t, \qquad u_t = \phi u_{t-1} + \epsilon_t, \qquad |\phi| < 1, \qquad \epsilon_t \text{ iid}. $$
Test:
H0: $\phi = 0$ (errors NOT serially correlated);
H1: $\phi \neq 0$ (errors serially correlated).

Rewrite the model by lagging once, multiplying by $\phi$ and subtracting:
$$ y_t - \phi y_{t-1} = x_t'\beta - \phi x_{t-1}'\beta + u_t - \phi u_{t-1}, $$
$$ y_t = \phi y_{t-1} + x_t'\beta - \phi x_{t-1}'\beta + \epsilon_t. $$

Log likelihood:
$$ \ln L = -\frac{T-1}{2}\ln(2\pi) - \frac{T-1}{2}\ln \sigma^2 - \frac{1}{2\sigma^2}\sum_{t=2}^T \epsilon_t^2, $$
with
$$ \epsilon_t = y_t - \phi y_{t-1} - x_t'\beta + \phi x_{t-1}'\beta. $$

First order conditions:
$$ \frac{\partial \ln L}{\partial \phi} = \frac{1}{\sigma^2}\sum_{t=2}^T \epsilon_t\,(y_{t-1} - x_{t-1}'\beta), \qquad \frac{\partial \ln L}{\partial \beta} = \frac{1}{\sigma^2}\sum_{t=2}^T \epsilon_t\,(x_t - \phi x_{t-1}). $$

To simplify, let $z_t = -\partial \epsilon_t / \partial \psi$ with $\psi = (\phi, \beta')'$. Thus
$$ z_t = -\begin{bmatrix} \partial \epsilon_t/\partial \phi \\ \partial \epsilon_t/\partial \beta \end{bmatrix} = \begin{bmatrix} y_{t-1} - x_{t-1}'\beta \\ x_t - \phi x_{t-1} \end{bmatrix}. \tag{3} $$
So the first order conditions simplify to
$$ \sum_{t=2}^T z_t \epsilon_t = 0. $$

The next step is to calculate the $\beta$ coefficient under the null hypothesis of non-serially-correlated errors. Run OLS of $y_t$ on $x_t$ and note the estimated coefficient $\hat{\beta}_0$. Evaluate $\epsilon_t$ and $z_t$ at the restricted values $\hat{\beta}_0$ and $\hat{\phi}_0 = 0$. Thus,
$$ \hat{\epsilon}_t = y_t - x_t'\hat{\beta}_0, \qquad z_t = \begin{bmatrix} y_{t-1} - x_{t-1}'\hat{\beta}_0 \\ x_t \end{bmatrix}. \tag{4} $$

Finally, regress $\hat{\epsilon}_t$ on $z_t$, i.e. run the regression of $\hat{\epsilon}_t$ on $y_{t-1} - x_{t-1}'\hat{\beta}_0$ and $x_t$. Take $R^2$ from this regression and test:
$$ LM = T \cdot R^2 \overset{a}{\sim} \chi^2(1). $$
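A simulation sketch of the $T R^2$ version of the test (the data-generating process and names are illustrative): fit OLS under the null, then regress the restricted residuals on their own lag and $x_t$. Note that $y_{t-1} - x_{t-1}'\hat{\beta}_0$ is just $\hat{\epsilon}_{t-1}$; a constant is included in the auxiliary regression as a common implementation choice.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
T, phi = 400, 0.5
x = rng.normal(size=T)
u = np.zeros(T)
for t in range(1, T):                      # AR(1) errors: u_t = phi*u_{t-1} + eps_t
    u[t] = phi * u[t - 1] + rng.normal()
y = 1.0 + 2.0 * x + u

# Step 1: OLS under H0 (no serial correlation) -> restricted residuals
X = np.column_stack([np.ones(T), x])
b0 = np.linalg.solve(X.T @ X, X.T @ y)
ehat = y - X @ b0

# Step 2: auxiliary regression of ehat_t on z_t = (ehat_{t-1}, x_t)
Zt = np.column_stack([np.ones(T - 1), ehat[:-1], x[1:]])
g = np.linalg.solve(Zt.T @ Zt, Zt.T @ ehat[1:])
v = ehat[1:] - Zt @ g
R2 = 1 - (v @ v) / (((ehat[1:] - ehat[1:].mean()) ** 2).sum())

LM = (T - 1) * R2                          # T*R^2, with T-1 usable observations
print(LM, stats.chi2.sf(LM, 1))            # ~ chi2(1) under H0
```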

1.6 Test for Cointegration

$y_t \sim I(1)$ is cointegrated with $x_t \sim I(1)$ iff there exists a vector $\alpha$ such that
$$ y_t - \alpha' x_t = u_t \sim I(0). $$
To check, regress $y_t$ on $x_t$ and save the residuals $\hat{u}_t$; $\hat{u}_t$ needs to be $I(0)$ if $y_t$ and $x_t$ are to be cointegrated. Regress:
$$ \Delta \hat{u}_t = \phi_0 \hat{u}_{t-1} + \underbrace{\sum_j \phi_j \Delta \hat{u}_{t-j}}_{\text{other lags}} + \epsilon_t. $$
The other lags added on make this an augmented Dickey-Fuller (ADF) test. Test:
H0: $\phi_0 = 0$ (NOT cointegrated);
H1: $\phi_0 < 0$ (cointegrated).
Why? If $\phi_0 = 0$, then (ignoring the other lags) $\Delta \hat{u}_t = \epsilon_t \sim I(0)$. Thus $\Delta \hat{u}_t \sim I(0)$, which means that $\hat{u}_t$ is not $I(0)$. We don't know for sure that it's $I(1)$, but it may be.
So compute $\hat{\phi}_0 / SE(\hat{\phi}_0)$ and reject H0 if it is less than the critical value in the MacKinnon tables. (NOTE: we can't use the DF tables here.)

More on tests for cointegration. If $u_t$ is a stationary AR(1), then $x$ and $y$ will be cointegrated. Consider the equation for $u_t$:
$$ u_t = \phi u_{t-1} + \epsilon_t, \qquad \epsilon_t \text{ iid}, \qquad |\phi| < 1. $$
Thus,
$$ u_t - u_{t-1} = \phi u_{t-1} - u_{t-1} + \epsilon_t, $$
$$ \Delta u_t = (\phi - 1) u_{t-1} + \epsilon_t = \gamma u_{t-1} + \epsilon_t, \qquad \gamma = \phi - 1. $$
Note that $u_t$ is stationary if $|\phi| < 1$, i.e. if $\gamma < 0$. Thus test:
H0: $\gamma = 0 \Rightarrow \phi = \gamma + 1 = 1 \Rightarrow \Delta u_t \sim I(0),\ u_t \sim I(1) \Rightarrow (x, y)$ are NOT cointegrated;
H1: $\gamma < 0 \Rightarrow \phi = \gamma + 1 < 1 \Rightarrow u_t \sim I(0) \Rightarrow (x, y)$ are cointegrated.
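A sketch of this two-step residual-based procedure (Engle-Granger style) on a simulated cointegrated pair. No extra lags are included for brevity; the MacKinnon critical value quoted in the comment (about -3.34 at 5% for one regressor plus a constant) is from memory and should be checked against the tables.

```python
import numpy as np

rng = np.random.default_rng(5)
T = 500
x = np.cumsum(rng.normal(size=T))          # x_t ~ I(1)
y = 2.0 * x + rng.normal(size=T)           # cointegrated: y - 2x ~ I(0)

# Step 1: cointegrating regression, save residuals u_hat
X1 = np.column_stack([np.ones(T), x])
a = np.linalg.solve(X1.T @ X1, X1.T @ y)
u = y - X1 @ a

# Step 2: DF regression on the residuals: du_t = phi0 * u_{t-1} + eps_t
du, ul = np.diff(u), u[:-1]
phi0 = (ul @ du) / (ul @ ul)
e = du - phi0 * ul
se = np.sqrt((e @ e / (len(du) - 1)) / (ul @ ul))
t_stat = phi0 / se

print(t_stat)   # reject H0 (no cointegration) if below the MacKinnon value (~ -3.34 at 5%)
```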

1.7 Properties of Standard Processes

1.7.1 AR(1)

Model: $y_t = \phi y_{t-1} + \epsilon_t$. Stationary if $|\phi| < 1$. Yields:
$$ E[y_t] = 0, \qquad Var(y_t) = E[y_t^2] = \frac{\sigma^2}{1 - \phi^2}, \qquad Cov(y_t, y_{t-1}) = \frac{\phi \sigma^2}{1 - \phi^2}, \qquad Cov(y_t, y_{t-s}) = \frac{\phi^s \sigma^2}{1 - \phi^2}. $$
In another form, after backward substitution:
$$ y_t = \phi^s y_{t-s} + \sum_{j=0}^{s-1} \phi^j \epsilon_{t-j}. $$

1.7.2 MA(1)

Model: $y_t = \theta \epsilon_{t-1} + \epsilon_t$. Yields:
$$ E[y_t] = 0, \qquad Var(y_t) = E[y_t^2] = \sigma^2(1 + \theta^2), \qquad Cov(y_t, y_{t-1}) = \theta \sigma^2, \qquad Cov(y_t, y_{t-s}) = 0 \text{ for } s > 1. $$
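These moments are easy to sanity-check by simulation (parameter values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(6)
T, phi, theta, sigma = 200_000, 0.7, 0.4, 1.0
eps = rng.normal(scale=sigma, size=T)

# AR(1): y_t = phi*y_{t-1} + eps_t
y = np.zeros(T)
for t in range(1, T):
    y[t] = phi * y[t - 1] + eps[t]
print(y.var(), sigma**2 / (1 - phi**2))            # Var = sigma^2 / (1 - phi^2)
print(np.cov(y[1:], y[:-1])[0, 1], phi * sigma**2 / (1 - phi**2))

# MA(1): y_t = theta*eps_{t-1} + eps_t
m = eps[1:] + theta * eps[:-1]
print(m.var(), sigma**2 * (1 + theta**2))          # Var = sigma^2 (1 + theta^2)
print(np.cov(m[1:], m[:-1])[0, 1], theta * sigma**2)
```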

1.7.3 Random Walk

Model: $\Delta y_t = \epsilon_t$, i.e. $y_t = y_{t-1} + \epsilon_t$. Yields:
$$ E[y_t] = 0, \qquad Var(y_t) = t\sigma^2. $$

1.7.4 Random Walk with Drift

Model: $\Delta y_t = \alpha + \epsilon_t$, i.e. $y_t = \alpha + y_{t-1} + \epsilon_t$. Yields:
$$ E[y_t] = \alpha t, \qquad Var(y_t) = t\sigma^2. $$
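A quick simulation illustrating the linearly growing variance, using many independent paths (values of $\alpha$ and $\sigma$ are illustrative):

```python
import numpy as np

rng = np.random.default_rng(7)
n_paths, T, alpha, sigma = 20_000, 100, 0.5, 1.0

eps = rng.normal(scale=sigma, size=(n_paths, T))
rw = np.cumsum(eps, axis=1)                  # random walk
rwd = np.cumsum(alpha + eps, axis=1)         # random walk with drift

print(rw[:, -1].var(), T * sigma**2)         # Var(y_T) is approximately T*sigma^2
print(rwd[:, -1].mean(), alpha * T)          # E[y_T] is approximately alpha*T
print(rwd[:, -1].var(), T * sigma**2)        # the drift does not change the variance
```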

1.8 Asymptotic Distribution of $\hat{\beta}$

We would first like to check that the $\hat{\beta}$ estimator is consistent.
$$ \hat{\beta} = (X'X)^{-1}X'y = (X'X)^{-1}X'[X\beta + \epsilon] = \beta + (X'X)^{-1}X'\epsilon, $$
$$ \hat{\beta} - \beta = (X'X)^{-1}X'\epsilon. $$
Or, rewriting,
$$ \hat{\beta} - \beta = \Big[\frac{1}{n}\sum_{i=1}^n x_i x_i'\Big]^{-1} \Big[\frac{1}{n}\sum_{i=1}^n x_i \epsilon_i\Big]. $$
Taking the probability limit:
$$ \text{plim}(\hat{\beta} - \beta) = \text{plim}\Big\{ \Big[\frac{1}{n}\sum x_i x_i'\Big]^{-1} \Big[\frac{1}{n}\sum x_i \epsilon_i\Big] \Big\}. $$
By Slutsky's theorem:
$$ \text{plim}(\hat{\beta} - \beta) = \Big[\text{plim}\,\frac{1}{n}\sum x_i x_i'\Big]^{-1} \Big[\text{plim}\,\frac{1}{n}\sum x_i \epsilon_i\Big]. $$
Since $x_i x_i'$ is an iid sequence and $E[x_i x_i'] = \Sigma_{xx}$, by the weak law of large numbers (WLLN),
$$ \text{plim}\,\frac{1}{n}\sum x_i x_i' = \Sigma_{xx}. $$
Also, $x_i \epsilon_i$ is an iid sequence with $E[x_i \epsilon_i] = 0$ because of independence. Thus, by the WLLN,
$$ \text{plim}\,\frac{1}{n}\sum x_i \epsilon_i = 0. $$
Substituting these last two equations in,
$$ \text{plim}(\hat{\beta} - \beta) = \Sigma_{xx}^{-1} \cdot 0 = 0. $$
Thus $\hat{\beta}$ is a consistent estimator. Note that the two substitutions could also have been done via the ergodic theorem, which relies not on iid but on stationarity and limited memory (iid processes are always ergodic).

1.9 Lagged Dependents and Serially Correlated Errors

Consider the following model:
$$ y_t = \gamma y_{t-1} + x_t'\beta + u_t, \qquad u_t = \phi u_{t-1} + \epsilon_t. $$
Compute $E[y_{t-1} u_t]$:
$$ E[y_{t-1} u_t] = E[(\gamma y_{t-2} + x_{t-1}'\beta + u_{t-1})(\phi u_{t-1} + \epsilon_t)] $$
$$ = \gamma\phi E[y_{t-2} u_{t-1}] + \phi\beta' E[x_{t-1} u_{t-1}] + \phi E[u_{t-1}^2] + \gamma E[y_{t-2}\epsilon_t] + \beta' E[x_{t-1}\epsilon_t] + E[u_{t-1}\epsilon_t] $$
$$ = \gamma\phi E[y_{t-2} u_{t-1}] + \phi\beta' E[x_{t-1} u_{t-1}] + \phi E[u_{t-1}^2] $$
(the $\epsilon_t$ terms vanish because $\epsilon_t$ is uncorrelated with anything dated $t-1$ or earlier)
$$ = \gamma\phi E[y_{t-1} u_t] + \phi\beta' E[x_t u_t] + \phi E[u_t^2] $$
(by stationarity, shifting every date forward one period). Solving for $E[y_{t-1} u_t]$ and using $E[x_t u_t] = 0$:
$$ E[y_{t-1} u_t] = \frac{\phi E[u_t^2]}{1 - \gamma\phi} = \frac{\phi\, Var(u_t)}{1 - \gamma\phi} = \frac{\phi \sigma^2}{(1 - \phi^2)(1 - \gamma\phi)} \neq 0. $$
Thus, in this situation, OLS is biased.
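A simulation sketch of this inconsistency (parameter values are illustrative): OLS on the lagged dependent variable does not converge to the true $\gamma$, and the sample covariance matches the formula above.

```python
import numpy as np

rng = np.random.default_rng(8)
T, gamma, beta, phi = 100_000, 0.5, 1.0, 0.6
x = rng.normal(size=T)

u = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):
    u[t] = phi * u[t - 1] + rng.normal()       # AR(1) errors, sigma = 1
    y[t] = gamma * y[t - 1] + beta * x[t] + u[t]

# OLS of y_t on (y_{t-1}, x_t): inconsistent for gamma since E[y_{t-1} u_t] != 0
X = np.column_stack([y[:-1], x[1:]])
b = np.linalg.solve(X.T @ X, X.T @ y[1:])
print(b)                                       # gamma_hat biased away from 0.5

# The covariance that causes the trouble, vs. phi*sigma^2/((1-phi^2)(1-gamma*phi))
print(np.mean(y[:-1] * u[1:]), phi / ((1 - phi**2) * (1 - gamma * phi)))
```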

If $\phi$ is known, use the Cochrane-Orcutt (C-O) transformation to estimate the true parameter $\beta$. If $\phi$ is unknown (more realistic), then rewrite the model by lagging, multiplying by $\phi$ and subtracting, which gives us iid errors, and then do MLE on the resulting equation. Note that running OLS will work here, but you won't be able to distinguish the parameter $\phi$ from the $\beta$ parameter; MLE is equivalent (linearly) to OLS and allows us to discern the two.

1.10 Final Review before Exam

Variances:
$$ TSS = RSS + ESS, \qquad Var(\hat{\beta}) = E[(\hat{\beta} - \beta)(\hat{\beta} - \beta)']. $$

Roots. For
$$ y_t = \phi_1 y_{t-1} + \phi_2 y_{t-2} + \epsilon_t, \qquad y_t - \phi_1 L y_t - \phi_2 L^2 y_t = \epsilon_t, $$
the roots solve
$$ Z^2 - \phi_1 Z - \phi_2 = 0. $$
For the same process in differences,
$$ \Delta y_t = \phi_1 \Delta y_{t-1} + \phi_2 \Delta y_{t-2} + \epsilon_t, $$
$$ (1 - \phi_1 L - \phi_2 L^2)\Delta y_t = \epsilon_t, $$
$$ (1 - \phi_1 L - \phi_2 L^2)(y_t - y_{t-1}) = \epsilon_t, $$
$$ (1 - \phi_1 L - \phi_2 L^2)(1 - L) y_t = \epsilon_t, $$
and the roots solve
$$ (Z^2 - \phi_1 Z - \phi_2)(Z^2 - Z) = 0. $$
Note that $Z = 1$ satisfies this equation.

$$ R^2 = \frac{ESS}{TSS} = \frac{\hat{y}'\hat{y}}{y'y} = \frac{TSS - RSS}{TSS} = \frac{y'X\hat{\beta}}{y'y}. $$
$$ s^2 = \frac{RSS}{N-K} = \frac{y'y - y'X\hat{\beta}}{N-K}, \qquad s^2_{ML} = \frac{RSS}{N}. $$

Wald:
$$ \frac{(R\hat{\beta} - q)'\big(R\, s^2 (X'X)^{-1} R'\big)^{-1}(R\hat{\beta} - q)}{r} \sim F(r, N-K). $$
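A numerical sketch of the Wald form with a single hypothetical restriction $R\beta = q$, here $R = [0\ 1\ 0]$ and $q = 0.5$ (all names and values are illustrative):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
N, K = 300, 3
X = np.column_stack([np.ones(N), rng.normal(size=(N, 2))])
beta = np.array([1.0, 0.5, -0.2])
y = X @ beta + rng.normal(size=N)

b = np.linalg.solve(X.T @ X, X.T @ y)
e = y - X @ b
s2 = e @ e / (N - K)
V = s2 * np.linalg.inv(X.T @ X)            # s^2 (X'X)^{-1}

R = np.array([[0.0, 1.0, 0.0]])            # restriction: beta_2 = 0.5 (true here)
q = np.array([0.5])
r = R.shape[0]

d = R @ b - q
W = (d @ np.linalg.solve(R @ V @ R.T, d)) / r
print(W, stats.f.sf(W, r, N - K))          # compare with F(r, N-K)
```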

Likelihood:
$$ L = \prod_{t=1}^T \frac{1}{\sqrt{2\pi\sigma^2}} \exp\Big(-\frac{\epsilon_t^2}{2\sigma^2}\Big). $$

LM test:
$$ I(\psi) = -E\Big[\frac{\partial^2 \log L}{\partial \psi\, \partial \psi'}\Big], \qquad Var(\hat{\psi}) = I(\hat{\psi})^{-1} = -E\Big[\frac{\partial^2 \log L}{\partial \psi\, \partial \psi'}\Big]^{-1}. $$
Under $R(\psi) = 0$,
$$ LM = \frac{\partial \log L}{\partial \psi}'\, I(\hat{\psi})^{-1}\, \frac{\partial \log L}{\partial \psi} \sim \chi^2(q). $$
Note that $I(\psi)$ is block diagonal, with the block corresponding to $\beta$ equal to
$$ \frac{1}{\hat{\sigma}_0^2} \sum z_t z_t'. $$
Thus we can also write the LM stat (seen from the regression of $\epsilon_t$ on $z_t$ and computing $T R^2$):
$$ LM = \frac{1}{\hat{\sigma}_0^2} \Big(\sum z_t \epsilon_t\Big)' \Big(\sum z_t z_t'\Big)^{-1} \Big(\sum z_t \epsilon_t\Big). $$
Note:
$$ \frac{\partial \log L(\hat{\psi}_0)}{\partial \psi} = \frac{1}{\hat{\sigma}_0^2} \Big(\sum z_t \epsilon_t\Big). $$