Hypothesis Tests within the Maximum Likelihood Framework
Christian Julliard
Department of Economics and FMG, London School of Economics

There are three main frequentist approaches to inference within the Maximum Likelihood framework: the Wald test, the Likelihood Ratio test and the Lagrange Multiplier test. Bayesian inference will not be presented.

Outline
1. Wald Test (linear restrictions for the standard linear regression; Wald test for nonlinear constraints)
2. The Likelihood Ratio Test
3. Lagrange Multiplier Tests (example: LM test in Nonlinear Least Squares)
4. Comparison between the Wald, LR and LM tests
5. Durbin-Watson Test

Key assumptions

We have already seen that even if observations are dependent, the results derived for the MLE in the iid setting carry over for ergodic processes, and we will be assuming that:
1. the MLE of a vector of parameters $\psi$ is consistent;
2. $\sqrt{T}(\hat{\psi} - \psi) \xrightarrow{d} N\big(0, [T^{-1} I(\psi)]^{-1}\big)$, where $I(\psi)$ is the information matrix.

The Wald test

Idea: use the MLE of the unrestricted model. Suppose we have a model with $k$ unknown parameters $\psi$ that delivers the log likelihood $\log L(\psi)$. We know that
$$\sqrt{T}(\hat{\psi} - \psi) \xrightarrow{d} N\big(0, IA(\psi)^{-1}\big), \qquad IA(\psi) = T^{-1} I(\psi), \quad I(\psi) = -E\left[\frac{\partial^2 \log L(\psi)}{\partial \psi\,\partial \psi'}\right].$$

Suppose we want to test a linear hypothesis $H_0: R\psi = q$ vs. $H_A: R\psi \neq q$, where $R$ has $r < k$ linearly independent rows ($r$ restrictions). Then under $H_0$
$$\sqrt{T}\,R(\hat{\psi} - \psi) = \sqrt{T}(R\hat{\psi} - q) \xrightarrow{d} N\big(0, \underbrace{R\,IA(\psi)^{-1} R'}_{r \times r}\big).$$

Recall: if the $n$-dimensional vector $x \sim N(0, A)$, then $x' A^{-1} x \sim \chi^2_n$. This implies that
$$T\,(R\hat{\psi} - q)'\left[R\,IA(\psi)^{-1} R'\right]^{-1}(R\hat{\psi} - q) \xrightarrow{d} \chi^2_r.$$

But: we do not observe $IA(\psi)$. If we can find a consistent estimator, the distribution remains unchanged. Possible estimators are: the empirical information matrix based, $T^{-1} I(\hat{\psi})$, and the empirical hessian based, $-T^{-1}\,\partial^2 \log L(\hat{\psi})/\partial\psi\,\partial\psi'$. Assuming the first is available, then
$$W = T\,(R\hat{\psi} - q)'\left[R\,\big(T^{-1} I(\hat{\psi})\big)^{-1} R'\right]^{-1}(R\hat{\psi} - q) \xrightarrow{d} \chi^2_r.$$

Wald test for the standard linear regression

We showed last time that for the standard linear regression model
$$y_i = x_i'\beta + \varepsilon_i, \quad \varepsilon_i \sim N(0, \sigma^2), \quad i = 1, \dots, n, \quad E[x_i \varepsilon_s] = 0 \;\; \forall s, i,$$
with unknown parameters $(\beta, \sigma^2)$, we have that
$$\sqrt{T}(\hat{\beta} - \beta) \xrightarrow{d} N\left(0, \left(\frac{x'x}{T}\right)^{-1}\hat{\sigma}^2\right).$$

Suppose $\beta = [\beta_1, \beta_2]'$ and we want to test the null $\beta_1 - \beta_2 = q$, or equivalently in matrix form
$$\underbrace{[1, \; -1]}_{R}\underbrace{\begin{bmatrix}\beta_1 \\ \beta_2\end{bmatrix}}_{\beta} = q.$$

Noting that under the null
$$\sqrt{T}\,R(\hat{\beta} - \beta) = \sqrt{T}(R\hat{\beta} - q) \xrightarrow{d} N\left(0, R\left(\frac{x'x}{T}\right)^{-1}\hat{\sigma}^2 R'\right),$$
we can construct the Wald test as
$$W = T\,(R\hat{\beta} - q)'\left[R\left(\frac{x'x}{T}\right)^{-1}\hat{\sigma}^2 R'\right]^{-1}(R\hat{\beta} - q) = \frac{(\hat{\beta}_1 - \hat{\beta}_2 - q)^2}{R\,(x'x)^{-1}\hat{\sigma}^2\,R'} \xrightarrow{d} \chi^2_1.$$
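The Wald construction for the linear restriction can be sketched numerically. This is a minimal illustration, not part of the lecture: it simulates data satisfying $\beta_1 - \beta_2 = 0$ (all variable names and the data-generating process are illustrative) and computes $W$ directly from the OLS/ML estimates.

```python
import numpy as np

# Illustrative sketch: Wald test of H0: beta1 - beta2 = q (with q = 0)
# in y = X beta + eps, on simulated data.
rng = np.random.default_rng(0)
T = 500
X = rng.normal(size=(T, 2))
beta_true = np.array([1.0, 1.0])              # satisfies beta1 - beta2 = 0
y = X @ beta_true + rng.normal(size=T)

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)  # OLS = MLE under normality
resid = y - X @ beta_hat
sigma2_hat = resid @ resid / T                # ML estimate of sigma^2

R = np.array([[1.0, -1.0]])                   # restriction matrix
q = np.array([0.0])
V = sigma2_hat * np.linalg.inv(X.T @ X)       # estimated Var(beta_hat)
diff = R @ beta_hat - q
W = float(diff @ np.linalg.solve(R @ V @ R.T, diff))

print(W)  # compare with the chi2(1) 5% critical value, 3.841
```

Under the null, $W$ exceeds 3.841 only about 5% of the time in large samples.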

Wald test for nonlinear constraints

Consider $H_0: R(\psi) = 0$, a set of $r$ linear or nonlinear constraints ($R(\psi)$ is a column $r$-vector). Let
$$\tilde{R}(\psi) = \frac{\partial R(\psi)}{\partial \psi'}$$
be a well defined $r \times k$ matrix ($k$ is the number of parameters in $\psi$). Then, under $H_0$, the statistic
$$W = R(\hat{\psi})'\left[\tilde{R}(\hat{\psi})\,I(\hat{\psi})^{-1}\,\tilde{R}(\hat{\psi})'\right]^{-1} R(\hat{\psi}) \xrightarrow{d} \chi^2_r.$$
Intuition: Delta Method/Taylor expansion.

Example: nonlinear restrictions for the SLR. Suppose $\beta = [\beta_1, \beta_2]'$ and, in the standard linear regression model, we want to test the null
$$\underbrace{\tfrac{1}{2}\beta_1^2 + \tfrac{1}{2}\beta_2^2 - q}_{R(\psi)} = 0.$$
Noticing that
$$\frac{\partial R(\psi)}{\partial \psi'} = [\beta_1, \; \beta_2],$$
the Wald statistic becomes
$$W = \frac{\left(\tfrac{1}{2}\hat{\beta}_1^2 + \tfrac{1}{2}\hat{\beta}_2^2 - q\right)^2}{[\hat{\beta}_1, \; \hat{\beta}_2]\,(x'x)^{-1}\hat{\sigma}^2\begin{bmatrix}\hat{\beta}_1 \\ \hat{\beta}_2\end{bmatrix}} \xrightarrow{d} \chi^2_1.$$

The Likelihood Ratio Test

Again suppose the model can be expressed in terms of a likelihood function $L(\psi)$. Suppose we also have a set of $r$ restrictions, either linear or nonlinear, i.e. $R\psi = q$ or $R(\psi) = 0$.

Idea:
1. Estimate the unrestricted model to obtain ML estimates $\hat{\psi}$ and $L(\hat{\psi})$.
2. Estimate the model under the restrictions to obtain restricted estimates $\hat{\psi}_0$ and $L(\hat{\psi}_0)$.
3. Then compare $L(\hat{\psi})$ and $L(\hat{\psi}_0)$.

It can be shown that under the null
$$LR = -2\log\left\{\frac{L(\hat{\psi}_0)}{L(\hat{\psi})}\right\} = 2\left\{\log L(\hat{\psi}) - \log L(\hat{\psi}_0)\right\} \xrightarrow{d} \chi^2_r.$$

If the data conform with the null, you expect $L(\hat{\psi})$ to be close to $L(\hat{\psi}_0)$ and LR to be close to 0. If the data do not conform, you expect $L(\hat{\psi}) \gg L(\hat{\psi}_0)$ and $LR \gg 0$. Hence the test is to reject $H_0$ at the $\alpha$ level if $LR > \chi^2_\alpha(r)$.
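The three-step LR recipe can be sketched in a few lines. This illustration (not from the lecture) tests $\beta_1 = \beta_2$ in a linear regression, using the standard maximized Gaussian log-likelihood $\log L = -\tfrac{T}{2}(\log 2\pi + \log \hat{\sigma}^2 + 1)$; the restricted model is estimated by regressing $y$ on $x_1 + x_2$. All data and names are illustrative.

```python
import numpy as np

# Illustrative sketch: LR test of H0: beta1 = beta2 on simulated data.
rng = np.random.default_rng(1)
T = 400
X = rng.normal(size=(T, 2))
y = X @ np.array([0.5, 0.5]) + rng.normal(size=T)   # H0 holds in the DGP

def gauss_loglik(y, X):
    """Maximized Gaussian log-likelihood of y = X b + e, e ~ N(0, s2)."""
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    e = y - X @ b
    s2 = e @ e / len(y)
    return -0.5 * len(y) * (np.log(2 * np.pi) + np.log(s2) + 1.0)

ll_u = gauss_loglik(y, X)                                 # unrestricted
ll_r = gauss_loglik(y, X.sum(axis=1, keepdims=True))      # restricted
LR = 2.0 * (ll_u - ll_r)                                  # ~ chi2(1) under H0
print(LR)
```

Because the restricted model is nested in the unrestricted one, $LR \geq 0$ by construction.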

Lagrange Multiplier Tests

Again suppose the model can be expressed in terms of a likelihood function $L(\psi)$ and that we have $r$ restrictions $R(\psi) = 0$. If the restrictions are valid, $\hat{\psi}_0$ (the MLE of the restricted model) will be close to $\hat{\psi}$ (the MLE of the unrestricted model), and the partial derivatives in the score vector $\partial \log L(\hat{\psi}_0)/\partial \psi$ will also be close to zero (note: $\partial \log L(\hat{\psi})/\partial \psi = 0$ by construction).

It can be shown that under the null the quadratic form
$$LM = T^{-1}\left[\frac{\partial \log L(\hat{\psi}_0)}{\partial \psi}\right]' IA(\psi_0)^{-1}\left[\frac{\partial \log L(\hat{\psi}_0)}{\partial \psi}\right] \xrightarrow{d} \chi^2_r.$$

As usual, we normally do not know $IA(\psi_0)$ and this must be replaced by a consistent estimate. Assuming that $T^{-1} I(\hat{\psi}_0)$ (or a consistent alternative) is available, then
$$LM = \left[\frac{\partial \log L(\hat{\psi}_0)}{\partial \psi}\right]' I(\hat{\psi}_0)^{-1}\left[\frac{\partial \log L(\hat{\psi}_0)}{\partial \psi}\right] \xrightarrow{d} \chi^2_r \qquad (2)$$
and is referred to as a Lagrange Multiplier statistic.

The LM test in Nonlinear Least Squares

This result can be specialized for nonlinear least squares problems. Thus we have
$$y_t = g(x_t; \beta) + \varepsilon_t, \quad \varepsilon_t \overset{iid}{\sim} N(0, \sigma^2), \quad x_t \text{ independent of } \varepsilon_t, \quad t = 1, \dots, T.$$
Then the unrestricted log likelihood has the form
$$\log L(\beta, \sigma^2) = -\frac{T}{2}\log 2\pi - \frac{T}{2}\log \sigma^2 - \frac{1}{2\sigma^2}\sum_{t=1}^{T}\varepsilon_t(\beta)^2, \qquad \varepsilon_t(\beta) = y_t - g(x_t; \beta).$$

Assume that the $r$ restrictions involve only $\beta$ (not $\sigma^2$): $R(\beta) = 0$. Then
$$\frac{\partial \log L(\beta, \sigma^2)}{\partial \beta} = \frac{1}{\sigma^2}\sum_t z_t\,\varepsilon_t, \qquad z_t = -\frac{\partial \varepsilon_t(\beta)}{\partial \beta}, \qquad (3)$$
and, as before, $I(\psi) = -E\left[T^{-1}\,\partial^2 \log L(\psi)/\partial\psi\,\partial\psi'\right]$. But as $\sigma^2$ is not in the restriction, the information matrix is block diagonal, so we can consider only the sub-matrix associated with $\beta$. Since $x_t$ is independent of $\varepsilon_t$,
$$I_{\beta\beta}(\psi) = -E\left[\frac{\partial^2 \log L}{\partial \beta\,\partial \beta'}\right] = \frac{1}{\sigma^2}\,E\left[z_t z_t'\right]. \qquad (4)$$

The LM test in Nonlinear Least Squares (continued)

Evaluating the LM statistic at $(\hat{\beta}_0, \hat{\sigma}_0^2)$, where $\hat{\sigma}_0^2 = T^{-1}\sum_t \varepsilon_t^2(\hat{\beta}_0)$, and replacing the expectations with their sample analogs,
$$LM = \frac{1}{\hat{\sigma}_0^2}\left[\sum_t z_t\,\varepsilon_t\right]'\left[\sum_t z_t z_t'\right]^{-1}\left[\sum_t z_t\,\varepsilon_t\right].$$

By inspection, LM is related to the regression of $\varepsilon_t$ on $z_t$, i.e. $\varepsilon_t = z_t'\gamma + u_t$, with $\hat{\gamma} = \left(\sum_t z_t z_t'\right)^{-1}\sum_t z_t\,\varepsilon_t$. Define the fitted values for such a regression: $\eta_t = z_t'\hat{\gamma}$.

Now consider the $R^2$ from this regression:
$$T R^2 = T\,\frac{\sum_t \eta_t^2}{\sum_t \varepsilon_t^2} = \frac{\eta'\eta}{T^{-1}\,\varepsilon'\varepsilon} = \frac{\varepsilon' z\left[z'z\right]^{-1} z'z\left[z'z\right]^{-1} z'\varepsilon}{\hat{\sigma}_0^2} = LM.$$

Hence a valid LM statistic can always be obtained by regressing $\varepsilon_t(\hat{\psi}_0)$ on $z_t(\hat{\psi}_0)$ and calculating $LM = T R^2$. Then reject $H_0$ at the $\alpha$ level if $LM > \chi^2_\alpha(r)$. Intuition: if $\hat{\beta}_0$ is close to $\hat{\beta}$, then $\varepsilon_t(\hat{\beta}_0)$ shouldn't be forecastable.

Comparison between the Wald, LR and LM tests

All three tests are asymptotically equivalent. Warning: these are asymptotic distribution results, so caution should be used in small samples. In small samples (but there are exceptions):
1. In general the LR test is the best, in the sense that its finite sample behavior most closely approximates its expected large sample properties.
2. The Wald test is second best and the LM procedure worst.

The Durbin-Watson Test

The Durbin-Watson test is the only test for which we have small sample properties. Unfortunately, the circumstances in which it is valid are so restricted that it is almost always inappropriate. The model:
$$y_t = x_t'\beta + u_t, \qquad u_t = \phi u_{t-1} + \varepsilon_t, \quad \varepsilon_t \overset{iid}{\sim} N(0, \sigma^2).$$
We want to test $H_0: \phi = 0$ against $H_A: \phi > 0$.
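The equivalence of the score form and the $T R^2$ form of the LM statistic can be checked numerically. This sketch (not from the lecture) uses the linear special case $g(x_t; \beta) = x_t'\beta$, so that $z_t = x_t$, and tests $H_0: \beta_2 = 0$; data and names are illustrative.

```python
import numpy as np

# Illustrative sketch: LM test of H0: beta2 = 0 in the linear special case,
# computed both in score form and as T * R^2 from the auxiliary regression.
rng = np.random.default_rng(2)
T = 600
X = rng.normal(size=(T, 2))
y = X @ np.array([1.0, 0.0]) + rng.normal(size=T)    # H0 holds in the DGP

# 1. Restricted estimation (drop x2) and restricted residuals eps_t(beta0_hat).
x1 = X[:, [0]]
b0 = np.linalg.lstsq(x1, y, rcond=None)[0]
e0 = y - x1 @ b0
s2_0 = e0 @ e0 / T                                    # restricted ML variance

# 2. Score form: (1/s2_0) (z'e)' (z'z)^{-1} (z'e), with z_t = x_t here.
score = X.T @ e0
LM_score = float(score @ np.linalg.solve(X.T @ X, score)) / s2_0

# 3. T*R^2 form: regress restricted residuals on z_t, uncentered R^2.
g = np.linalg.lstsq(X, e0, rcond=None)[0]
fitted = X @ g
LM_tr2 = T * (fitted @ fitted) / (e0 @ e0)

print(LM_score, LM_tr2)   # the two forms coincide
```

The uncentered $R^2$ is the right one here, matching the $\eta'\eta / \varepsilon'\varepsilon$ ratio in the derivation above.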

Under the null, estimate the model by least squares and calculate the test statistic
$$d = \frac{\sum_{t=2}^{T}(\hat{u}_t - \hat{u}_{t-1})^2}{\sum_{t=1}^{T}\hat{u}_t^2} = \frac{\sum_{t=2}^{T}\hat{u}_t^2 + \sum_{t=2}^{T}\hat{u}_{t-1}^2 - 2\sum_{t=2}^{T}\hat{u}_t\hat{u}_{t-1}}{\sum_{t=1}^{T}\hat{u}_t^2}.$$

Note: $d \approx 2(1 - r)$, where $r$ is the simple correlation between $\hat{u}_t$ and $\hat{u}_{t-1}$, so $d$ lies in the interval $[0, 4]$. Unfortunately, the exact distribution of $d$ depends on $X$. But $d$ is subject to an upper bound $d_U$ and a lower bound $d_L$ that depend on both the sample size and the number of regressors. We are testing against positive serial correlation, so we reject if $d$ is too small: if $d < d_L$, reject; if $d > d_U$, fail to reject; if $d_L < d < d_U$, the test is inconclusive.

Note: to be valid, (i) the regression must contain a constant, and (ii) all RHS variables must be distributed independently of the errors.
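Computing $d$ from the OLS residuals takes one line. A minimal sketch on simulated data with no serial correlation (all names illustrative; the bounds $d_L$ and $d_U$ would be looked up in published Durbin-Watson tables, not computed here):

```python
import numpy as np

# Illustrative sketch: the Durbin-Watson d statistic from OLS residuals.
rng = np.random.default_rng(3)
T = 200
X = np.column_stack([np.ones(T), rng.normal(size=T)])  # constant included
y = X @ np.array([1.0, 2.0]) + rng.normal(size=T)      # phi = 0 in the DGP

b = np.linalg.lstsq(X, y, rcond=None)[0]
u = y - X @ b
d = np.sum(np.diff(u) ** 2) / np.sum(u ** 2)           # sum of squared changes
print(d)   # typically near 2 when phi = 0, since d ~ 2(1 - r)
```

By the identity $d \approx 2(1 - r)$, values well below 2 point toward positive serial correlation.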