Introduction to Estimation Methods for Time Series models. Lecture 1


Introduction to Estimation Methods for Time Series Models, Lecture 1. Fulvio Corsi, SNS Pisa. (1 / 19)

Estimation Methods

We will briefly review the following estimation methods:
- LS: Least Squares
- OLS: Ordinary Least Squares
- NLS: Nonlinear Least Squares (idea)
- ML: Maximum Likelihood
- GMM: Generalized Method of Moments (idea)

Econometric Model

Econometrics is the intersection of Economics and Statistics. An econometric model describes an association between y_i and x_i, e.g.:
- personal income y_i and personal IQ x_i
- stock return y_i and market return x_i
- current return y_t and past returns y_{t-h}

The econometric model provides an approximate, i.e. probabilistic, description of the association: the relation is stochastic, not deterministic. Econometrics provides estimation methods for such parametric models.

Estimators

Given a parametric model X ~ f(x; θ_0) with θ_0 ∈ Ω(θ), and a random sample {x_1, x_2, ..., x_n}:

- Estimator: any function T(·) of the random sample, i.e. θ̂ ≡ T(x_1, x_2, ..., x_n).
  Ex: if the x_i are i.i.d.(µ, σ²), then X̄_n = (1/n) Σ_{i=1}^n x_i is an estimator of µ.
- Sampling distribution: the estimator, being a function of the random sample, is itself a random variable.
  Ex: X̄_n ~ N(µ, σ²/n).
- Estimate: a single realization of the statistic on a particular sample.
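The distinction between estimator, estimate, and sampling distribution can be illustrated numerically. A small sketch (not from the slides; all values are made up) using the sample mean as the estimator of µ:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n = 2.0, 1.0, 100

# One sample -> one *estimate*: a single realization of the estimator
x = rng.normal(mu, sigma, size=n)
estimate = x.mean()

# Many samples -> the *sampling distribution* of the estimator X̄_n,
# whose standard deviation should be close to σ/√n = 0.1
estimates = rng.normal(mu, sigma, size=(10_000, n)).mean(axis=1)
print(estimate, estimates.mean(), estimates.std())
```

Repeating the sampling 10,000 times shows the estimator's own randomness: the simulated mean of the estimates is close to µ and their spread close to σ/√n.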

Estimators: Finite-Sample Properties

- Unbiasedness: E[θ̂] = θ_0, i.e. Bias[θ̂] ≡ E[θ̂ − θ_0] = 0.
- Efficiency: an unbiased estimator θ̂_1 is more efficient than an unbiased θ̂_2 if Var[θ̂_1] < Var[θ̂_2].
- Mean squared error: MSE[θ̂] ≡ E[(θ̂ − θ_0)²] = Var[θ̂] + Bias[θ̂]².
- Best Linear Unbiased Estimator (BLUE): a linear function of the data with minimum variance among all linear unbiased estimators.
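The decomposition MSE = Var + Bias² can be checked numerically. A hypothetical example comparing two estimators of the variance σ² of an i.i.d. normal sample: the unbiased s² = Σ(x−x̄)²/(n−1) and the biased σ̂² = Σ(x−x̄)²/n (the data and parameter values are made up):

```python
import numpy as np

rng = np.random.default_rng(1)
sigma2, n, reps = 4.0, 10, 200_000
x = rng.normal(0.0, sigma2**0.5, size=(reps, n))
ss = ((x - x.mean(axis=1, keepdims=True)) ** 2).sum(axis=1)  # Σ(x-x̄)² per sample

for est in (ss / (n - 1), ss / n):
    bias = est.mean() - sigma2              # simulated Bias[θ̂]
    mse = ((est - sigma2) ** 2).mean()      # simulated MSE[θ̂]
    # the identity MSE = Var + Bias² holds term by term
    print(f"bias={bias:+.3f}  var={est.var():.3f}  mse={mse:.3f}")
```

The biased divide-by-n estimator trades a negative bias (≈ −σ²/n) for a smaller variance, which is exactly the trade-off the MSE criterion captures.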

Ordinary Least Squares (OLS): Linear Model

y_i = f(x_{i1}, x_{i2}, ..., x_{iK}) + ε_i = β_0 + β_1 x_{i1} + β_2 x_{i2} + ... + β_K x_{iK} + ε_i,   i = 1, ..., N

where
- y_i: dependent or explained variable (observed)
- x_{ik}: regressors, covariates, or explanatory variables (observed)
- ε_i: error term or random disturbance (unobserved)
- β_k: unknown parameters or regression coefficients (unobserved)

Ordinary Least Squares (OLS): Vector Notation

The model y_i = β_0 + β_1 x_{i1} + β_2 x_{i2} + ... + β_K x_{iK} + ε_i can be written in vector notation as

  y_i = x_i' β + ε_i,   x_i' : 1×K,  β : K×1

and in the even more compact matrix notation

  Y = X β + ε,   Y : N×1,  X : N×K,  β : K×1,  ε : N×1

with Y = [y_1, y_2, ..., y_N]', ε = [ε_1, ε_2, ..., ε_N]', and

       [ x_1' ]   [ 1  x_{1,1}  x_{1,2}  ...  x_{1,K} ]
       [ x_2' ]   [ 1  x_{2,1}  x_{2,2}  ...  x_{2,K} ]
  X =  [  ... ] = [ ...                               ]
       [ x_N' ]   [ 1  x_{N,1}  x_{N,2}  ...  x_{N,K} ]

where the first column of ones corresponds to the intercept β_0.
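A minimal sketch of building the matrix form Y = Xβ + ε in code (all values below are made up for illustration): stack a column of ones next to the K regressors, and the whole system of N equations becomes one matrix product.

```python
import numpy as np

rng = np.random.default_rng(2)
N, K = 5, 2
regressors = rng.normal(size=(N, K))           # x_{i,1}, ..., x_{i,K}
X = np.column_stack([np.ones(N), regressors])  # first column of ones = intercept
beta = np.array([1.0, 0.5, -0.3])              # β_0, β_1, β_2
eps = rng.normal(size=N)                       # ε, unobserved in practice
Y = X @ beta + eps                             # all N equations at once
print(X.shape, Y.shape)
```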

Standard OLS Assumptions

H.1 Strict exogeneity of the regressors: E[ε | X] = 0.
    Note: ε_i does not depend on any x_j, neither past nor future xs.
    E[ε | X] = 0 ⇒ E[ε] = 0, and E[ε | X] = 0 ⇒ E[y | X] = Xβ, i.e. Xβ is the conditional mean of y given X.

H.2 Identification: X is N×K with rank K with probability 1.

H.3 Spherical errors: Var[ε | X] = σ² I_N, i.e.
    homoskedastic errors: Var[ε_i | X] = σ², i = 1, ..., N, and
    uncorrelated errors: Cov[ε_i, ε_j | X] = 0 for i ≠ j.

Univariate OLS

E.g., the univariate regression: y_i = α + β x_i + ε_i, with ε_i ~ N(0, σ²) for all i.

Ordinary Least Squares

Idea: minimize the sum of the squared estimation errors e.

OLS Estimator

Goal: statistical inference on β, e.g. estimating β.

Least squares: find the β that minimizes the sum of squared residuals in Y = Xβ + ε:

  SS = Σ_{i=1}^N ε_i² = ε'ε = (Y − Xβ)'(Y − Xβ) = Y'Y − 2β'X'Y + β'X'Xβ

F.O.C.:  −2X'Y + 2X'Xβ = 0  ⇔  X'(Y − Xβ) = 0  ⇔  X'Xβ = X'Y

OLS estimator:

  β̂ = (X'X)⁻¹ X'Y = (Σ_{i=1}^N x_i x_i')⁻¹ (Σ_{i=1}^N x_i y_i)
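The closed form above is a one-liner in code. A sketch on simulated data (values made up); note that solving the normal equations X'Xβ = X'Y directly is numerically preferable to forming the inverse (X'X)⁻¹ explicitly:

```python
import numpy as np

rng = np.random.default_rng(3)
N = 1_000
beta = np.array([1.0, 2.0, -0.5])
X = np.column_stack([np.ones(N), rng.normal(size=(N, 2))])
Y = X @ beta + rng.normal(size=N)

# Solve the F.O.C. X'X β = X'Y directly
beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)

# Same answer from the dedicated (numerically safer) least-squares routine
beta_ls, *_ = np.linalg.lstsq(X, Y, rcond=None)
print(beta_hat)
```

With N = 1000 observations, β̂ lands close to the true β used in the simulation, as the next slide's unbiasedness result leads one to expect.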

Finite-Sample Properties

Unbiasedness: E[β̂ | X] = β. Indeed,

  β̂ = (X'X)⁻¹X'(Xβ + ε) = β + (X'X)⁻¹X'ε

so

  E[β̂ | X] = β + (X'X)⁻¹X' E[ε | X] = β,   since E[ε | X] = 0 by H.1.

Variance: Var[β̂ | X] = σ²(X'X)⁻¹. Indeed,

  Var[β̂ | X] = (X'X)⁻¹X' Var[ε | X] X(X'X)⁻¹ = σ²(X'X)⁻¹,   since Var[ε | X] = σ² I_N by H.3.

Efficiency (Gauss–Markov theorem): β̂ is BLUE, i.e. Var(β̂ | X) ≤ Var(β̃ | X) for every linear unbiased estimator β̃.
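Both finite-sample results can be verified by Monte Carlo (an illustrative sketch, not from the slides): hold X fixed, redraw ε many times, and compare the simulated mean and covariance of β̂ with β and σ²(X'X)⁻¹.

```python
import numpy as np

rng = np.random.default_rng(4)
N, sigma = 200, 1.5
beta = np.array([0.5, -1.0])
X = np.column_stack([np.ones(N), rng.normal(size=N)])  # fixed design
XtX_inv = np.linalg.inv(X.T @ X)

reps = 50_000
eps = rng.normal(0.0, sigma, size=(reps, N))           # fresh errors each run
betas = (XtX_inv @ X.T @ (X @ beta + eps).T).T         # β̂ per replication

print(betas.mean(axis=0))   # should be close to β        (unbiasedness, H.1)
print(np.cov(betas.T))      # should be close to σ²(X'X)⁻¹ (variance, H.3)
```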

Population and Estimated Coefficients

Notation:

                     dependent   intercept   slope   error
  true values            y           α         β       ε
  fitted/estimated       ŷ           a         b       e

Projections

Since

  Y = Xβ + ε   (Xβ: explained by the model; ε: random/unexplained)

we have that

  Ŷ = Xβ̂ = X(X'X)⁻¹X'Y = P_X Y,

where P_X = X(X'X)⁻¹X' is called the OLS projection matrix. Moreover,

  e = Y − Xβ̂ = Y − X(X'X)⁻¹X'Y = (I − P_X)Y = M_X Y,

where M_X = I − P_X is called the residual-maker matrix, since M_X Y = e.

Orthogonal Projection

M_X and P_X are
- symmetric: P' = P
- idempotent: P² = PP = P
- mutually orthogonal: P_X M_X = M_X P_X = 0

Hence, OLS partitions Y into two orthogonal parts:

  Y = P_X Y + M_X Y = Ŷ + e = projection + residual
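These three algebraic properties, and the resulting orthogonal decomposition, are easy to confirm numerically (a quick sketch on made-up data):

```python
import numpy as np

rng = np.random.default_rng(5)
N, K = 20, 3
X = np.column_stack([np.ones(N), rng.normal(size=(N, K - 1))])
Y = rng.normal(size=N)

P = X @ np.linalg.inv(X.T @ X) @ X.T   # projection matrix P_X
M = np.eye(N) - P                      # residual-maker matrix M_X
Y_hat, e = P @ Y, M @ Y                # fitted values and residuals

print(np.allclose(P, P.T), np.allclose(P @ P, P))           # symmetric, idempotent
print(np.allclose(P @ M, 0))                                # P_X M_X = 0
print(np.allclose(Y, Y_hat + e), np.isclose(Y_hat @ e, 0))  # Y = Ŷ + e, Ŷ ⟂ e
```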

Orthogonal Projection

[Figure: geometric illustration of Ŷ = P_X Y as the orthogonal projection of Y onto the column space of X, with the residual e = M_X Y orthogonal to it.]

Goodness of Fit

Since Ŷ ⟂ e, we have Var(Y) = Var(Ŷ) + Var(e), i.e.

  TSS/n = ESS/n + RSS/n
  (total variance = explained variance + residual variance)

A common measure of goodness of fit is the coefficient of determination R²:

  R² = Explained Var / Total Var = 1 − Residual Var / Total Var = 1 − RSS/TSS

Since R² always increases when a regressor is added (even one uncorrelated with y), one also uses the

  Adjusted R² = 1 − [Residual Var/(n − K)] / [Total Var/(n − 1)]
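Computing TSS, RSS, R², and the adjusted R² by hand on simulated data (illustrative values; the intercept in X is what guarantees the clean decomposition TSS = ESS + RSS):

```python
import numpy as np

rng = np.random.default_rng(6)
n, K = 100, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, K - 1))])
Y = X @ np.array([1.0, 0.8, -0.4]) + rng.normal(size=n)

beta_hat = np.linalg.lstsq(X, Y, rcond=None)[0]
e = Y - X @ beta_hat

TSS = ((Y - Y.mean()) ** 2).sum()      # total sum of squares
RSS = (e ** 2).sum()                   # residual sum of squares
ESS = TSS - RSS                        # explained sum of squares
R2 = 1 - RSS / TSS
R2_adj = 1 - (RSS / (n - K)) / (TSS / (n - 1))
print(R2, R2_adj)
```

Since (n−1)/(n−K) > 1, the adjusted R² is always below the plain R² (for K > 1), penalizing the mechanical increase from added regressors.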

Goodness of Fit of a Linear Regression

[Figure]

OLS with Normality

If H.4: ε | X ~ N(0, σ² I_N), then

  β̂ ~ N(β, σ²(X'X)⁻¹)

Rao–Blackwell theorem: β̂ is the ML estimator (i.e. the most efficient unbiased estimator).

Hypothesis testing: to make inference on β̂ ~ N(β, σ²(X'X)⁻¹) we need an estimator of σ². Using e ≡ y − ŷ with ŷ ≡ Xβ̂, we can define

  σ̂² = Σ_{i=1}^N e_i² / (N − K) = e'e/(N − K) = RSS/(N − K)

and we can prove that

  E[σ̂²] = σ²   and   RSS/σ² ~ χ²_{N−K}

Hence, denoting V̂ar(β̂) = σ̂²(X'X)⁻¹, we have, for each coefficient,

  (β̂_k − β_k) / √(V̂ar(β̂_k))  ~  N(0, 1) / √(χ²_{N−K}/(N − K))  =  t_{N−K}
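The inference pipeline above, σ̂² from the residuals, standard errors from σ̂²(X'X)⁻¹, then t-statistics, can be sketched end to end on simulated data (all parameter values are made up; the true β is known here only because we simulated it, which lets the t-statistics be formed directly):

```python
import numpy as np

rng = np.random.default_rng(7)
N, sigma = 500, 2.0
beta = np.array([1.0, -0.5])
X = np.column_stack([np.ones(N), rng.normal(size=N)])
Y = X @ beta + rng.normal(0.0, sigma, size=N)
K = X.shape[1]

beta_hat = np.linalg.lstsq(X, Y, rcond=None)[0]
e = Y - X @ beta_hat
sigma2_hat = e @ e / (N - K)                  # σ̂² = RSS/(N-K), unbiased for σ²
se = np.sqrt(sigma2_hat * np.diag(np.linalg.inv(X.T @ X)))  # standard errors
t_stats = (beta_hat - beta) / se              # each ~ t_{N-K} under H.4
print(sigma2_hat, se, t_stats)
```

In applied work one instead tests a hypothesized value (e.g. β_k = 0) in place of the unknown true β_k, comparing the statistic against t_{N−K} critical values.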