Econometrics I: Univariate Time Series Econometrics (1)

Econometrics I: Dipartimento di Economia Politica e Metodi Quantitativi, University of Pavia

Overview of the Lecture
1st EViews Session VI: Some Theoretical Premises
2nd EViews Session VII: An AR model for the Italian Unemployment Rate
3rd EViews Session VIII: Simulation of ARMA models
4th EViews Session IX: The Box-Jenkins Empirical Analysis + Exercises

EViews Session VI: Some Theoretical Premises

Autocovariance. The j-th autocovariance of $Y_t$ is given by
$$\mathrm{Cov}[Y_t, Y_{t-j}] \equiv \gamma_{t,t-j} \equiv E\big[(Y_t - E[Y_t])(Y_{t-j} - E[Y_{t-j}])\big],$$
and correspondingly the variance of $Y_t$ is defined as
$$V[Y_t] \equiv \gamma_{t,t} \equiv E\big[(Y_t - E[Y_t])^2\big].$$

Autocorrelation. The j-th autocorrelation of $Y_t$ is given by
$$\mathrm{Corr}[Y_t, Y_{t-j}] \equiv \rho_{t,t-j} \equiv \frac{\mathrm{Cov}[Y_t, Y_{t-j}]}{V[Y_t]^{1/2}\, V[Y_{t-j}]^{1/2}} = \frac{\gamma_{t,t-j}}{\gamma_{t,t}^{1/2}\, \gamma_{t-j,t-j}^{1/2}}.$$
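As a worked illustration (added here, not part of the original slides), consider a zero-mean stationary AR(1) process; its autocovariances and autocorrelations have a simple closed form:
$$Y_t = \phi Y_{t-1} + \epsilon_t, \quad \epsilon_t \sim WN(0,\sigma^2), \quad |\phi|<1,$$
$$\gamma_j = \frac{\phi^j \sigma^2}{1-\phi^2}, \qquad \rho_j = \frac{\gamma_j}{\gamma_0} = \phi^j, \qquad j = 0, 1, 2, \dots$$
For $\phi = 0.5$ this gives $\rho_1 = 0.5$, $\rho_2 = 0.25$, $\rho_3 = 0.125$: the autocorrelations decay geometrically.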

EViews Session VI: Some Theoretical Premises

Covariance Stationarity. A time series $\{Y_t\}_{t=-\infty}^{\infty}$ is called covariance stationary, weakly stationary, or second-order stationary if
$$E[Y_t] = \mu_Y < \infty, \qquad V[Y_t] = \gamma_{t,t} = \sigma^2_Y < \infty, \qquad \mathrm{Cov}[Y_t, Y_{t-j}] = \gamma_{t,t-j} = \gamma_j < \infty, \qquad \text{for all } t \text{ and } j.$$

Autocovariance Function. The autocovariance function (ACVF) of a covariance stationary process $\{Y_t\}$ is the sequence of autocovariances $\gamma_j$ for all $j = 0, 1, 2, \dots$.

Autocorrelation Function. The autocorrelation function (ACF) of a covariance stationary process $\{Y_t\}$ is the sequence of autocorrelations $\rho_j$ for all $j = 0, 1, 2, \dots$.
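A standard counter-example (added for clarity) is the random walk, which violates the second condition: with $Y_0$ fixed,
$$Y_t = Y_{t-1} + \epsilon_t = Y_0 + \sum_{s=1}^{t}\epsilon_s, \qquad V[Y_t] = t\,\sigma^2,$$
so the variance grows with $t$ and the process is not covariance stationary.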

EViews Session VI: Some Theoretical Premises

Empirical Autocorrelation Function. The empirical (or sample) autocorrelation function of a time series $Y_t$ is the sequence of sample autocorrelation coefficients $\hat\rho_j$ for all $j = 0, 1, 2, \dots$, where
$$\hat\rho_j = \frac{\hat\gamma_j}{\hat\gamma_0} = \frac{\sum_{t=j+1}^{T}(Y_t - \bar Y)(Y_{t-j} - \bar Y)}{\sum_{t=1}^{T}(Y_t - \bar Y)^2}$$
with
$$\hat\gamma_j = \frac{1}{T}\sum_{t=j+1}^{T}(Y_t - \bar Y)(Y_{t-j} - \bar Y), \qquad \bar Y = \frac{1}{T}\sum_{t=1}^{T} Y_t.$$
The graphical depiction of the empirical autocorrelation function is called the autocorrelogram.
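To make the formula concrete, here is a small numerical illustration with invented data: take the four observations $Y = (1, 2, 3, 4)$, so $T = 4$ and $\bar Y = 2.5$. Then
$$\hat\gamma_0 = \tfrac{1}{4}\big[(-1.5)^2 + (-0.5)^2 + (0.5)^2 + (1.5)^2\big] = 1.25,$$
$$\hat\gamma_1 = \tfrac{1}{4}\big[(-0.5)(-1.5) + (0.5)(-0.5) + (1.5)(0.5)\big] = 0.3125, \qquad \hat\rho_1 = \frac{0.3125}{1.25} = 0.25.$$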

EViews Session VI: Some Theoretical Premises

Partial Autocorrelation Function. Consider the autoregressive process of order p:
$$Y_t = c + \phi_{1p} Y_{t-1} + \phi_{2p} Y_{t-2} + \dots + \phi_{pp} Y_{t-p} + \epsilon_t,$$
where $\phi_{ip}$, $i \le p$, is the coefficient on the lagged value $Y_{t-i}$ in the AR(p) model. The sequence of the last coefficients $\phi_{11}, \phi_{22}, \dots, \phi_{pp}$ is called the partial autocorrelation function (PACF).

The partial autocorrelation function is, besides the autocorrelation function, the second major explorative tool in time series analysis and is also used to describe the properties of linear time series processes. In order to understand its meaning, consider a linear model that explains $Y_t$ by its $j$ most recent lagged values:
$$Y_t = \phi_{1j} Y_{t-1} + \phi_{2j} Y_{t-2} + \dots + \phi_{jj} Y_{t-j} + u_t = X_t'\phi_j + u_t \qquad (1)$$
with $X_t = (Y_{t-1}, Y_{t-2}, \dots, Y_{t-j})'$ and $\phi_j = (\phi_{1j}, \phi_{2j}, \dots, \phi_{jj})'$. Without loss of generality we assume that $Y_t$ is centered, so that no intercept occurs.

EViews Session VI: Some Theoretical Premises

Assume now that the time series is generated by an AR(p) process with $p < j$. In this case only the first $p$ lagged values of $Y_t$ are useful for predicting $Y_t$, and the remaining $j - p$ coefficients are zero. In particular the coefficient on the most distant lag $Y_{t-j}$, $\phi_{jj}$, which is the $j$-th partial autocorrelation coefficient, is zero. Since this result holds for every integer $j > p$, all partial autocorrelation coefficients $\phi_{p+1,p+1}, \phi_{p+2,p+2}, \dots$ are zero. On the other hand, if the series is generated by an AR(p) with $p \ge j$, the coefficient $\phi_{jj}$ must be different from zero.

If instead we consider an MA(q) process, the pattern is reversed: the autocorrelations cut off after lag $q$, while the partial autocorrelations never cut off and only die out gradually, so that $\phi_{jj}$ is in general different from zero at every $j$. (See page 151 onwards of the book I gave you for some examples.)
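As a concrete illustration (added for clarity), compare the two first-order cases:
$$\text{AR(1)}: \; Y_t = \phi Y_{t-1} + \epsilon_t \;\Rightarrow\; \rho_j = \phi^j \text{ (decays)}, \quad \phi_{11} = \phi, \;\; \phi_{jj} = 0 \text{ for } j > 1 \text{ (cuts off)};$$
$$\text{MA(1)}: \; Y_t = \epsilon_t + \theta\epsilon_{t-1} \;\Rightarrow\; \rho_1 = \frac{\theta}{1+\theta^2}, \;\; \rho_j = 0 \text{ for } j > 1 \text{ (cuts off)}, \quad \phi_{jj} \text{ decays gradually}.$$
This mirror pattern between ACF and PACF is exactly what the Box-Jenkins identification step in Session IX exploits.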

EViews Session VII: An AR model for the Italian Unemployment Rate

a) Open the workfile UTS_mod.WF1. You find the following variables:

Variable   Description
u          Italian unemployment rate, 1960-1999
lwp        logs of the real wage, 1960-1999

b) Estimate the following white noise model,
$$u_t = c + \varepsilon_t, \qquad \varepsilon_t \sim \text{n.i.d.}(0, \sigma^2),$$
where $u_t$ is normally and independently distributed over time with constant variance and constant mean. Is it an appropriate model for the unemployment rate?...
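Before answering, here is a minimal EViews sketch for estimating this constant-only model by OLS (the equation name eq_wn is an arbitrary choice for this sketch; the workfile UTS_mod.WF1 is assumed to be open):

' estimate the white noise (constant-only) model for u by OLS
equation eq_wn.ls u c
show eq_wn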

EViews Session VII: An AR model for the Italian Unemployment Rate

... NO! The white noise model does not fit the actual data for u because it does not capture the most common characteristic of the time series: PERSISTENCE. In fact, the actual u is far more persistent than the simple WN process, moving below and above the natural rate of about 7%. To get an idea, generate a corresponding artificial series with the same sample mean and standard deviation as the historical u:

genr uaswn = 6.94 + 3.13 * nrnd
genr meanline = 6.94
plot u uaswn meanline

EViews Session VII: An AR model for the Italian Unemployment Rate

[Figure: the actual unemployment rate u plotted together with the artificial white noise series uaswn and the mean line.]

EViews Session VII: An AR model for the Italian Unemployment Rate

c) Now consider another stochastic process, the AR(1) model,
$$u_t = c + \alpha u_{t-1} + \varepsilon_t, \qquad \varepsilon_t \sim \text{n.i.d.}(0, \sigma^2),$$
where $u_t$ is not independently distributed over time because it depends on $u_{t-1}$. We can estimate c and α by OLS. What can we conclude? What do the diagnostic tests say (no autocorrelation, no heteroskedasticity, normality)?...
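A minimal EViews sketch for this step (the equation and series names are illustrative, and the exact diagnostic views may differ slightly across EViews versions):

' estimate the AR(1) model for u by OLS
equation eq_ar1.ls u c u(-1)
' store the residuals and inspect their correlogram and histogram (Jarque-Bera normality test)
eq_ar1.makeresids res_ar1
res_ar1.correl(12)
res_ar1.hist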

EViews Session VII: An AR model for the Italian Unemployment Rate

The estimate of the α parameter is very close to one; the AR(1) model fits unemployment quite well.

Diagnostic tests:
Under the null:            AR(1) residuals
no autocorrelation         rejected
no heteroskedasticity      not rejected
normality                  not rejected

EViews Session VII: An AR model for the Italian Unemployment Rate

d) Expand the previous model by estimating the AR(2) model,
$$u_t = c + \alpha_1 u_{t-1} + \alpha_2 u_{t-2} + \varepsilon_t, \qquad \varepsilon_t \sim \text{n.i.d.}(0, \sigma^2),$$
where there is one more parameter and the dynamics are extended to the second lag. What do you find?

The residual tests are all fine (white noise errors); the AR(2) model fits equally well; the sum of the two α estimates is close to one.
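A possible EViews sketch for this step (names are illustrative; @coefs is assumed to index the estimated coefficients in the order constant, u(-1), u(-2)):

' estimate the AR(2) model for u by OLS
equation eq_ar2.ls u c u(-1) u(-2)
' sum of the two autoregressive coefficients (a value close to one hints at non-stationarity)
scalar sum_alpha = eq_ar2.@coefs(2) + eq_ar2.@coefs(3)
show sum_alpha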

EViews Session VII: An AR model for the Italian Unemployment Rate

e) When we deal with real data, not all economic series are untrended; in the case of trended variables we must introduce deterministic components into our statistical models in order to (potentially) account for this further feature. Consider the initial white noise model and include a deterministic trend to model the logs of the real wage:
$$lwp_t = c + \beta t + \varepsilon_t, \qquad \varepsilon_t \sim \text{n.i.d.}(0, \sigma^2).$$
What do the residual diagnostics look like? If you then include an AR(1) dynamics, what are your findings?
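A minimal EViews sketch for both specifications (the equation names are illustrative; @trend generates a deterministic time trend):

' deterministic trend only
equation eq_trend.ls lwp c @trend
' deterministic trend plus AR(1) dynamics
equation eq_trend_ar1.ls lwp c @trend lwp(-1)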

EViews Session VII: An AR model for the Italian Unemployment Rate

1) Diagnostic tests:
Under the null:            Only trend residuals
no autocorrelation         rejected
no heteroskedasticity      rejected
normality                  not rejected

EViews Session VII: An AR model for the Italian Unemployment Rate

2) Diagnostic tests:
Under the null:            Trend + AR(1) residuals
no autocorrelation         not rejected
no heteroskedasticity      not rejected
normality                  not rejected

EViews Session VII: An AR model for the Italian Unemployment Rate

Some final comments:
Despite the inclusion of a deterministic trend, the persistence of lwp still requires AR dynamics.
In general, many economic series can be represented by AR models of different orders, with or without deterministic trends.
The (sum of the) AR parameter estimates is very often close to one... (evidence of non-stationarity)...

EViews Session VIII: Simulation of ARMA models

EViews program for simulating an AR(1) model:

' determine AR and constant coefficients
!phi1=0.5
!c=0.1
' error term
series u=nrnd
' initialization of time series
smpl 1 4000
series ar1=!c/(1-!phi1)
' generation of time series
smpl 2 4000
series ar1=!c+!phi1*ar1(-1)+u
smpl 1 4000
delete u

EViews Session VIII: Simulation of ARMA models

EViews program for simulating an MA(1) model:

' determine constant and MA coefficients
!mu=0.1
!theta=-0.5
' innovation term
series z=nrnd
' initialization
series eps=z
series r=!mu
' generation of time series
smpl 2 4000
eps=z
r=!mu+!theta*eps(-1)+eps
smpl 1 4000
delete z

EViews Session VIII: Simulation of ARMA models

EViews program for simulating an ARMA(1,1) model:

' determine AR, MA and constant coefficients
!phi1=0.6
!theta1=0.4
!c=0.1
' error term
series u1=nrnd
' initialization of time series
smpl 1 4000
series eps1=u1
series arma1=!c/(1-!phi1)
' generation of time series
smpl 2 4000
series arma1=!c+!phi1*arma1(-1)+!theta1*eps1(-1)+eps1
smpl 1 4000
delete u1 eps1
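To connect the simulation with estimation (and with Exercise 1 below), one can re-estimate an ARMA(1,1) on the simulated series; a minimal sketch using EViews' built-in AR and MA terms (the equation name is illustrative):

' estimate an ARMA(1,1) on the simulated series
smpl 1 4000
equation eq_sim.ls arma1 c ar(1) ma(1)
' with 4000 observations the estimated AR and MA coefficients should be close to 0.6 and 0.4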

EViews Session IX: The Box-Jenkins Empirical Analysis

Model Identification
1. Identification of the appropriate order of differencing d in an ARIMA(p, d, q).
2. Identification of the lag orders p and q of an ARMA model. Calculation of descriptive statistics of the time series: mean, variance, ACF, PACF.
3. Comparison of the descriptive statistics with their theoretical counterparts that would hold true if a certain model were adequate.

Model Estimation
Estimation of the ARMA(p, q) models.

Model Evaluation
Evaluation of the estimation results and diagnostics:

EViews Session IX: The Box-Jenkins Empirical Analysis

1. Having estimated an ARMA model, it is common to check whether the selected order of the process is correct. In case of dynamic misspecification the residuals $e_t$ should reveal some type of autocorrelation. Testing for the white noise property of $e_t$ can be done by applying a portmanteau test to the residuals. Note that we actually test for the absence of autocorrelation only up to a chosen lag order k. Thus, a rejection of the null rejects that the residual series is white noise, whereas if we cannot reject the null, further tests are necessary to verify whether the residual series is indeed white noise. The Ljung-Box statistic for a test of absence of autocorrelation up to order k, based on the residuals of an ARMA(p, q) model, is

EViews Session IX: The Box-Jenkins Empirical Analysis

$$Q_{LB}(k) = T(T+2)\sum_{j=1}^{k}\frac{\hat\rho_j(e)^2}{T-j} \;\overset{a}{\sim}\; \chi^2(k-p-q),$$
where $\hat\rho_j(e)$ denotes the j-th empirical autocorrelation coefficient of the residual time series $e_t$.

Diagnostics based on residuals:
Lagrange Multiplier test of an AR(p) model against an AR(p+r) model (Godfrey, 1978). Auxiliary regression:
$$\hat e_t = \alpha_1 Y_{t-1} + \dots + \alpha_p Y_{t-p} + \beta_1 \hat e_{t-1} + \dots + \beta_r \hat e_{t-r} + \nu_t, \qquad (2)$$
where $\hat e_t$ are the residuals of the AR(p) model. The LM test statistic tests the joint significance of the parameters $\beta_1, \dots, \beta_r$:
$$LM = T R^2 \;\sim\; \chi^2(r),$$
where $R^2$ is the (uncentered) coefficient of determination from (2).
Diagnostic testing for homoskedasticity of the residuals: White test, ARCH test (more details later).
Diagnostic testing for normality of the residuals: Jarque-Bera test.
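To illustrate the Ljung-Box statistic with invented numbers: suppose an AR(1) model (p = 1, q = 0) is estimated on T = 100 observations, k = 4 lags are checked, and the residual autocorrelations are $\hat\rho_1 = 0.10$, $\hat\rho_2 = -0.05$, $\hat\rho_3 = 0.08$, $\hat\rho_4 = 0.02$. Then
$$Q_{LB}(4) = 100 \cdot 102 \left(\frac{0.10^2}{99} + \frac{0.05^2}{98} + \frac{0.08^2}{97} + \frac{0.02^2}{96}\right) \approx 2.01,$$
to be compared with the 5% critical value of a $\chi^2(4-1-0) = \chi^2(3)$ distribution, about 7.81, so the null of no residual autocorrelation is not rejected.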

EViews Session IX: The Box-Jenkins Empirical Analysis

2. Model selection by evaluating the in-sample fit. Principle: compare the in-sample fit, measured by the residual variance, with the number of estimated parameters. Models are selected by minimizing the information criteria.
Akaike Information Criterion (AIC):
$$AIC(k) = T \ln \hat\sigma^2 + 2k$$
Bayes Information Criterion (BIC), or Schwarz Information Criterion (SIC):
$$BIC(k) = T \ln \hat\sigma^2 + k \ln T,$$
with $\hat\sigma^2 = \frac{1}{T}\sum_{t=1}^{T} e_t^2$ and k the number of estimated parameters.

Respecification of the Model if Necessary
Repeat all previous steps.

Forecasting
...in the following days.
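As a closing numerical illustration of the information criteria (the figures are invented): suppose T = 40 and two candidate models give $\hat\sigma^2_{AR(1)} = 0.50$ with k = 2 parameters and $\hat\sigma^2_{AR(2)} = 0.48$ with k = 3. Then
$$AIC_{AR(1)} = 40\ln 0.50 + 4 \approx -23.7, \qquad AIC_{AR(2)} = 40\ln 0.48 + 6 \approx -23.4,$$
$$BIC_{AR(1)} = 40\ln 0.50 + 2\ln 40 \approx -20.3, \qquad BIC_{AR(2)} = 40\ln 0.48 + 3\ln 40 \approx -18.3;$$
both criteria are smaller for the AR(1), so the small reduction in residual variance does not justify the extra parameter.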

EViews Session IX: Exercises

Do the following exercises on your own:
1. Simulate and estimate an AR(2), an MA(2) and an ARMA(2,2) model.
2. Find out the mysterious series contained in the workfile mystery.wf1!
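A possible starting point for Exercise 1, adapting the AR(1) simulation program above to the AR(2) case (the coefficient values and the names ar2 and eq_ex1 are arbitrary choices for this sketch):

' determine AR and constant coefficients
!phi1=0.5
!phi2=0.3
!c=0.1
' error term
series u=nrnd
' initialization of time series at the unconditional mean
smpl 1 4000
series ar2=!c/(1-!phi1-!phi2)
' generation of time series (two lags are needed, so start at observation 3)
smpl 3 4000
series ar2=!c+!phi1*ar2(-1)+!phi2*ar2(-2)+u
smpl 1 4000
delete u
' estimate the AR(2) back by OLS
equation eq_ex1.ls ar2 c ar2(-1) ar2(-2)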