Review Session: Econometrics - CLEFIN (20192)

Part II: Univariate time series analysis
Daniele Bianchi
March 20, 2013

Fundamentals: Stationarity

A time series is a sequence of random variables x_t, t = 1, ..., T, usually measured at equal intervals. The building block of time series analysis is the concept of stationarity. There are two main formulations:

Strict stationarity: a time series x_t is said to be strictly stationary if the joint distribution of (x_{t_1}, ..., x_{t_k}) is identical to that of (x_{t_1+m}, ..., x_{t_k+m}) for every shift m, with k an arbitrary positive integer. This is a very strong assumption which is hard to verify empirically.

Weak (or covariance) stationarity: a time series x_t is said to be covariance stationary if E[x_t] = μ for all t and Cov[x_t, x_{t-l}] = γ_l for all t and l, i.e. the mean is constant and the autocovariances depend only on the lag l.

Fundamentals: Stationarity

A simple example of a stationary process is the Gaussian white noise:
x_t = ε_t, ε_t ~ N(0, σ²)
E[x_t] = 0, Var(x_t) = σ², γ_l = 0 for all l ≠ 0
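The moments of the white noise process can be checked with a quick simulation (a minimal sketch; the seed, σ = 1, and T = 100,000 are illustrative choices, not from the slides):

```python
import numpy as np

# Simulate Gaussian white noise x_t = eps_t, eps_t ~ N(0, sigma^2),
# and check E[x_t] = 0, Var(x_t) = sigma^2, gamma_1 = 0.
rng = np.random.default_rng(0)
sigma, T = 1.0, 100_000
x = rng.normal(0.0, sigma, T)

mean = x.mean()
var = x.var()
xc = x - mean
gamma1 = np.dot(xc[1:], xc[:-1]) / T   # sample lag-1 autocovariance

print(round(mean, 2), round(var, 2), round(gamma1, 2))
```

All three sample quantities land within sampling error of their theoretical values.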

Fundamentals Stationarity 4 3 2 1 0 1 2 3 4 0 100 200 300 400 500 600 700 800 900 1000 1 Sample Autocorrelation Function Sample Autocorrelation 0.5 0 0.5 0 2 4 6 8 10 12 14 16 18 20 Lag

A simple autoregressive model: formulation and distributional properties

A simple AR(1) model is defined as
x_t = α + φ x_{t-1} + ε_t, with ε_t ~ iid(0, σ²)

This simple model entails a first-order Markov dependence structure, since
E_{t-1}[x_t] = α + φ x_{t-1}
Var_{t-1}[x_t] = σ²

Let us define E(x_t) = μ and Var(x_t) = γ_0. Under stationarity the unconditional moments are given by
μ = α + φμ, so that μ = α / (1 - φ)
γ_0 = φ² γ_0 + σ², so that γ_0 = σ² / (1 - φ²)
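These unconditional moments can be verified by simulation (a sketch; α = 0.5, φ = 0.9, σ = 1 and the sample size are illustrative choices):

```python
import numpy as np

# Simulate the AR(1) x_t = alpha + phi*x_{t-1} + eps_t and compare sample
# moments with mu = alpha/(1-phi) and gamma0 = sigma^2/(1-phi^2).
rng = np.random.default_rng(1)
alpha, phi, sigma = 0.5, 0.9, 1.0      # illustrative parameter values
T, burn = 200_000, 1_000
eps = rng.normal(0.0, sigma, T + burn)
x = np.empty(T + burn)
x[0] = alpha / (1 - phi)               # start at the unconditional mean
for t in range(1, T + burn):
    x[t] = alpha + phi * x[t - 1] + eps[t]
x = x[burn:]                           # discard burn-in draws

mu_theory = alpha / (1 - phi)          # = 5.0
gamma0_theory = sigma**2 / (1 - phi**2)
print(round(x.mean(), 1), round(x.var(), 1))
```

With these values μ = 5 and γ_0 ≈ 5.26; the sample mean and variance match to within sampling error.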

A simple autoregressive model: stationarity

The standard AR(1) model is stationary if |φ| < 1. Indeed, writing the model as
(1 - φL) x_t = ε_t,
the characteristic equation 1 - φz = 0 has root z = 1/φ, and |1/φ| > 1 whenever |φ| < 1, so the root lies outside the unit circle.

Given stationarity, the AR(1) can be decomposed as a linear combination of white noise processes:
x_t = α / (1 - φ) + Σ_{i=0}^{∞} φ^i ε_{t-i}

This can be easily proved by starting with x_2 = α + φ x_1 + ε_2 and iterating forward.

A simple autoregressive model: autocorrelation

[Figure: simulated AR(1) paths (T = 1000) with φ = 0.9 (persistent, wide swings) and φ = 0.1 (close to white noise).]

A simple autoregressive model: autocorrelation

The autocovariance function can be easily derived as
γ_l = φ γ_1 + σ²  if l = 0
γ_l = φ γ_{l-1}   if l > 0

Using the standard definition of correlation, the autocorrelation function is
ρ_l = γ_l / γ_0 = φ γ_{l-1} / γ_0 = φ ρ_{l-1} for l > 0
and since ρ_0 = 1 we have ρ_l = φ^l.

The autocorrelation, i.e. the persistence, of the simple AR(1) model depends on the autoregressive coefficient φ.
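The geometric decay ρ_l = φ^l can be checked against the sample ACF of a simulated AR(1) (a sketch; φ = 0.8 and the sample size are illustrative):

```python
import numpy as np

# Simulate x_t = phi*x_{t-1} + eps_t and compare the sample ACF with
# the theoretical rho_l = phi**l at the first few lags.
rng = np.random.default_rng(2)
phi, T = 0.8, 200_000
eps = rng.normal(size=T)
x = np.empty(T)
x[0] = eps[0]
for t in range(1, T):
    x[t] = phi * x[t - 1] + eps[t]

xc = x - x.mean()
def sample_acf(lag):
    return np.dot(xc[lag:], xc[:-lag]) / np.dot(xc, xc)

for l in (1, 2, 3):
    print(l, round(sample_acf(l), 2), round(phi**l, 2))
```

The two columns agree lag by lag, confirming the geometric decay of the AR(1) ACF.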

A simple autoregressive model: autocorrelation

[Figure: theoretical ACF ρ_l = φ^l for φ = 0.8 (geometric decay) and φ = -0.8 (oscillating decay), lags 0-20.]

The general AR(p) model

The AR(1) model can be generalized to p lags as
x_t = α + Σ_{i=1}^{p} φ_i x_{t-i} + ε_t

The unconditional moments are defined as
E(x_t) = α / (1 - Σ_{i=1}^{p} φ_i)
Var(x_t) = γ_0 = Σ_{i=1}^{p} φ_i γ_i + σ²
Cov(x_t, x_{t-j}) = γ_j = φ_1 γ_{j-1} + φ_2 γ_{j-2} + ... + φ_p γ_{j-p}

Dividing the autocovariances by γ_0 we get the ACF (the Yule-Walker equations):
ρ_j = Σ_{i=1}^{p} φ_i ρ_{j-i}
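The Yule-Walker recursion can be illustrated on a simulated AR(2) (a sketch; the coefficients are illustrative, not from the slides):

```python
import numpy as np

# Simulate an AR(2) x_t = phi1*x_{t-1} + phi2*x_{t-2} + eps_t and verify
# the Yule-Walker recursion rho_3 = phi1*rho_2 + phi2*rho_1 in the sample.
rng = np.random.default_rng(7)
phi1, phi2, T = 0.5, 0.3, 300_000
eps = rng.normal(size=T)
x = np.zeros(T)
for t in range(2, T):
    x[t] = phi1 * x[t - 1] + phi2 * x[t - 2] + eps[t]

xc = x - x.mean()
def rho(l):
    return np.dot(xc[l:], xc[:-l]) / np.dot(xc, xc)

lhs = rho(3)                      # left-hand side of the recursion
rhs = phi1 * rho(2) + phi2 * rho(1)
print(round(lhs, 3), round(rhs, 3))
```

The two sides coincide up to sampling error, as the recursion predicts.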

A simple moving average model: formulation and distributional properties

A simple MA(1) model is defined as
x_t = α + ε_t + θ ε_{t-1}, with ε_t ~ iid(0, σ²)

The conditional moments can be easily defined as
E_{t-1}[x_t] = α + θ ε_{t-1}
Var_{t-1}[x_t] = σ²

The unconditional moments can be defined as
E(x_t) = α
Var(x_t) = γ_0 = σ² + θ² σ² = (1 + θ²) σ²

A simple moving average model: autocorrelation

[Figure: simulated paths (T = 1000) of an MA(1) with θ = 0.1 and of an MA(12) with θ_i = 0.9 for i = 1, ..., 12.]

A simple moving average model: autocorrelation

The autocovariance function can be defined as
γ_1 = E[(x_t - α)(x_{t-1} - α)] = E[(ε_t + θ ε_{t-1})(ε_{t-1} + θ ε_{t-2})] = θ σ²
with γ_l = 0 for l > 1, so that the autocorrelation can be defined as
ρ_1 = γ_1 / γ_0 = θ σ² / ((1 + θ²) σ²) = θ / (1 + θ²)

The autocorrelation depends on the moving average parameter θ, and it cuts off after lag 1.
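Both γ_0 = (1 + θ²)σ² and ρ_1 = θ/(1 + θ²), with zero autocorrelation beyond lag 1, can be checked by simulation (a sketch; θ = 0.9, σ = 1 are illustrative choices and α is set to zero for simplicity):

```python
import numpy as np

# Simulate the MA(1) x_t = eps_t + theta*eps_{t-1} and check
# gamma0 = (1+theta^2)*sigma^2, rho_1 = theta/(1+theta^2), rho_2 = 0.
rng = np.random.default_rng(3)
theta, sigma, T = 0.9, 1.0, 200_000
eps = rng.normal(0.0, sigma, T + 1)
x = eps[1:] + theta * eps[:-1]

xc = x - x.mean()
gamma0 = np.dot(xc, xc) / T
rho1 = np.dot(xc[1:], xc[:-1]) / np.dot(xc, xc)
rho2 = np.dot(xc[2:], xc[:-2]) / np.dot(xc, xc)

print(round(gamma0, 2), round(rho1, 2), round(rho2, 2))
```

With θ = 0.9 the theory gives γ_0 = 1.81 and ρ_1 ≈ 0.497, and the sample estimates match; ρ_2 is indistinguishable from zero.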

A simple moving average model: autocorrelation

[Figure: theoretical ACFs of MA(1) processes with θ = 0.1 and θ = 0.9, lags 0-20; in both cases only the lag-1 autocorrelation is nonzero.]

The ARMA(1,1) model: properties

We can combine the AR(1) and the MA(1) as
x_t = c_1 x_{t-1} + ε_t + a_1 ε_{t-1}
(1 - c_1 L) x_t = (1 + a_1 L) ε_t
x_t = (1 + a_1 L)/(1 - c_1 L) ε_t = (1 + a_1 L)(1 + c_1 L + c_1² L² + ...) ε_t
    = [1 + (a_1 + c_1) L + c_1 (a_1 + c_1) L² + c_1² (a_1 + c_1) L³ + ...] ε_t

Now the unconditional moments can be derived under the assumption of weak stationarity:
Var(x_t) = [1 + (a_1 + c_1)² + c_1² (a_1 + c_1)² + ...] σ_ε²
         = [1 + (a_1 + c_1)² / (1 - c_1²)] σ_ε²

We can clearly see that Var(x_t) → ∞ as c_1 → 1.
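The closed-form variance can be verified against a simulated ARMA(1,1) (a sketch; c_1 = 0.7, a_1 = 0.3, σ = 1 are illustrative choices):

```python
import numpy as np

# Simulate the ARMA(1,1) x_t = c1*x_{t-1} + eps_t + a1*eps_{t-1} and
# compare the sample variance with [1 + (a1+c1)^2/(1-c1^2)] * sigma^2.
rng = np.random.default_rng(4)
c1, a1, sigma = 0.7, 0.3, 1.0          # illustrative parameter values
T, burn = 300_000, 1_000
eps = rng.normal(0.0, sigma, T + burn)
x = np.zeros(T + burn)
for t in range(1, T + burn):
    x[t] = c1 * x[t - 1] + eps[t] + a1 * eps[t - 1]
x = x[burn:]                           # discard burn-in draws

var_theory = (1.0 + (a1 + c1)**2 / (1.0 - c1**2)) * sigma**2
print(round(x.var(), 2), round(var_theory, 2))
```

For these values the formula gives Var(x_t) ≈ 2.96, and the sample variance agrees to within sampling error.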

The general ARMA(p,q) model

The ARMA(1,1) model can be generalized to order p for the autoregressive structure and order q for the moving average part:
x_t = ρ_1 x_{t-1} + ρ_2 x_{t-2} + ... + ρ_p x_{t-p} + ε_t + θ_1 ε_{t-1} + θ_2 ε_{t-2} + ... + θ_q ε_{t-q}
(1 - ρ_1 L - ρ_2 L² - ... - ρ_p L^p) x_t = (1 + θ_1 L + θ_2 L² + ... + θ_q L^q) ε_t
or, compactly,
ρ(L) x_t = θ(L) ε_t, so that x_t = θ(L)/ρ(L) ε_t = Φ(L) ε_t

Estimating the ARMA model: the Box-Jenkins approach

Step 1: Make sure that the time series is stationary (e.g. with an Augmented Dickey-Fuller test). If it is not stationary, take first-order differences.
Step 2: Model selection (information criteria).
Step 3: Model checking (residual tests).

Example, Step 1: check for stationarity

Test for stationarity of the variable x_t (here, US stock market returns) using the Augmented Dickey-Fuller regression
x_t = c + δ̂ x_{t-1} + Σ_{i=1}^{k} φ̂_i Δx_{t-i} + ε̂_t

                         t-statistic   Prob.
ADF test statistic        -19.068      0.0000
Test critical values:
  1% level                 -3.445
  5% level                 -2.868
  10% level                -2.570

Notice that the t-statistic is defined as t = (δ̂ - 1) / SE(δ̂), where the null hypothesis is H_0: δ = 1 (a unit root). Since -19.068 is far below all the critical values, the null is strongly rejected: the series is stationary.
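The mechanics of the Dickey-Fuller t-statistic can be sketched with a plain OLS regression on simulated stationary data (this is only the k = 0 case with illustrative simulated data, not the slide's US returns; a real application should use a full ADF implementation with lag selection, e.g. statsmodels' adfuller):

```python
import numpy as np

# Dickey-Fuller regression in levels: x_t = c + delta*x_{t-1} + e_t,
# with t = (delta_hat - 1) / SE(delta_hat) and H0: delta = 1.
# Data are simulated from a stationary AR(1), so H0 should be rejected.
rng = np.random.default_rng(5)
T, phi = 1_000, 0.5
eps = rng.normal(size=T)
x = np.empty(T)
x[0] = eps[0]
for t in range(1, T):
    x[t] = phi * x[t - 1] + eps[t]

y = x[1:]
X = np.column_stack([np.ones(T - 1), x[:-1]])   # [constant, x_{t-1}]
beta, *_ = np.linalg.lstsq(X, y, rcond=None)    # OLS estimates (c, delta)
resid = y - X @ beta
s2 = resid @ resid / (len(y) - 2)               # residual variance
se_delta = np.sqrt(s2 * np.linalg.inv(X.T @ X)[1, 1])
t_stat = (beta[1] - 1.0) / se_delta
print(round(t_stat, 1))
```

The statistic comes out far below the 1% critical value of -3.445, so the unit-root null is rejected, mirroring the slide's conclusion for the returns data.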

Example, Step 2: model selection

The ACF and the PACF might be misleading; the lag structure of the ARMA(p,q) can be investigated using information criteria.

AIC: Akaike's information criterion, AIC = -2 log(L) + 2(p + q)
SBC: the Schwarz Bayesian information criterion, SBC = -2 log(L) + log(T)(p + q)
where L is the value of the maximized likelihood and T is the number of observations.

Model       AIC     SBC
ARMA(1,1)  -3.38   -3.35
ARMA(1,2)  -3.35   -3.33
ARMA(1,3)  -3.29   -3.31
ARMA(2,1)  -3.37   -3.33
ARMA(3,1)  -3.37   -3.33

The ARMA(1,1) attains the lowest AIC and SBC and is therefore selected.
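As a sketch of how the criteria trade fit against parsimony, one can compute AIC and SBC for AR(p) fits estimated by OLS on simulated data (a rough stand-in: the slide's candidates are full ARMA(p,q) models, which require maximum likelihood estimation, and the data here are simulated from an AR(2) rather than taken from the slides):

```python
import numpy as np

# Fit AR(p) models by OLS on data simulated from an AR(2), compute the
# Gaussian log-likelihood, and compare AIC = -2logL + 2p and
# SBC = -2logL + log(n)*p (the constant is left out of the penalty count,
# as on the slide; all models are fit on a common effective sample).
rng = np.random.default_rng(6)
T, pmax = 2_000, 4
eps = rng.normal(size=T)
x = np.zeros(T)
for t in range(2, T):
    x[t] = 0.5 * x[t - 1] - 0.3 * x[t - 2] + eps[t]

def info_criteria(x, p):
    y = x[pmax:]
    n = len(y)
    X = np.column_stack([np.ones(n)] +
                        [x[pmax - i:len(x) - i] for i in range(1, p + 1)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / n
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    return -2 * loglik + 2 * p, -2 * loglik + np.log(n) * p

for p in (1, 2, 3, 4):
    aic, sbc = info_criteria(x, p)
    print(p, round(aic, 1), round(sbc, 1))
```

The underfit AR(1) is clearly penalized relative to the true order p = 2, while going beyond p = 2 buys little improvement in the log-likelihood.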

Example, Step 3: model estimates and checking

Variable    Coefficient   Std. Error   t-statistic   Prob.
C             0.008         0.002        3.57        0.0004
AR(1)        -0.649         0.214       -3.029       0.0026
MA(1)         0.737         0.190        3.876       0.0001

Adjusted R-squared: 0.014
Log likelihood: 729.80
Akaike info criterion: -3.381
Schwarz criterion: -3.352
Hannan-Quinn criterion: -3.369

Example, Step 3: model estimates and checking

[Figure: histogram of the estimated residuals.]

Example, Step 3: model estimates and checking

[Figure: sample autocorrelation function of the residuals, lags 0-20.]