Lecture 3: Autoregressive Moving Average (ARMA) Models and their Practical Applications


Prof. Massimo Guidolin, 20192 Financial Econometrics, Winter/Spring 2018

Overview
o Moving average processes
o Autoregressive processes: moments and the Yule-Walker equations
o Wold's decomposition theorem
o Moments, ACFs and PACFs of AR and MA processes

Moving Average Process
MA(q) models are always stationary because they are finite linear combinations of white noise processes.
o Therefore an MA(q) process has a constant mean and variance, and autocovariances that differ from zero up to lag q but are zero afterwards

Moving Average Process: Examples
o Simulations are based on

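The MA(q) properties above can be illustrated with a minimal simulation sketch (assuming NumPy; the coefficients θ = (0.8, 0.4) are illustrative, not the ones used in the lecture's simulations):

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_ma(theta, n, sigma=1.0):
    """Simulate y_t = eps_t + theta_1*eps_{t-1} + ... + theta_q*eps_{t-q}."""
    q = len(theta)
    eps = rng.normal(0.0, sigma, n + q)
    coeffs = np.r_[1.0, theta]                      # (1, theta_1, ..., theta_q)
    # Each y_t is a finite linear combination of the current and q past shocks
    return np.convolve(eps, coeffs, mode="full")[q:n + q]

def sample_acf(y, max_lag):
    """Sample autocorrelations rho_hat(h), h = 1..max_lag, of the centered series."""
    y = y - y.mean()
    denom = np.sum(y ** 2)
    return np.array([np.sum(y[h:] * y[:-h]) / denom for h in range(1, max_lag + 1)])

y = simulate_ma([0.8, 0.4], n=20000)                # an MA(2) example
acf = sample_acf(y, 5)
# acf[0] and acf[1] are clearly non-zero; acf[2:] (lags > q = 2) are near zero
```

The near-zero sample autocorrelations beyond lag q are exactly the cutoff pattern that identifies an MA(q) in practice.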

Autoregressive Process
An autoregressive (henceforth AR) process of order p is a process in which the series $y_t$ is a weighted sum of p past values of the series $(y_{t-1}, y_{t-2}, \ldots, y_{t-p})$ plus a white noise error term $\varepsilon_t$:
$y_t = \phi_0 + \phi_1 y_{t-1} + \phi_2 y_{t-2} + \cdots + \phi_p y_{t-p} + \varepsilon_t$
o AR(p) models are simple univariate devices to capture the observed Markovian nature of financial and macroeconomic data, i.e., the fact that the series tends to be influenced by at most a finite number of its own past values, which is often also described as the series having a finite memory
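An AR(p) can be simulated directly from its recursion; a sketch assuming NumPy (the coefficients φ = (0.5, 0.3) and constant φ0 = 1 are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_ar(phi, n, phi0=0.0, sigma=1.0, burn=500):
    """Simulate y_t = phi0 + phi_1*y_{t-1} + ... + phi_p*y_{t-p} + eps_t."""
    p = len(phi)
    phi = np.asarray(phi)
    eps = rng.normal(0.0, sigma, n + burn)
    y = np.zeros(n + burn)
    for t in range(p, n + burn):
        # Weighted sum of the p most recent values plus a white noise shock
        y[t] = phi0 + phi @ y[t - p:t][::-1] + eps[t]
    return y[burn:]                     # drop the burn-in so the start-up zeros wash out

y = simulate_ar([0.5, 0.3], n=10000, phi0=1.0)
# The sample mean should be close to phi0 / (1 - phi1 - phi2) = 1 / 0.2 = 5
```

The burn-in discard reflects the finite-memory point above: after enough periods, the arbitrary initial values no longer influence the series.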

The Lag and Difference Operators
The lag operator, generally denoted by L, shifts the time index of a variable regularly sampled over time backward by one unit.
o Therefore, applying the lag operator to a generic variable $y_t$, we obtain the value of the variable at time t-1, i.e., $Ly_t = y_{t-1}$
o Equivalently, applying $L^k$ means lagging the variable $k \geq 1$ times, i.e., $L^k y_t = L^{k-1}(Ly_t) = L^{k-1}y_{t-1} = L^{k-2}(Ly_{t-1}) = \cdots = y_{t-k}$
The difference operator, Δ, is used to express the difference between consecutive realizations of a time series, $\Delta y_t = y_t - y_{t-1}$
o With Δ we denote the first difference; with Δ² we denote the second-order difference, i.e., $\Delta^2 y_t = \Delta(\Delta y_t) = \Delta(y_t - y_{t-1}) = \Delta y_t - \Delta y_{t-1} = (y_t - y_{t-1}) - (y_{t-1} - y_{t-2}) = y_t - 2y_{t-1} + y_{t-2}$, and so on
o Note that $\Delta^2 y_t \neq y_t - y_{t-2}$
o $\Delta y_t$ can also be rewritten using the lag operator, i.e., $\Delta y_t = (1 - L)y_t$
o More generally, we can write a difference of any order k as $\Delta^k y_t = (1 - L)^k y_t$, $k \geq 1$
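The operator identities above can be checked numerically; a small sketch assuming NumPy:

```python
import numpy as np

y = np.array([3.0, 5.0, 4.0, 8.0, 7.0])

# First difference: Delta y_t = (1 - L) y_t = y_t - y_{t-1}
d1 = y[1:] - y[:-1]                   # equivalent to np.diff(y)

# Second-order difference: Delta^2 y_t = y_t - 2*y_{t-1} + y_{t-2}
d2 = d1[1:] - d1[:-1]                 # equivalent to np.diff(y, n=2)

# Note: Delta^2 y_t is NOT the two-period change y_t - y_{t-2}
two_period = y[2:] - y[:-2]
```

Differencing the first differences reproduces the $y_t - 2y_{t-1} + y_{t-2}$ expansion exactly, while the two-period change gives different values.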

Stability and Stationarity of AR(p) Processes
Because an AR(p) is a stochastic difference equation, it can be rewritten as $y_t = \phi_0 + \phi_1 y_{t-1} + \cdots + \phi_p y_{t-p} + \varepsilon_t$ or, more compactly, as $\phi(L)y_t = \phi_0 + \varepsilon_t$, where $\phi(L) = 1 - \phi_1 L - \phi_2 L^2 - \cdots - \phi_p L^p$ is a polynomial of order p
Replacing the lag operator in the polynomial $\phi(L)$ by a variable $\lambda$ and setting the result equal to zero, i.e., $\phi(\lambda) = 0$, we obtain the characteristic equation associated with the difference equation $\phi(L)y_t = \phi_0 + \varepsilon_t$
o A value of $\lambda$ which satisfies the polynomial equation is called a root
o A polynomial of degree p has p roots, which are often complex numbers
If the absolute values of all the roots of the characteristic equation are greater than one, the process is said to be stable
A stable process is always weakly stationary
o Even though stability and stationarity are conceptually different, stability conditions are commonly referred to as stationarity conditions
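The stability check can be sketched numerically (assuming NumPy; `is_stable` is an illustrative helper, not from the lecture): all roots of $\phi(\lambda) = 0$ must lie outside the unit circle.

```python
import numpy as np

def is_stable(phi):
    """True if all roots of phi(lambda) = 1 - phi_1*lambda - ... - phi_p*lambda^p
    lie strictly outside the unit circle."""
    # np.roots expects coefficients ordered from the highest power down
    coeffs = np.r_[-np.asarray(phi)[::-1], 1.0]
    roots = np.roots(coeffs)
    return bool(np.all(np.abs(roots) > 1.0))

is_stable([0.5, 0.3])   # stable AR(2): roots are about 1.17 and -2.84
is_stable([1.1])        # explosive AR(1): the root 1/1.1 lies inside the unit circle
```

For an AR(1), the single root is $1/\phi_1$, so stability reduces to the familiar condition $|\phi_1| < 1$.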

Wold's Decomposition Theorem
An autoregressive process of order p with no constant and no other predetermined, fixed terms can be expressed as an infinite-order moving average process, MA(∞), $y_t = \sum_{j=0}^{\infty} \psi_j \varepsilon_{t-j}$, and it is therefore linear
If the process is stationary, the sum $\sum_{j=0}^{\infty} \psi_j \varepsilon_{t-j}$ will converge
The (unconditional) mean of an AR(p) model is $E[y_t] = \phi_0 / (1 - \phi_1 - \phi_2 - \cdots - \phi_p)$
o The necessary and sufficient condition for the mean of an AR(p) process to exist and be finite is that the sum of the AR coefficients is less than one in absolute value, $|\phi_1 + \phi_2 + \cdots + \phi_p| < 1$
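The MA(∞) weights of a stationary AR(p) can be built recursively from the AR coefficients; a minimal sketch assuming NumPy ($\psi_0 = 1$, $\psi_j = \sum_k \phi_k \psi_{j-k}$; the coefficients are illustrative):

```python
import numpy as np

def ma_inf_weights(phi, n_weights):
    """psi weights of the MA(infinity) representation y_t = sum_j psi_j * eps_{t-j},
    from psi_0 = 1 and psi_j = sum_{k=1}^{min(j,p)} phi_k * psi_{j-k}."""
    p = len(phi)
    psi = np.zeros(n_weights)
    psi[0] = 1.0
    for j in range(1, n_weights):
        for k in range(1, min(j, p) + 1):
            psi[j] += phi[k - 1] * psi[j - k]
    return psi

psi = ma_inf_weights([0.5, 0.3], 50)
# For a stationary AR(p) the weights decay geometrically, so the sum converges;
# with a constant phi0 = 1, the mean is phi0 / (1 - 0.5 - 0.3) = 5
```

For an AR(1) the recursion gives $\psi_j = \phi_1^j$, the familiar geometric weights.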

Moments and ACFs of an AR(p) Process
The (unconditional) variance of an AR(p) process can be computed from the autocovariances as $\gamma_0 = \phi_1\gamma_1 + \phi_2\gamma_2 + \cdots + \phi_p\gamma_p + \sigma^2$
o For AR(p) models the characteristic polynomials are rather convoluted, and it is infeasible to define simple restrictions on the AR coefficients that ensure covariance stationarity
The autocovariance and autocorrelation functions of AR(p) processes can be computed by solving a set of simultaneous equations known as the Yule-Walker equations
o It is a system of equations that we solve recursively to determine the ACF of the process, i.e., $\rho_h$ for h = 1, 2, ...
o See the example concerning the AR(2) process given in the lectures and/or in the lecture notes
For a stationary AR(p) model, the ACF will decay geometrically to zero
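The Yule-Walker system can be solved numerically; a sketch assuming NumPy (`ar_acf` is an illustrative helper): the first p equations pin down $\rho_1, \ldots, \rho_p$, and higher lags follow the same recursion.

```python
import numpy as np

def ar_acf(phi, max_lag):
    """ACF of a stationary AR(p) via the Yule-Walker equations
    rho_h = phi_1*rho_{h-1} + ... + phi_p*rho_{h-p}, with rho_0 = 1 and rho_{-h} = rho_h."""
    p = len(phi)
    A = np.eye(p)
    b = np.zeros(p)
    for h in range(1, p + 1):
        for k in range(1, p + 1):
            m = abs(h - k)
            if m == 0:
                b[h - 1] += phi[k - 1]       # the rho_0 = 1 term moves to the right-hand side
            else:
                A[h - 1, m - 1] -= phi[k - 1]
    rho = list(np.linalg.solve(A, b))        # rho_1, ..., rho_p
    for h in range(p + 1, max_lag + 1):      # higher lags: same recursion, geometric decay
        rho.append(sum(phi[k - 1] * rho[h - k - 1] for k in range(1, p + 1)))
    return np.array(rho)

rho = ar_acf([0.5, 0.3], 20)
# For this AR(2): rho_1 = phi_1 / (1 - phi_2) = 5/7, and rho_2 = phi_1*rho_1 + phi_2
```

The closed form $\rho_1 = \phi_1 / (1 - \phi_2)$ for the AR(2) comes from the first Yule-Walker equation, $\rho_1 = \phi_1 + \phi_2 \rho_1$.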

ACF and PACF of AR(p) Process
The SACF and SPACF are of primary importance to identify the lag order p of a process

ACF and PACF of AR(p) and MA(q) Processes
An AR(p) process is described by an ACF that may slowly tail off at infinity and a PACF that is zero for lags larger than p
Conversely, the ACF of an MA(q) process cuts off after lag q, while the PACF of the process may slowly tail off at infinity
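This identification pattern can be verified numerically; a sketch assuming NumPy, with the PACF computed from the ACF via the Durbin-Levinson recursion (`pacf_from_acf` is an illustrative helper):

```python
import numpy as np

def pacf_from_acf(rho):
    """Partial autocorrelations phi_{k,k} from rho_1, rho_2, ... (Durbin-Levinson)."""
    rho = np.asarray(rho)
    pacf = np.zeros(len(rho))
    pacf[0] = rho[0]
    phi = np.array([rho[0]])                 # phi_{k-1,1}, ..., phi_{k-1,k-1}
    for k in range(2, len(rho) + 1):
        num = rho[k - 1] - phi @ rho[k - 2::-1]
        den = 1.0 - phi @ rho[:k - 1]
        a = num / den                        # phi_{k,k}, the lag-k partial autocorrelation
        pacf[k - 1] = a
        phi = np.r_[phi - a * phi[::-1], a]
    return pacf

ar_rho = 0.5 ** np.arange(1, 7)                    # ACF of an AR(1), phi = 0.5: tails off
ma_rho = np.r_[0.8 / (1 + 0.8 ** 2), np.zeros(5)]  # ACF of an MA(1), theta = 0.8: cuts off at lag 1

ar_pacf = pacf_from_acf(ar_rho)                    # PACF cuts off after lag 1
ma_pacf = pacf_from_acf(ma_rho)                    # PACF tails off instead
```

The mirror-image behavior is exactly what the slide describes: the AR(1) PACF is zero beyond lag p = 1, while the MA(1) PACF keeps non-zero values beyond lag q = 1.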