Time Series Analysis Fall 2008

MIT OpenCourseWare http://ocw.mit.edu 14.384 Time Series Analysis Fall 2008 For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.

14.384 Time Series Analysis, Fall 2007
Professor Anna Mikusheva
Paul Schrimpf, scribe
September 6, 2007

Lecture 1. Stationarity, Lag Operator, ARMA, and Covariance Structure

Introduction

History. Time series was popular in the early '90s and is making a comeback now; the current comeback is largely due to macro applications. One can roughly divide time series into macro-related and finance-related work. The macro work mostly focuses on means, the finance work on higher moments. Macro work is limited by the short horizon of available data.

Outline

The course can be divided into
1. Classics
2. DSGE (simulated GMM, ML, Bayesian)

Goals:
                 stationary    non-stationary
  Univariate     ARMA          unit root
  Multivariate   VARMA         cointegration

Most of you are probably interested in empirical research, so we'll give you the tools needed to do this. However, we'll also cover theory and highlight open questions.

Problem Sets

Problem sets will have an empirical part that requires programming. Use whatever language you prefer; we recommend Matlab and discourage Stata. You need not write your programs from scratch: you can freely download programs from the web, but make sure you use them correctly and cite them. Working in groups is encouraged, but you should write your own solutions.

ARMA Processes

Stationarity

We need what we have observed to be stable, in some sense, so that we can make statements about the future.

Definition 1 (white noise). A process $\{e_t\}$ such that $Ee_t = 0$, $Ee_te_s = 0$ for $t \neq s$, and $Ee_t^2 = \sigma^2$.

Definition 2 (strict stationarity). A process $\{y_t\}$ is strictly stationary if, for each $k$, the distribution of $(y_t, \dots, y_{t+k})$ is the same for all $t$.
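As a quick sanity check on Definition 1, here is a minimal sketch (not from the notes; the course recommends Matlab, but Python with numpy is used here, and $\sigma = 1.5$, the seed, and the sample size are arbitrary illustrative choices). It simulates iid Gaussian white noise and compares the three defining moments with their sample analogues:

    # Simulate iid Gaussian white noise and check E e_t = 0,
    # E e_t e_s = 0 (t != s), and E e_t^2 = sigma^2 via sample moments.
    import numpy as np

    rng = np.random.default_rng(0)       # fixed seed, arbitrary choice
    sigma, T = 1.5, 100_000
    e = rng.normal(0.0, sigma, size=T)   # e_t ~ iid N(0, sigma^2)

    print("sample mean:     ", e.mean())                  # ~ 0
    print("sample variance: ", e.var())                   # ~ sigma^2 = 2.25
    print("lag-1 sample cov:", np.mean(e[1:] * e[:-1]))   # ~ 0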

Definition 3 (2nd order stationarity). $\{y_t\}$ is 2nd order stationary if $Ey_t$, $Ey_t^2$, and $\mathrm{cov}(y_t, y_{t+k})$ do not depend on $t$.

Examples of non-stationary processes

Example 4 (Break).
$y_t = \begin{cases} \beta + e_t & t \leq k \\ \beta + \lambda + e_t & t > k \end{cases}$

Example 5 (Random walk, also known as a unit root process). $y_t = y_{t-1} + e_t$.

Definition 6 (Lag operator). Denoted $L$: $Ly_t = y_{t-1}$. The lag operator can be raised to powers, e.g. $L^2 y_t = y_{t-2}$. We can also form polynomials of it:
$a(L) = a_0 + a_1 L + a_2 L^2 + \dots + a_p L^p$
$a(L) y_t = a_0 y_t + a_1 y_{t-1} + a_2 y_{t-2} + \dots + a_p y_{t-p}$
Lag polynomials can be multiplied. Multiplication is commutative: $a(L)b(L) = b(L)a(L)$.

Inversion

Lag polynomials can also be inverted. Example: $(1 - \rho L)^{-1} = \sum_{i=0}^{\infty} \rho^i L^i$, since
$(1 - \rho L) \sum_{i=0}^{\infty} \rho^i L^i = \sum_{i=0}^{\infty} \rho^i L^i - \sum_{i=1}^{\infty} \rho^i L^i = \rho^0 L^0 = 1$
Of course, this only makes sense if $|\rho| < 1$, because then, if $x_t$ is weakly stationary,
$\left( \sum_{i=0}^{J} \rho^i L^i \right) x_t \xrightarrow{L_2} (1 - \rho L)^{-1} x_t$ as $J \to \infty$.
For higher order polynomials, we can invert them by factoring, using the formula for $(1 - \rho L)^{-1}$, and then rearranging, for example:
$1 - a_1 L - a_2 L^2 = (1 - \lambda_1 L)(1 - \lambda_2 L), \quad |\lambda_i| < 1$
$(1 - a_1 L - a_2 L^2)^{-1} = (1 - \lambda_1 L)^{-1}(1 - \lambda_2 L)^{-1} = \left( \sum_i \lambda_1^i L^i \right)\left( \sum_i \lambda_2^i L^i \right) = \sum_{j=0}^{\infty} L^j \left( \sum_{k=0}^{j} \lambda_1^k \lambda_2^{j-k} \right)$
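To make the factoring argument concrete, here is a small numerical check (a sketch under assumed values: $a_1 = 0.5$, $a_2 = 0.24$ are my choices, picked so that $|\lambda_i| < 1$). It computes the coefficients of $(1 - a_1 L - a_2 L^2)^{-1}$ from the recursion implied by multiplying out, and compares them with the double-sum formula above:

    # Invert 1 - a1*L - a2*L^2 two ways and compare:
    # (i)  the recursion c_j = a1*c_{j-1} + a2*c_{j-2}, which follows from
    #      matching coefficients in (1 - a1*L - a2*L^2) * c(L) = 1;
    # (ii) the factored formula sum_{k=0}^j lambda1^k * lambda2^(j-k).
    import numpy as np

    a1, a2 = 0.5, 0.24                       # assumed example coefficients
    # since (1-l1*z)(1-l2*z) = 1 - a1*z - a2*z^2, the lambdas solve
    # z^2 - a1*z - a2 = 0 (sum = a1, product = -a2)
    lam1, lam2 = np.roots([1.0, -a1, -a2])
    assert abs(lam1) < 1 and abs(lam2) < 1   # invertibility condition

    J = 12
    c = np.zeros(J)
    c[0], c[1] = 1.0, a1
    for j in range(2, J):
        c[j] = a1 * c[j-1] + a2 * c[j-2]

    closed = [sum(lam1**k * lam2**(j-k) for k in range(j+1)) for j in range(J)]
    print(np.allclose(c, closed))            # True

The lag-polynomial roots themselves are $1/\lambda_1$ and $1/\lambda_2$, consistent with the note below that inversion requires the roots to lie outside the unit circle.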

Or, perhaps more easily, we can do a partial fraction decomposition:
$\frac{1}{(1 - \lambda_1 x)(1 - \lambda_2 x)} = \frac{a}{1 - \lambda_1 x} + \frac{b}{1 - \lambda_2 x}, \qquad a = \frac{\lambda_1}{\lambda_1 - \lambda_2}, \quad b = \frac{\lambda_2}{\lambda_2 - \lambda_1}$
so that
$a^{-1}(L) = a \sum_i \lambda_1^i L^i + b \sum_i \lambda_2^i L^i$
This trick only works when the $\lambda_i$ are distinct; the formula is slightly different otherwise. Note: the $\lambda_i$ are the inverses of the roots of the lag polynomial. To invert a polynomial, we needed $|\lambda_i| < 1$, i.e., the roots of the polynomial must lie outside the unit circle.

Simple Processes

Autoregressive (AR):
AR(1): $y_t = \rho y_{t-1} + e_t$, $|\rho| < 1$, i.e. $(1 - \rho L) y_t = e_t$
AR(p): $a(L) y_t = e_t$, where $a(L)$ is of order $p$

Moving average (MA):
MA(1): $y_t = e_t + \theta e_{t-1}$, i.e. $y_t = (1 + \theta L) e_t$
MA(q): $y_t = b(L) e_t$, where $b(L)$ is of order $q$

ARMA(p, q): $a(L) y_t = b(L) e_t$, where $a(L)$ is of order $p$, $b(L)$ is of order $q$, and $a(L)$ and $b(L)$ are relatively prime.

An ARMA representation is not unique. For example, an AR(1) (with $|\rho| < 1$) is equal to an MA($\infty$), as we saw above. In fact, this holds more generally: any AR(p) with roots outside the unit circle has an MA($\infty$) representation.

Covariances

Definition 7 (auto-covariance). $\gamma_k \equiv \mathrm{cov}(y_t, y_{t+k})$

Definition 8 (auto-correlation). $\rho_k \equiv \gamma_k / \gamma_0$

AR(1) example: $y_t = \rho y_{t-1} + e_t$.
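Definitions 7 and 8 have obvious sample analogues. Here is a minimal sketch (again Python/numpy; $\rho = 0.8$, $\sigma = 1$, the seed, and the helper name gamma_hat are all illustrative choices, not from the notes) that estimates $\gamma_k$ and $\rho_k$ from a simulated AR(1) path:

    # Estimate gamma_k = cov(y_t, y_{t+k}) and rho_k = gamma_k / gamma_0
    # from a simulated AR(1) path y_t = rho*y_{t-1} + e_t.
    import numpy as np

    rng = np.random.default_rng(1)
    rho, sigma, T = 0.8, 1.0, 100_000
    e = rng.normal(0.0, sigma, size=T)
    y = np.zeros(T)
    for t in range(1, T):
        y[t] = rho * y[t-1] + e[t]

    def gamma_hat(y, k):
        """Sample analogue of gamma_k (Definition 7)."""
        ybar = y.mean()
        return np.mean((y[:len(y)-k] - ybar) * (y[k:] - ybar))

    g0 = gamma_hat(y, 0)
    for k in range(4):
        print(k, gamma_hat(y, k), gamma_hat(y, k) / g0)  # gamma_k, rho_k

The estimates can be compared with the closed forms derived next: $\gamma_k = \rho^k \sigma^2 / (1 - \rho^2)$, so $\rho_k = \rho^k$.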

Observe that $\mathrm{Var}(y_t) = \rho^2 \mathrm{Var}(y_{t-1}) + \sigma^2$ and $\mathrm{Var}(y_t) = \mathrm{Var}(y_{t-1}) = \gamma_0$, so $\gamma_0 = \frac{\sigma^2}{1 - \rho^2}$. Also, it is easy to see by induction that $\gamma_k = \rho^k \frac{\sigma^2}{1 - \rho^2}$. Another way to see this is from the MA representation, $y_t = \sum_{i=0}^{\infty} \rho^i e_{t-i}$:
$\gamma_0 = \sum_{i=0}^{\infty} \rho^{2i} \sigma^2 = \frac{\sigma^2}{1 - \rho^2}$
$\gamma_k = \mathrm{cov}\left( \sum_{i=0}^{\infty} \rho^i e_{t-i}, \sum_{i=0}^{\infty} \rho^i e_{t+k-i} \right) = \sum_{i=k}^{\infty} \rho^i \rho^{i-k} \sigma^2 = \frac{\rho^k \sigma^2}{1 - \rho^2}$
More generally, if $y_t = \sum_{i=0}^{\infty} c_i e_{t-i}$, then
$\mathrm{cov}(y_t, y_{t+k}) = \mathrm{cov}\left( \sum_i c_i e_{t-i}, \sum_i c_i e_{t+k-i} \right) = \sigma^2 \sum_{j=0}^{\infty} c_j c_{j+k}$

MA representation and covariance stationarity

If $y_t = \sum_{i=0}^{\infty} c_i e_{t-i}$, then $y_t$ has finite variance, and in fact is covariance stationary, if $\sum_{j=0}^{\infty} c_j^2 < \infty$. It is often easier to prove things with the stronger assumption of absolute summability, $\sum_{j=0}^{\infty} |c_j| < \infty$ (or, stronger still, $\sum_{j=0}^{\infty} j|c_j| < \infty$).

Definition 9 (covariance function). $\gamma(\xi) = \sum_{i=-\infty}^{\infty} \gamma_i \xi^i$, where $\xi$ is a complex number.

Lemma 10 (Covariance function of an MA). For an MA, $y_t = c(L) e_t$, we have $\gamma(\xi) = \sigma^2 c(\xi) c(\xi^{-1})$.

Proof.
$c(\xi) c(\xi^{-1}) = \left( \sum_i c_i \xi^i \right)\left( \sum_i c_i \xi^{-i} \right) = \sum_{j,l=0}^{\infty} c_j c_l \xi^{j-l} = \sum_{k=-\infty}^{\infty} \xi^k \sum_{j=0}^{\infty} c_j c_{j+k}$
so, using the covariance formula above, $\sigma^2 c(\xi) c(\xi^{-1}) = \sum_k \gamma_k \xi^k = \gamma(\xi)$.

Lemma 11 (Covariance function of an ARMA). For an ARMA, $a(L) y_t = b(L) e_t$,
$\gamma(\xi) = \frac{\sigma^2 b(\xi) b(\xi^{-1})}{a(\xi) a(\xi^{-1})}$
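Lemma 10 is easy to verify numerically for a finite-order MA. A minimal sketch (Python/numpy; the MA(2) coefficients, $\sigma^2 = 2$, and the evaluation point $\xi = e^{0.7i}$ on the unit circle are arbitrary illustrative choices) comparing $\gamma(\xi) = \sum_k \gamma_k \xi^k$ with $\sigma^2 c(\xi) c(\xi^{-1})$:

    # Check Lemma 10 for y_t = c(L)e_t with c(L) = 1 + 0.6L - 0.3L^2:
    # gamma(xi) = sum_k gamma_k xi^k should equal sigma^2 * c(xi) * c(1/xi).
    import numpy as np

    c = np.array([1.0, 0.6, -0.3])   # assumed MA(2) coefficients c_0..c_2
    sigma2 = 2.0
    q = len(c) - 1

    # gamma_k = sigma^2 * sum_j c_j * c_{j+k}  (zero for |k| > q, symmetric in k)
    gamma = {k: sigma2 * sum(c[j] * c[j+k] for j in range(q - k + 1))
             for k in range(q + 1)}

    xi = np.exp(0.7j)                # arbitrary point on the unit circle
    lhs = sum(gamma[abs(k)] * xi**k for k in range(-q, q + 1))
    c_of = lambda z: sum(c[i] * z**i for i in range(q + 1))
    rhs = sigma2 * c_of(xi) * c_of(1 / xi)
    print(np.allclose(lhs, rhs))     # True

Evaluating $\gamma(\xi)$ at $\xi = e^{-i\omega}$ gives, up to a factor of $2\pi$, the spectral density, which is one reason the covariance generating function is useful.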