LECTURE 10. LINEAR PROCESSES II: SPECTRAL DENSITY, LAG OPERATOR, ARMA

In this lecture, we continue to discuss covariance stationary processes.


Spectral density (Gourieroux and Monfort (1990), Ch. 5; Hamilton (1994), Ch. 6)

A convenient way to represent the sequence of autocovariances $\{\gamma(j) : j = 0, 1, \ldots\}$ of a covariance stationary process is by means of the spectral density, or spectrum. The spectral density is defined as follows:

$$f(\lambda) = \frac{1}{2\pi} \sum_{j=-\infty}^{\infty} \gamma(j) e^{-i\lambda j},$$

where $i = \sqrt{-1}$. Notice that, since $\gamma(-j) = \gamma(j)$, it follows that the spectral density is real valued:

$$f(\lambda) = \frac{1}{2\pi}\Big(\gamma(0) + \sum_{j=-\infty}^{-1} \gamma(j) e^{-i\lambda j} + \sum_{j=1}^{\infty} \gamma(j) e^{-i\lambda j}\Big) = \frac{1}{2\pi}\Big(\gamma(0) + \sum_{j=1}^{\infty} \gamma(-j) e^{i\lambda j} + \sum_{j=1}^{\infty} \gamma(j) e^{-i\lambda j}\Big) = \frac{1}{2\pi}\Big(\gamma(0) + \sum_{j=1}^{\infty} \gamma(j)\big(e^{i\lambda j} + e^{-i\lambda j}\big)\Big).$$

Next, $e^{-i\lambda j} = \cos(\lambda j) - i\sin(\lambda j)$ and $e^{i\lambda j} = \cos(\lambda j) + i\sin(\lambda j)$, and, therefore, $e^{i\lambda j} + e^{-i\lambda j} = 2\cos(\lambda j)$. Hence,

$$f(\lambda) = \frac{1}{2\pi}\Big(\gamma(0) + 2\sum_{j=1}^{\infty} \gamma(j)\cos(\lambda j)\Big) = \frac{1}{2\pi}\sum_{j=-\infty}^{\infty} \gamma(j)\cos(\lambda j).$$

Since $\cos(-\lambda j) = \cos(\lambda j)$, the spectral density is symmetric around zero. Furthermore, since $\cos$ is a periodic function with period $2\pi$, the range of values of the spectral density is determined by the values of $f(\lambda)$ for $0 \le \lambda \le \pi$.
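As a quick numerical illustration, the sketch below (Python/NumPy; the MA(1) example and all parameter values are our own choices) evaluates the truncated sum defining $f(\lambda)$ and confirms that it is real valued and symmetric around zero.

```python
import numpy as np

# Minimal sketch: f(lam) = (1/2pi) sum_{|j| <= J} gamma(j) e^{-i lam j},
# for an MA(1) process X_t = e_t + theta e_{t-1} with Var(e_t) = s2,
# so that gamma(0) = (1 + theta^2) s2, gamma(1) = theta s2, all other lags zero.
theta, s2 = 0.5, 1.0
gammas = np.array([(1 + theta**2) * s2, theta * s2])   # gamma(0), gamma(1)

def spectral_density(gammas, lam):
    J = len(gammas) - 1
    js = np.arange(-J, J + 1)
    g = np.concatenate([gammas[:0:-1], gammas])        # gamma(-J), ..., gamma(J)
    f = (g * np.exp(-1j * lam * js)).sum() / (2 * np.pi)
    return f.real                                      # imaginary part vanishes: gamma is even

for lam in (0.0, 1.0, np.pi):
    print(spectral_density(gammas, lam), spectral_density(gammas, -lam))  # f(lam) = f(-lam)
```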

The autocovariance function and the spectral density are equivalent, as follows from the result below.

Theorem 1 Suppose that $\sum_{h=-\infty}^{\infty} |\gamma(h)| < \infty$. Then $\gamma(j) = \int_{-\pi}^{\pi} f(\lambda) e^{i\lambda j}\,d\lambda$.

Proof.

$$\int_{-\pi}^{\pi} f(\lambda) e^{i\lambda j}\,d\lambda = \int_{-\pi}^{\pi} \Big(\frac{1}{2\pi}\sum_{h=-\infty}^{\infty} \gamma(h) e^{-i\lambda h}\Big) e^{i\lambda j}\,d\lambda = \frac{1}{2\pi}\sum_{h=-\infty}^{\infty} \gamma(h) \int_{-\pi}^{\pi} e^{i\lambda(j-h)}\,d\lambda,$$

where summation and integration can be interchanged because $\sum_h |\gamma(h)| < \infty$. Next,

$$\int_{-\pi}^{\pi} e^{i\lambda(j-h)}\,d\lambda = 2\pi \quad \text{if } j = h.$$

For $j \ne h$,

$$\int_{-\pi}^{\pi} e^{i\lambda(j-h)}\,d\lambda = \int_{-\pi}^{\pi} \big(\cos(\lambda(j-h)) + i\sin(\lambda(j-h))\big)\,d\lambda = \frac{\sin(\lambda(j-h))}{j-h}\bigg|_{-\pi}^{\pi} - i\,\frac{\cos(\lambda(j-h))}{j-h}\bigg|_{-\pi}^{\pi} = \frac{\sin(\pi(j-h)) - \sin(-\pi(j-h))}{j-h} - i\,\frac{\cos(\pi(j-h)) - \cos(-\pi(j-h))}{j-h}.$$

However, since $\cos$ and $\sin$ are periodic with period $2\pi$, $\cos(-\pi(j-h)) = \cos(-\pi(j-h) + 2\pi(j-h)) = \cos(\pi(j-h))$, and similarly the sine terms cancel ($\sin(\pi k) = 0$ for integer $k$). Therefore,

$$\int_{-\pi}^{\pi} e^{i\lambda(j-h)}\,d\lambda = 0 \quad \text{if } j \ne h,$$

so only the $h = j$ term survives, giving $\frac{1}{2\pi}\,\gamma(j)\,2\pi = \gamma(j)$. $\square$

The result of Theorem 1 implies, in particular, that $\gamma(0) = \int_{-\pi}^{\pi} f(\lambda)\,d\lambda$. Thus, the area under the spectral density function of $X_t$ between $-\pi$ and $\pi$ gives the variance of $X_t$. The argument $\lambda$ of $f(\lambda)$ is called the frequency. Notice that if $\{X_t\}$ is covariance stationary with absolutely summable autocovariances, the long-run variance is determined by the spectral density at the zero frequency:

$$\omega_X^2 = \lim_{n \to \infty} Var\Big(n^{-1/2} \sum_{t=1}^{n} X_t\Big) = \sum_{h=-\infty}^{\infty} \gamma(h) = 2\pi f(0).$$
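The inversion formula and the relation $\omega_X^2 = 2\pi f(0)$ are easy to check numerically. Below is a minimal sketch (our own MA(1) example again; the Riemann-sum integration is a crude but adequate approximation):

```python
import numpy as np

# Check gamma(j) = int_{-pi}^{pi} f(lam) e^{i lam j} dlam for an MA(1)
# with theta = 0.5, sigma^2 = 1 (example values ours).
theta = 0.5
gamma = {-1: theta, 0: 1 + theta**2, 1: theta}      # all other lags are zero

lams = np.linspace(-np.pi, np.pi, 20001)
f = sum(g * np.exp(-1j * lams * j) for j, g in gamma.items()).real / (2 * np.pi)
dlam = lams[1] - lams[0]

for j in range(4):
    gj = (f * np.exp(1j * lams * j) * dlam).sum().real   # Riemann sum over [-pi, pi]
    print(j, round(gj, 3))                               # approx 1.25, 0.5, 0.0, 0.0

# long-run variance: 2*pi*f(0) = sum_h gamma(h) = 0.5 + 1.25 + 0.5 = 2.25
print(round(2 * np.pi * f[len(lams) // 2], 3))           # midpoint of the grid is lam = 0
```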

Next, we discuss how linear (MA) transformations of a covariance stationary process affect the spectral density and the long-run variance.

Theorem 2 Let $\{X_t\}$ be a covariance stationary process with autocovariance function $\gamma_X$ such that $\sum_j |\gamma_X(j)| < \infty$. Define $Y_t = \sum_{j=0}^{\infty} c_j X_{t-j}$, where $\sum_{j=0}^{\infty} |c_j| < \infty$. Then $\{Y_t\}$ is covariance stationary, and its spectral density is given by

$$f_Y(\lambda) = \Big|\sum_{j=0}^{\infty} c_j e^{-i\lambda j}\Big|^2 f_X(\lambda),$$

where $f_X$ is the spectral density of $\{X_t\}$.

Proof.

$$Cov(Y_t, Y_{t-h}) = Cov\Big(\sum_{j=0}^{\infty} c_j X_{t-j}, \sum_{k=0}^{\infty} c_k X_{t-h-k}\Big) = \sum_{j=0}^{\infty}\sum_{k=0}^{\infty} c_j c_k\,Cov(X_{t-j}, X_{t-h-k}) = \sum_{j=0}^{\infty}\sum_{k=0}^{\infty} c_j c_k\,\gamma_X(h + k - j).$$

Hence, $Cov(Y_t, Y_{t-h})$ is independent of $t$. Furthermore, by the same argument as in Lecture 9,

$$\sum_{h=-\infty}^{\infty}\Big|\sum_{j=0}^{\infty}\sum_{k=0}^{\infty} c_j c_k\,\gamma_X(h + k - j)\Big| \le \Big(\sum_{j=0}^{\infty} |c_j|\Big)^2 \sum_{h=-\infty}^{\infty} |\gamma_X(h)| < \infty.$$

Therefore, $\{Y_t\}$ is covariance stationary. Next,

$$f_Y(\lambda) = \frac{1}{2\pi}\sum_{h=-\infty}^{\infty} Cov(Y_t, Y_{t-h})\,e^{-i\lambda h} = \frac{1}{2\pi}\sum_{h=-\infty}^{\infty}\sum_{j=0}^{\infty}\sum_{k=0}^{\infty} c_j c_k\,\gamma_X(h + k - j)\,e^{-i\lambda h}$$
$$= \sum_{j=0}^{\infty} c_j e^{-i\lambda j}\sum_{k=0}^{\infty} c_k e^{i\lambda k}\,\frac{1}{2\pi}\sum_{h=-\infty}^{\infty} \gamma_X(h + k - j)\,e^{-i\lambda(h + k - j)} = \sum_{j=0}^{\infty} c_j e^{-i\lambda j}\;\overline{\sum_{j=0}^{\infty} c_j e^{-i\lambda j}}\;f_X(\lambda) = \Big|\sum_{j=0}^{\infty} c_j e^{-i\lambda j}\Big|^2 f_X(\lambda).$$

The last equality follows because

$$\sum_{j=0}^{\infty} c_j e^{-i\lambda j} = \sum_{j=0}^{\infty} c_j\big(\cos(\lambda j) - i\sin(\lambda j)\big) = \sum_{j=0}^{\infty} c_j\cos(\lambda j) - i\sum_{j=0}^{\infty} c_j\sin(\lambda j);$$

its complex conjugate is

$$\sum_{j=0}^{\infty} c_j\cos(\lambda j) + i\sum_{j=0}^{\infty} c_j\sin(\lambda j) = \sum_{j=0}^{\infty} c_j\big(\cos(\lambda j) + i\sin(\lambda j)\big) = \sum_{j=0}^{\infty} c_j e^{i\lambda j},$$

and hence

$$\Big|\sum_{j=0}^{\infty} c_j e^{-i\lambda j}\Big|^2 = \Big(\sum_{j=0}^{\infty} c_j\cos(\lambda j)\Big)^2 + \Big(\sum_{j=0}^{\infty} c_j\sin(\lambda j)\Big)^2 = \Big(\sum_{j=0}^{\infty} c_j e^{-i\lambda j}\Big)\Big(\sum_{j=0}^{\infty} c_j e^{i\lambda j}\Big). \;\square$$
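Theorem 2 can be verified on a concrete example. The sketch below (the filter and all parameter values are our own) computes the autocovariances of the filtered process directly from the double sum in the proof and compares the resulting spectral density with $\big|\sum_j c_j e^{-i\lambda j}\big|^2 f_X(\lambda)$:

```python
import numpy as np

# Sketch: X_t is MA(1) with theta = 0.5, and Y_t = X_t + 0.8 X_{t-1},
# i.e. the filter c = (1, 0.8). Example values ours.
theta, s2 = 0.5, 1.0
gX = {-1: theta * s2, 0: (1 + theta**2) * s2, 1: theta * s2}
gamma_X = lambda h: gX.get(h, 0.0)
c = [1.0, 0.8]

def gamma_Y(h):
    # Cov(Y_t, Y_{t-h}) = sum_j sum_k c_j c_k gamma_X(h + k - j)
    return sum(cj * ck * gamma_X(h + k - j)
               for j, cj in enumerate(c) for k, ck in enumerate(c))

def f_from_gamma(gamma, J, lam):
    return sum(gamma(j) * np.exp(-1j * lam * j) for j in range(-J, J + 1)).real / (2 * np.pi)

f_X = lambda lam: f_from_gamma(gamma_X, 1, lam)
for lam in (0.0, 1.0, 2.0):
    lhs = f_from_gamma(gamma_Y, 2, lam)                  # spectral density of Y directly
    transfer = abs(sum(cj * np.exp(-1j * lam * j) for j, cj in enumerate(c))) ** 2
    print(lam, round(lhs, 6), round(transfer * f_X(lam), 6))   # the two columns agree
```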

In the above theorem, the spectral density at the zero frequency, and, as a result, the long-run variance, is finite if $\sum_j |c_j| < \infty$. However, absolute summability, $\sum_{j=0}^{\infty} |c_j| < \infty$, is a stronger assumption than square summability, $\sum_{j=0}^{\infty} c_j^2 < \infty$, as we show next. Suppose $\sum_j |c_j| < \infty$. First, $\sum_j |c_j| < \infty$ implies that $c_j \to 0$ as $j \to \infty$. Therefore, the sequence $\{c_j\}$ is uniformly bounded. Next,

$$\sum_{j=0}^{\infty} c_j^2 \le \Big(\sup_j |c_j|\Big)\sum_{j=0}^{\infty} |c_j| < \infty.$$

Suppose that $\{X_t\}$ is covariance stationary and purely indeterministic. Then it has the MA($\infty$) representation

$$X_t = \sum_{j=0}^{\infty} a_j \varepsilon_{t-j}, \qquad (1)$$

where $\{\varepsilon_t\}$ is a WN and $\sum_{j=0}^{\infty} a_j^2 < \infty$. Let $Var(\varepsilon_t) = \sigma^2$. Since the spectrum of a WN process is flat,

$$f_\varepsilon(\lambda) = \frac{\sigma^2}{2\pi} \quad \text{for all } \lambda,$$

Theorem 2 implies that

$$f_X(\lambda) = \frac{\sigma^2}{2\pi}\Big|\sum_{j=0}^{\infty} a_j e^{-i\lambda j}\Big|^2,$$

and the long-run variance of $\{X_t\}$ is

$$\omega_X^2 = 2\pi f_X(0) = \sigma^2\Big(\sum_{j=0}^{\infty} a_j\Big)^2.$$

If we take (1) as the generating mechanism, the condition $\sum a_j^2 < \infty$ ensures that $\{X_t\}$ is covariance stationary. However, the sufficient condition for the long-run variance to be finite is $\sum |a_j| < \infty$. If the last condition fails, we can have $\sum_j |\gamma_X(j)| = \infty$. Such a process is called long memory. If $\sum a_j^2 < \infty$ holds for a long memory process, then its autocovariance function converges to zero, however, at a rate that is too slow for the long-run variance to be finite.

Let $\{Y_t\}$ be as defined in Theorem 2. Then its spectral density and long-run variance are given by

$$f_Y(\lambda) = \frac{\sigma^2}{2\pi}\Big|\sum_{j=0}^{\infty} a_j e^{-i\lambda j}\Big|^2\Big|\sum_{j=0}^{\infty} c_j e^{-i\lambda j}\Big|^2, \qquad \omega_Y^2 = \sigma^2\Big(\sum_{j=0}^{\infty} a_j\Big)^2\Big(\sum_{j=0}^{\infty} c_j\Big)^2.$$
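The contrast between square and absolute summability can be made concrete; the sketch below (weights and truncation lengths are our own illustrative choices) computes $\omega_X^2 = \sigma^2(\sum_j a_j)^2$ for exponentially decaying weights, and then shows truncations of a square-summable but not absolutely summable weight sequence, for which the long-run variance diverges:

```python
import numpy as np

# Exponential weights a_j = phi^j: sum a_j = 1/(1-phi), so the long-run
# variance is sigma^2 / (1-phi)^2 (example values ours).
phi, sigma2 = 0.6, 1.0
a = phi ** np.arange(200)                # truncated MA(infinity) weights
print(sigma2 * a.sum() ** 2, sigma2 / (1 - phi) ** 2)    # both approx 6.25

# Contrast: a_j = 1/j is square summable but not absolutely summable, so
# sum a_j^2 converges while (sum a_j)^2 grows without bound as J increases.
for J in (10**2, 10**4, 10**6):
    aj = 1.0 / np.arange(1, J + 1)
    print(J, (aj ** 2).sum(), aj.sum() ** 2)
```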

Lag operator (Gourieroux and Monfort (1990), Ch. 5; Hamilton (1994), Ch. 2)

The lag operator $L$ shifts the process $\{X_t\}$ back by one period:

$$LX_t = X_{t-1}, \quad L^2X_t = L(LX_t) = LX_{t-1} = X_{t-2}, \quad \ldots, \quad L^hX_t = X_{t-h}.$$

The lag polynomial $C(L) = \sum_{j=0}^{\infty} c_j L^j$ transforms $\{X_t\}$ into another process $\{Y_t\}$ such that

$$Y_t = C(L)X_t = \sum_{j=0}^{\infty} c_j L^j X_t = \sum_{j=0}^{\infty} c_j X_{t-j}.$$

Let $A(L) = \sum_{j=0}^{\infty} a_j L^j$ and $B(L) = \sum_{j=0}^{\infty} b_j L^j$. Then

$$A(L) + B(L) = \sum_{j=0}^{\infty} (a_j + b_j) L^j,$$
$$A(L)B(L) = \sum_{j=0}^{\infty} a_j L^j \sum_{h=0}^{\infty} b_h L^h = \sum_{j=0}^{\infty}\sum_{h=0}^{\infty} a_j b_h L^{j+h} = a_0 b_0 + (a_0 b_1 + b_0 a_1)L + (a_1 b_1 + a_0 b_2 + a_2 b_0)L^2 + \ldots$$

We have that $A(L) + B(L) = B(L) + A(L)$ and $A(L)B(L) = B(L)A(L)$.
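Since the product of two lag polynomials has the convolution of their coefficient sequences as its coefficients, it can be computed mechanically. A minimal sketch (example coefficients ours):

```python
import numpy as np

# A(L) = 1 + 0.5L and B(L) = 1 - 0.2L + 0.1L^2, coefficients ordered by power of L.
A = np.array([1.0, 0.5])
B = np.array([1.0, -0.2, 0.1])
print(np.convolve(A, B))   # coefficients of A(L)B(L): [1.0, 0.3, 0.0, 0.05]
print(np.convolve(B, A))   # multiplication commutes: same result
```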

Under certain conditions, a lag polynomial can be inverted. The inverse of a lag polynomial $C(L)$ is another lag polynomial, say $B(L)$, such that

$$C(L)B(L) = 1, \qquad (2)$$

so we can write $C(L)^{-1} = B(L)$. Inversion of lag polynomials is important for the following reason. Consider the autoregressive process of order 1 (AR(1)):

$$X_t = cX_{t-1} + \varepsilon_t. \qquad (3)$$

This process is generated recursively given some exogenous white noise process $\{\varepsilon_t\}$, a starting value $X_0$ (a random variable with variance equal to $Var(X_t)$, to be determined later), and the coefficient $c$:

$$X_1 = cX_0 + \varepsilon_1, \quad X_2 = cX_1 + \varepsilon_2,$$

and so on. Thus, the process $\{X_t\}$ is an endogenous solution to the difference equation (3). This difference equation can also be written as

$$X_t - cX_{t-1} = \varepsilon_t \quad \text{or} \quad C(L)X_t = \varepsilon_t, \quad C(L) = 1 - cL.$$

If $C(L)$ can be inverted, then the solution can be written as $X_t = C(L)^{-1}\varepsilon_t$. Thus, it is important to determine under what conditions a polynomial in the lag operator can be inverted and how to compute the coefficients of its inverse.

Consider first a polynomial of order 1. Without loss of generality, we can set the coefficient associated with $L^0$ to $c_0 = 1$ (this is due to the Wold decomposition result; alternatively, it can be viewed as a normalization, since multiplying all the coefficients by some constant $c_0 \ne 0$ affects only the variance of the process):

$$C(L) = 1 - cL.$$

Suppose that $|c| < 1$. Then, we can define the inverse of $1 - cL$ as

$$(1 - cL)^{-1} = \sum_{j=0}^{\infty} c^j L^j. \qquad (4)$$

This definition satisfies (2), since $(1 - cL)(1 + cL + c^2L^2 + \ldots) = 1$.

The definition (4) is not the only solution that satisfies (2). Adding to it the term $Vc^t$, where $V$ is some random variable, satisfies it as well, since

$$(1 - cL)Vc^t = Vc^t - VcLc^t = Vc^t - Vc\,c^{t-1} = 0.$$

However, if we take $(1 - cL)^{-1} = \sum_{j=0}^{\infty} c^jL^j + Vc^t$, then $(1 - cL)^{-1}\varepsilon_t = \sum_{j=0}^{\infty} c^j\varepsilon_{t-j} + Vc^t$, which is not a covariance stationary process. Therefore, we restrict $V = 0$.

Next, consider a lag polynomial of order 2:

$$C(L) = 1 - c_1L - c_2L^2.$$

We can factor the polynomial as

$$1 - c_1L - c_2L^2 = (1 - \lambda_1L)(1 - \lambda_2L), \qquad (5)$$

where $c_1 = \lambda_1 + \lambda_2$ and $c_2 = -\lambda_1\lambda_2$. Another way to find $\lambda_1$ and $\lambda_2$ is as follows. Let $z_1$ and $z_2$ be the solutions (possibly complex) to

$$1 - c_1z - c_2z^2 = 0. \qquad (6)$$

We can write (6) as

$$1 - c_1z - c_2z^2 = (1 - z/z_1)(1 - z/z_2),$$

and by comparing this with (5), we obtain $\lambda_1 = 1/z_1$ and $\lambda_2 = 1/z_2$.
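A small sketch of this root-based factorization (coefficient values ours; note that np.roots expects the highest power first):

```python
import numpy as np

# Factor 1 - c1*L - c2*L^2 = (1 - lam1*L)(1 - lam2*L) by solving
# 1 - c1*z - c2*z^2 = 0 and setting lam = 1/z.
c1, c2 = 0.5, 0.24
roots = np.roots([-c2, -c1, 1.0])        # roots z1, z2 of the polynomial in z
lam = 1.0 / roots
print(lam)                               # 0.8 and -0.3
print(lam.sum(), -lam.prod())            # recover c1 = lam1 + lam2, c2 = -lam1*lam2
```

Here both roots ($z_1 = 1.25$, $z_2 = -10/3$) lie outside the unit circle, so the polynomial is invertible in the sense discussed next.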

The polynomial in (5) can be inverted provided that $|\lambda_1| < 1$ and $|\lambda_2| < 1$. These conditions are equivalent to the condition that all the roots of the polynomial in (6) are outside the unit circle: $|z_1| > 1$ and $|z_2| > 1$. If this condition is satisfied, then

$$(1 - c_1L - c_2L^2)^{-1} = (1 - \lambda_1L)^{-1}(1 - \lambda_2L)^{-1} = \Big(\sum_{j=0}^{\infty} \lambda_1^jL^j\Big)\Big(\sum_{j=0}^{\infty} \lambda_2^jL^j\Big).$$

Alternatively, write

$$\frac{1}{(1 - \lambda_1L)(1 - \lambda_2L)} = \frac{1}{\lambda_1 - \lambda_2}\Big(\frac{\lambda_1}{1 - \lambda_1L} - \frac{\lambda_2}{1 - \lambda_2L}\Big).$$

Then,

$$(1 - c_1L - c_2L^2)^{-1} = \frac{1}{\lambda_1 - \lambda_2}\Big(\lambda_1\sum_{j=0}^{\infty} \lambda_1^jL^j - \lambda_2\sum_{j=0}^{\infty} \lambda_2^jL^j\Big) = \sum_{j=0}^{\infty} \frac{\lambda_1^{j+1} - \lambda_2^{j+1}}{\lambda_1 - \lambda_2}\,L^j.$$

The result can be extended to a lag polynomial of order $p$,

$$C(L) = 1 - c_1L - \ldots - c_pL^p,$$

since we can factor it as

$$1 - c_1L - \ldots - c_pL^p = \prod_{j=1}^{p} (1 - \lambda_jL),$$

where the $\lambda_j$'s satisfy

$$1 - c_1z - \ldots - c_pz^p = \prod_{j=1}^{p} (1 - \lambda_jz).$$

The polynomial can be inverted provided that the roots of $1 - c_1z - \ldots - c_pz^p$ are outside the unit circle:

$$(1 - c_1L - \ldots - c_pL^p)^{-1} = \prod_{j=1}^{p} (1 - \lambda_jL)^{-1}. \qquad (7)$$
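The coefficients of the inverse can be computed two ways and compared; a minimal sketch (our values, continuing the order-2 example above):

```python
import numpy as np

# Coefficients psi_j of (1 - c1*L - c2*L^2)^{-1}, computed by the recursion
# psi_j = c1*psi_{j-1} + c2*psi_{j-2} (psi_0 = 1, psi_1 = c1), which follows
# from matching powers of L in (1 - c1*L - c2*L^2) * sum_j psi_j L^j = 1,
# and by the closed form (lam1^{j+1} - lam2^{j+1}) / (lam1 - lam2).
c1, c2 = 0.5, 0.24
lam1, lam2 = 0.8, -0.3                   # from the factorization above

psi = [1.0, c1]
for j in range(2, 8):
    psi.append(c1 * psi[-1] + c2 * psi[-2])

closed = [(lam1**(j + 1) - lam2**(j + 1)) / (lam1 - lam2) for j in range(8)]
print(np.allclose(psi, closed))          # True
```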

ARMA

Let $\{\varepsilon_t\}$ be a WN with $Var(\varepsilon_t) = \sigma^2$. An MA($q$) process, say $\{X_t\}$, is generated as

$$X_t = \varepsilon_t + \theta_1\varepsilon_{t-1} + \ldots + \theta_q\varepsilon_{t-q} = \Theta(L)\varepsilon_t,$$

where

$$\Theta(L) = 1 + \theta_1L + \ldots + \theta_qL^q.$$

By Theorem 2, MA($q$) has the spectral density

$$f_X(\lambda) = \frac{\sigma^2}{2\pi}\big|\Theta(e^{-i\lambda})\big|^2 = \frac{\sigma^2}{2\pi}\big|1 + \theta_1e^{-i\lambda} + \ldots + \theta_qe^{-i\lambda q}\big|^2,$$

and the long-run variance

$$\omega_X^2 = 2\pi f_X(0) = \sigma^2\,\Theta(1)^2 = \sigma^2(1 + \theta_1 + \ldots + \theta_q)^2.$$

Notice that for certain values of the $\theta_j$'s, the long-run variance can be zero. For $0 \le j \le q$, the effect of a shock in period $t$ on $X$ after $j$ periods is

$$\frac{\partial X_{t+j}}{\partial \varepsilon_t} = \theta_j,$$

and the shocks have no effect after more than $q$ periods.

Definition 1 (Autoregression) A process $\{X_t\}$ is said to be autoregressive of order $p$ (or an autoregression), denoted AR($p$), if it is generated according to

$$X_t = \phi_1X_{t-1} + \phi_2X_{t-2} + \ldots + \phi_pX_{t-p} + \varepsilon_t,$$

where $\{\varepsilon_t\}$ is a WN.

Let

$$\Phi(L) = 1 - \phi_1L - \ldots - \phi_pL^p.$$

Then, AR($p$) can be written as $\Phi(L)X_t = \varepsilon_t$. Provided that all the roots of $\Phi(z)$ lie outside the unit circle, $\Phi(L)$ can be inverted, and the process has the following MA($\infty$) representation (Wold decomposition):

$$X_t = \Phi(L)^{-1}\varepsilon_t = \Psi(L)\varepsilon_t = \sum_{j=0}^{\infty} \psi_j\varepsilon_{t-j}, \qquad (8)$$

for some $\Psi(L)$. Suppose that $p = 1$. Then

$$(1 - \phi L)X_t = \varepsilon_t,$$

and, according to (4),

$$X_t = (1 - \phi L)^{-1}\varepsilon_t = \sum_{j=0}^{\infty} \phi^jL^j\varepsilon_t = \sum_{j=0}^{\infty} \phi^j\varepsilon_{t-j}.$$

Hence, in the case of the AR(1) process, the coefficients of $\Psi(L)$ in (8) are given by $\psi_j = \phi^j$. To check the square-summability condition,

$$\sum_{j=0}^{\infty} \psi_j^2 = \sum_{j=0}^{\infty} \phi^{2j} = \frac{1}{1 - \phi^2} < \infty,$$

provided that $|\phi| < 1$.
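As a numerical sanity check (simulation design entirely ours), one can generate an AR(1) recursively and compare it with a truncation of its MA($\infty$) representation $\sum_j \phi^j \varepsilon_{t-j}$; since $\psi_j = \phi^j$ decays exponentially, the truncation error is negligible:

```python
import numpy as np

# AR(1): X_t = phi*X_{t-1} + e_t, started at X_0 = 0, vs. its truncated
# MA(infinity) representation with psi_j = phi^j (example values ours).
rng = np.random.default_rng(0)
phi, n, J = 0.7, 200, 60
e = rng.standard_normal(n + J)

X = np.zeros(n + J)
for t in range(1, n + J):
    X[t] = phi * X[t - 1] + e[t]         # the AR recursion

psi = phi ** np.arange(J + 1)
X_ma = np.array([psi @ e[t - J:t + 1][::-1] for t in range(J, n + J)])
print(np.max(np.abs(X[J:] - X_ma)))      # tiny: the omitted tail is O(phi^J)
```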

Then, Theorem 2 implies that AR(1) is covariance stationary, and the spectral density and long-run variance of the AR(1) process are

$$f_X(\lambda) = \frac{\sigma^2}{2\pi}\Big|\sum_{j=0}^{\infty} \phi^je^{-i\lambda j}\Big|^2 = \frac{\sigma^2}{2\pi}\,\frac{1}{|1 - \phi e^{-i\lambda}|^2},$$
$$\omega_X^2 = 2\pi f_X(0) = \frac{\sigma^2}{(1 - \phi)^2}.$$

The long-run variance of a stationary AR(1) process is finite as well. Hence, in this case we actually have that the coefficients in the MA($\infty$) representation of a stationary AR(1) process are absolutely summable:

$$\sum_{j=0}^{\infty} |\psi_j| = \sum_{j=0}^{\infty} |\phi|^j = \frac{1}{1 - |\phi|} < \infty.$$

This is because $\psi_j = \phi^j \to 0$ as $j \to \infty$ at an exponential rate.

In the case of AR($p$), $p > 1$, due to (7), $\sum |\psi_j| < \infty$ provided that all the roots of $\Phi(z)$ lie outside the unit circle. Then AR($p$) is covariance stationary with the spectral density and long-run variance

$$f_X(\lambda) = \frac{\sigma^2}{2\pi}\,\frac{1}{|\Phi(e^{-i\lambda})|^2}, \qquad \omega_X^2 = \frac{\sigma^2}{\Phi(1)^2}.$$

Definition 2 (ARMA) A process $\{X_t\}$ is ARMA($p, q$) if it is generated according to

$$\Phi(L)X_t = \Theta(L)\varepsilon_t,$$

where $\{\varepsilon_t\}$ is a WN.

When the roots of $\Phi(z)$ lie outside the unit circle, ARMA($p, q$) has an MA($\infty$) representation

$$X_t = \Phi(L)^{-1}\Theta(L)\varepsilon_t.$$

Its spectral density and long-run variance are

$$f_X(\lambda) = \frac{\sigma^2}{2\pi}\,\frac{|\Theta(e^{-i\lambda})|^2}{|\Phi(e^{-i\lambda})|^2}, \qquad \omega_X^2 = \sigma^2\,\frac{\Theta(1)^2}{\Phi(1)^2}.$$

When the roots of $\Theta(z)$ lie outside the unit circle, ARMA($p, q$) has an AR($\infty$) representation

$$\Theta(L)^{-1}\Phi(L)X_t = \varepsilon_t.$$

Thus, in practice, any covariance stationary ARMA($p, q$) or MA($\infty$) process can be approximated by an AR($m_n$) model, with $m_n$ increasing with the sample size $n$, but at a slower rate. An ARMA process with a nonzero mean $\mu$ can be written as

$$\Phi(L)(X_t - \mu) = \Theta(L)\varepsilon_t.$$
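A minimal sketch of the ARMA spectral density and long-run variance formulas on an ARMA(1,1) example (parameter values ours):

```python
import numpy as np

# (1 - phi*L) X_t = (1 + theta*L) e_t, with phi = 0.5, theta = 0.3, sigma^2 = 1.
phi, theta, sigma2 = 0.5, 0.3, 1.0

def f_X(lam):
    z = np.exp(-1j * lam)
    return sigma2 / (2 * np.pi) * abs(1 + theta * z) ** 2 / abs(1 - phi * z) ** 2

omega2 = 2 * np.pi * f_X(0.0)            # long-run variance from the zero frequency
print(omega2, sigma2 * (1 + theta) ** 2 / (1 - phi) ** 2)   # both 6.76: Theta(1)^2/Phi(1)^2
```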

Vector case

Suppose that the vector process $\{X_t\}$ is covariance stationary and $EX_t = 0$ for all $t$. Let $\Gamma(j) = EX_tX_{t-j}'$. Notice that, in order for $\{X_t\}$ to be covariance stationary, $\Gamma(j)$ does not need to be symmetric for $j \ne 0$; however, $\Gamma(j) = \Gamma(-j)'$.

Consider a sequence of $k \times k$ matrices $\{C_j\}$, and define

$$Y_t = \sum_{j=0}^{\infty} C_jX_{t-j}.$$

The variance of $Y_t$ is given by

$$EY_tY_t' = E\Big(\sum_{i=0}^{\infty} C_iX_{t-i}\Big)\Big(\sum_{j=0}^{\infty} C_jX_{t-j}\Big)' = \sum_{i=0}^{\infty}\sum_{j=0}^{\infty} C_i\,\Gamma(j - i)\,C_j' = \sum_{j=0}^{\infty} C_j\,\Gamma(0)\,C_j' + \sum_{h=1}^{\infty}\sum_{j=0}^{\infty}\big(C_j\,\Gamma(h)\,C_{j+h}' + C_{j+h}\,\Gamma(-h)\,C_j'\big).$$

Let $\|A\| = \big(tr(A'A)\big)^{1/2}$. We have

$$\|EY_tY_t'\| = \Big\|\sum_{i=0}^{\infty}\sum_{j=0}^{\infty} C_i\,\Gamma(j - i)\,C_j'\Big\| \le \sum_{i=0}^{\infty}\sum_{j=0}^{\infty} \|C_i\|\,\|\Gamma(j - i)\|\,\|C_j\| \le \Big(\sum_{j=0}^{\infty} \|C_j\|\Big)^2\sum_{h=-\infty}^{\infty} \|\Gamma(h)\|,$$

where the last inequality is by the same argument as in Lecture 9. Hence, $EY_tY_t'$ is finite provided that

$$\sum_{j=0}^{\infty} \|C_j\| < \infty \qquad (9)$$

and

$$\sum_{h=-\infty}^{\infty} \|\Gamma(h)\| < \infty. \qquad (10)$$

Definition 3 A $k$-vector process $\{\varepsilon_t\}$ is a vector WN if $E\varepsilon_t = 0$ and $E\varepsilon_t\varepsilon_t' = \Sigma$, a positive definite matrix, for all $t$, and $E\varepsilon_t\varepsilon_s' = 0$ for $t \ne s$.

If $\{X_t\}$ is a purely indeterministic, mean zero, covariance stationary process such that (10) holds, then, similarly to the scalar case, it has the MA($\infty$) representation

$$X_t = \sum_{j=0}^{\infty} C_j\varepsilon_{t-j},$$

where $\{\varepsilon_t\}$ is a vector WN, the $C_j$'s satisfy (9), and $C_0 = I_k$. Again, as in the scalar case, the $\varepsilon_t$'s are the linear one-step-ahead prediction errors.
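To illustrate the vector autocovariances, here is a sketch using a VAR(1), $X_t = AX_{t-1} + \varepsilon_t$ (a model of our choosing, not introduced in the lecture; all matrix values are ours). For it, $\Gamma(0)$ solves $\Gamma(0) = A\Gamma(0)A' + \Sigma$ and $\Gamma(1) = A\Gamma(0)$, and one can see directly that $\Gamma(1)$ is not symmetric while $\Gamma(-1) = \Gamma(1)'$:

```python
import numpy as np

A = np.array([[0.5, 0.1],
              [0.0, 0.4]])
Sigma = np.array([[1.0, 0.2],
                  [0.2, 1.0]])

# Fixed-point iteration for Gamma(0) = A Gamma(0) A' + Sigma; it converges
# because the eigenvalues of A are inside the unit circle.
G0 = Sigma.copy()
for _ in range(200):
    G0 = A @ G0 @ A.T + Sigma

G1 = A @ G0                              # Gamma(1) = E X_t X'_{t-1} = A Gamma(0)
print(np.allclose(G0, A @ G0 @ A.T + Sigma))   # True: G0 solves the equation
print(G1, "\n", G1.T)                    # Gamma(1) asymmetric; Gamma(-1) = Gamma(1)'
```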

The spectral density of a vector process is defined similarly to the scalar case:

$$f(\lambda) = \frac{1}{2\pi}\sum_{j=-\infty}^{\infty} \Gamma(j)\,e^{-i\lambda j} = \frac{1}{2\pi}\Big(\Gamma(0) + \sum_{j=1}^{\infty}\big(\Gamma(j)\,e^{-i\lambda j} + \Gamma(j)'\,e^{i\lambda j}\big)\Big).$$

Again, the long-run variance-covariance matrix is given by

$$\Omega = 2\pi f(0).$$

In order for the long-run variance to be finite, $\sum_{j=0}^{\infty} \|C_j\| < \infty$. For the vector WN process $\{\varepsilon_t\}$, the spectral density is flat: $f_\varepsilon(\lambda) = \Sigma/(2\pi)$.

Let $\{X_t\}$ be a covariance stationary process with the spectral density $f_X$. Define $Y_t = \sum_{j=0}^{\infty} C_jX_{t-j}$. Then, the spectral density of $\{Y_t\}$ is given by

$$f_Y(\lambda) = \Big(\sum_{j=0}^{\infty} C_je^{-i\lambda j}\Big)\,f_X(\lambda)\,\Big(\sum_{j=0}^{\infty} C_j'e^{i\lambda j}\Big).$$

Let $\{C_j\}$ be a sequence of $k \times k$ matrices. The lag polynomial $C(L)$ in the vector case is defined as

$$C(L) = I_k + C_1L + C_2L^2 + \ldots,$$

where we set $C_0 = I_k$ according to the Wold decomposition. We say that $B(L) = C(L)^{-1}$ if

$$B(L)C(L) = I_k.$$

The polynomial $C(L)$ is invertible provided that the roots of $\det(C(z)) = 0$ lie outside the unit circle. For example, if $C(L)$ is of order $p$, then all the roots of

$$\det(I_k + C_1z + \ldots + C_pz^p) = 0$$

must be greater than one in absolute value.
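For a first-order vector lag polynomial $C(L) = I_k + C_1L$, the invertibility condition can be checked directly, since $\det(I_k + C_1z) = 0$ exactly when $z = -1/\mu$ for a (nonzero) eigenvalue $\mu$ of $C_1$. A minimal sketch (matrix values ours):

```python
import numpy as np

C1 = np.array([[0.5, 0.1],
               [0.2, 0.3]])

# Roots of det(I + C1*z) = 0 are -1/mu for each eigenvalue mu of C1, so all
# roots lie outside the unit circle iff every |mu| < 1.
eigs = np.linalg.eigvals(C1)
roots = -1.0 / eigs
print(eigs, roots, np.all(np.abs(roots) > 1))   # invertible: True
```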
