E 4101/5101 Lecture 6: Spectral analysis

1 E 4101/5101 Lecture 6: Spectral analysis. Ragnar Nymoen, 3 March 2011.

2 References for this lecture: Hamilton, Ch. 6, and the lecture note (on the web page).

3 For stationary variables/processes there is a close correspondence between analysis in the time domain and analysis in the frequency domain. Both approaches aid the understanding of time series properties. The correspondence applies to: representation of series properties (the ACF and the power spectral density function, PSD); and estimation, i.e. the empirical ACF/regression in the time domain and the empirical periodogram/cross-periodogram in the frequency domain. In this lecture the main emphasis is on the use of spectral analysis to aid the interpretation of the properties of ARMA and ARIMA series.

4 For estimation purposes the time domain is usually most practical, and we will not cover the technical aspects of estimation in the frequency domain, but the lecture note gives the concepts.

5 Weak stationarity I Let $\{Y_t;\ t = 0, \pm 1, \pm 2, \pm 3, \ldots\}$ represent a time series; each $Y_t$ is a stochastic variable in the usual sense. Stationarity is defined in terms of linear properties: expectation and covariance. The (theoretical) autocorrelation function, ACF, $\{\rho_1, \rho_2, \ldots\}$, is
$$\rho_{j,t} = \mathrm{Corr}[Y_t, Y_{t-j}] = \frac{\mathrm{Cov}[Y_t, Y_{t-j}]}{\mathrm{Var}[Y_t]} = \frac{\gamma_{j,t}}{\gamma_{0,t}}, \quad (1)$$
where
$$\gamma_{j,t} = E[(Y_t - \mu_t)(Y_{t-j} - \mu_t)], \quad j = 0, 1, 2, \ldots,$$
$$\mu_t = E[Y_t] = \int y_t f_{Y_t}(y_t)\,dy_t$$
for an unconditional probability density function $f_{Y_t}(y_t)$.

6 Weak stationarity II Definition (Stationarity). If $\mu_t$ and $\gamma_{j,t}$ ($j = 0, 1, 2, \ldots$) do not depend on $t$, the $Y_t$ process is weakly stationary (covariance stationary):
$$E[Y_t] = \mu \quad \text{for all } t,$$
$$E[(Y_t - \mu)(Y_{t-j} - \mu)] = \gamma_j \quad \text{for all } t \text{ and } j.$$
Note also that if $Y_t$ is stationary, $\gamma_j = \gamma_{-j}$. For an actual, finite-sample time series $\{y_t;\ t = 1, 2, 3, \ldots, T\}$ we use the empirical autocovariances
$$\hat{\gamma}_j = \frac{1}{T}\sum_{t=j+1}^{T}(y_t - \bar{y})(y_{t-j} - \bar{y}), \quad j = 0, 1, 2, \ldots, T-1, \quad (2)$$

7 Weak stationarity III where $\bar{y} = \frac{1}{T}\sum_{t=1}^{T} y_t$, and the empirical ACF is
$$\hat{\rho}_j = \frac{\hat{\gamma}_j}{\hat{\gamma}_0}. \quad (3)$$
If $\{Y_t;\ t = 0, \pm 1, \pm 2, \pm 3, \ldots\}$ is stationary, the ACF can be estimated consistently by the empirical ACF. This provides the theoretical foundation for estimation of other parameters, notably the coefficients in dynamic equations (with OLS and GMM).
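
As an illustration (not part of the original slides), a minimal Python sketch of (2) and (3); the function name and the AR(1) parameter are chosen for the example only:

import numpy as np

def empirical_acf(y, max_lag):
    # Empirical autocovariances (2) and autocorrelations (3)
    y = np.asarray(y, dtype=float)
    T = len(y)
    ybar = y.mean()
    gamma = np.array([np.sum((y[j:] - ybar) * (y[:T - j] - ybar)) / T
                      for j in range(max_lag + 1)])
    return gamma, gamma / gamma[0]

# Example: empirical ACF of a simulated AR(1) with phi_1 = 0.8
rng = np.random.default_rng(0)
e = rng.normal(size=500)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.8 * y[t - 1] + e[t]
gamma_hat, rho_hat = empirical_acf(y, max_lag=5)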

8 White noise A process $\varepsilon_t$ is white noise if
$$E[\varepsilon_t] = 0, \quad (4)$$
$$\mathrm{Var}[\varepsilon_t] = E[\varepsilon_t^2] = \sigma^2, \quad (5)$$
$$\mathrm{Cov}[\varepsilon_t, \varepsilon_{t-j}] = \gamma_j = 0, \quad j \neq 0. \quad (6)$$
A white-noise process is stationary. If, in addition to (4)-(6),
$$\varepsilon_t \sim IIN(0, \sigma^2), \quad (7)$$
$\varepsilon_t$ is Gaussian white noise.

9 Maintaining stationarity through linear filtering I A linear filter is a linear combination of $L^j$, $j = 0, \pm 1, \pm 2, \pm 3, \ldots$. A linear filter can be of finite or infinite order. The linear filter $\psi(L) = \sum_{j=-\infty}^{\infty}\psi_j L^j$ is well defined if $\sum_j |\psi_j| < \infty$.
Theorem. If $\sum_{j=-\infty}^{\infty}|\psi_j| < \infty$ (alternatively $\sum_{j=-\infty}^{\infty}\psi_j^2 < \infty$), a linear filtering of a stationary process produces a new process which is also stationary.
We have seen that if $Y_t$ is generated by a stable stochastic difference equation of order $p$, the solution defines $Y_t$ as a well-defined linear filter of $\varepsilon_t$.

10 Maintaining stationarity through linear filtering II If $\varepsilon_t$ is white noise, $Y_t$ is stationary. If $\varepsilon_t$ is MA(q), $Y_t$ is also stationary. From now on we write
$$Y_t \sim ARMA(p, q) \quad (8)$$
for the case when $Y_t$ is a stationary solution of
$$Y_t = \phi_0 + \phi_1 Y_{t-1} + \ldots + \phi_p Y_{t-p} + \varepsilon_t + \theta_1\varepsilon_{t-1} + \ldots + \theta_q\varepsilon_{t-q}. \quad (9)$$
Theorem. If $Y_t \sim ARMA(p, q)$, $Y_t$ is a stationary variable.

11 Maintaining stationarity through linear filtering III Theorem. If $Y_t \sim ARMA(p, q)$ and causal, $Y_t$ is a stationary variable given by a one-sided linear filter of $\varepsilon_t$:
$$Y_t - \mu = \psi(L)\varepsilon_t = \sum_{j=0}^{\infty}\psi_j\varepsilon_{t-j}, \quad E[Y_t] = \mu.$$
For the AR(p) process $Y_t - \phi_1 Y_{t-1} - \ldots - \phi_p Y_{t-p} = \phi_0 + \varepsilon_t$, with $\phi_0 = \mu(1 - \phi_1 - \ldots - \phi_p)$,
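
A minimal sketch (an illustration, not from the lecture note) of the one-sided ψ-weights, computed from the recursion implied by φ(L)ψ(L) = θ(L); for a causal ARMA the weights die out, which is the summability condition behind the stationarity results above:

import numpy as np

def psi_weights(phi, theta, K=50):
    # First K psi-weights of a causal ARMA(p, q): psi(L) = theta(L)/phi(L)
    psi = np.zeros(K)
    psi[0] = 1.0
    for j in range(1, K):
        psi[j] = theta[j - 1] if j - 1 < len(theta) else 0.0
        for i in range(1, min(j, len(phi)) + 1):
            psi[j] += phi[i - 1] * psi[j - i]
    return psi

# ARMA(1,1) with phi_1 = 0.7, theta_1 = 0.4: psi_j = (phi_1 + theta_1) * phi_1**(j-1) for j >= 1
print(psi_weights([0.7], [0.4], K=10))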

12 Maintaining stationarity through linear filtering IV the autocovariances are given by
$$\gamma_j - \phi_1\gamma_{j-1} - \ldots - \phi_p\gamma_{j-p} = 0 \quad (10)$$
and the autocorrelations are given by the Yule-Walker equations:
$$\rho_j - \phi_1\rho_{j-1} - \ldots - \phi_p\rho_{j-p} = 0. \quad (11)$$
Hence the ACF follows the same dynamics as the $Y_t$ process itself. In particular,
$$\gamma_j = b_1\lambda_1^j + b_2\lambda_2^j + \ldots + b_p\lambda_p^j,$$
where the $b_i$ are constants and the $\lambda_i$ are the roots of the characteristic equation associated with the solution of the homogeneous part of the difference equation. Note that $\sum_{i=1}^{p} b_i = \mathrm{Var}[Y_t]$.
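
For the AR(2) case, a sketch (illustrative parameter values only) of how (11) is iterated once $\rho_1$ is pinned down by the first Yule-Walker equation, $\rho_1 = \phi_1/(1 - \phi_2)$:

import numpy as np

def ar2_acf(phi1, phi2, max_lag=20):
    # Theoretical ACF of a stationary AR(2) via the Yule-Walker recursion (11)
    rho = np.zeros(max_lag + 1)
    rho[0] = 1.0
    rho[1] = phi1 / (1.0 - phi2)              # first Yule-Walker equation
    for j in range(2, max_lag + 1):
        rho[j] = phi1 * rho[j - 1] + phi2 * rho[j - 2]
    return rho

# Complex characteristic roots (phi1 = 1.2, phi2 = -0.5) give a damped, oscillating ACF
print(ar2_acf(1.2, -0.5, max_lag=8).round(3))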

13 Maintaining stationarity through linear filtering V The result that the ACF has the same dynamics as $Y_t$ itself carries over to the case with an MA part. For an ARMA(p, q) the low-order $\gamma_j$ are influenced by the MA part, but for higher orders the same qualitative result holds: the ACF is dominated by the autoregressive part of the generating equation.

14 Period and frequency I A periodic function with amplitude $A$ and phase $\varphi$:
$$f(t) = A\cos(\lambda t - \varphi) = a\cos(\lambda t) + b\sin(\lambda t), \quad (12)$$
where $a = A\cos(\varphi)$ and $b = A\sin(\varphi)$; $\lambda$ is the frequency in radians. The period, $C$, is defined as the length in time of one full cycle:
$$C = \frac{\varphi + 2\pi}{\lambda} - \frac{\varphi}{\lambda} = \frac{2\pi}{\lambda}.$$
If $C = 2$ years, the number of cycles per year is $1/2$. We define frequency as the number of cycles per unit of time, hence $v = C^{-1}$.

15 Period and frequency II For reference, the relationship between the two frequency variables is $\lambda = 2\pi v$. Later we shall plot a function (the power spectral density, PSD) which is symmetric between $-1/2$ and $1/2$ when $v$ is the frequency.

v      Radians   Freq. in OxMetrics
1/4    π/2       1/2
1/2    π         1

(Might be different in other software.)

16 Discrete Fourier Transform I Heuristically, we want to approximate a time series $\{x_t\}$ as closely as possible by a linear combination of cosine functions, as in
$$x_t = a_0 + \sum_{j=1}^{P}\{a_j\cos(\lambda_j t) + b_j\sin(\lambda_j t)\} + \text{rest}.$$
This problem has a solution in Fourier analysis. The Discrete Fourier Transform (DFT) of the time series $\{x_t\}$ is
$$X(k) = X_C(k) - iX_S(k), \quad (13)$$
where $v_k = k/T$, $k = 0, 1, 2, \ldots, T-1$, and $X_C(k)$ and $X_S(k)$ are called the cosine and sine transformations of $\{x_t\}$:

17 Discrete Fourier Transform II
$$X_C(k) = T^{-1/2}\sum_{t=0}^{T-1} x_t\cos(2\pi v_k t) \quad (14)$$
and
$$X_S(k) = T^{-1/2}\sum_{t=0}^{T-1} x_t\sin(2\pi v_k t). \quad (15)$$
The lecture note shows that $X(k)$ can be written as
$$X(k) = X_C(k) - iX_S(k) = T^{-1/2}\sum_{t=0}^{T-1} x_t\exp\{-2\pi v_k i t\}, \quad (16)$$
which defines $X(k)$ as complex numbers associated with the frequencies $v_k$.

18 Discrete Fourier Transform III The DFT in (16) also has the inverse
$$x_t = T^{-1/2}\sum_{k=0}^{T-1} X(k)\exp\{2\pi v_k i t\} = T^{-1/2}\sum_{k=0}^{T-1} X(k)\{\cos(2\pi v_k t) + i\sin(2\pi v_k t)\}, \quad (17)$$
showing that the DFT gives what we hoped for: a decomposition of $\{x_t\}$ in terms of cosine waves, with $X(k)$ as weights for the different frequencies.
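
A quick numerical check (an illustration, not from the lecture note) that (16) is the standard FFT scaled by $T^{-1/2}$, and that the inverse (17) recovers the series exactly:

import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=64)
T = len(x)

X = np.fft.fft(x) / np.sqrt(T)           # X(k) in (16): T^{-1/2} * sum_t x_t exp(-2*pi*i*k*t/T)
x_back = np.fft.ifft(X) * np.sqrt(T)     # the inverse (17)

print(np.allclose(x, x_back.real))       # True: the decomposition is exact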

19 Periodogram Since $X(k)$ is complex, the real-valued contribution from each frequency is defined as $P_x(k)$:
$$P_x(k) = X(k)\overline{X(k)} = |X(k)|^2 = X_C(k)^2 + X_S(k)^2, \quad (18)$$
where $\overline{X(k)}$ is the conjugate and $|X(k)|$ is the norm of $X(k)$. $P_x(k)$ is real and proportional to the squared amplitude of the cosine wave with frequency $v_k$. The plot of $P_x(k)$ against $v_k$ is called the periodogram.
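
A sketch of the periodogram (18) for a simulated series containing a cycle with period 8 ($v = 1/8 = 0.125$); the peak of $P_x(k)$ should show up at that frequency:

import numpy as np

T = 200
t = np.arange(T)
rng = np.random.default_rng(2)
x = np.cos(2 * np.pi * t / 8) + 0.5 * rng.normal(size=T)   # cycle with period 8 plus noise

X = np.fft.fft(x) / np.sqrt(T)        # DFT as in (16)
P = np.abs(X) ** 2                    # periodogram ordinates (18)
v = np.arange(T) / T                  # frequencies v_k = k/T

k_peak = 1 + np.argmax(P[1:T // 2])   # search k = 1, ..., T/2 - 1 (drop k = 0 and the mirror half)
print(v[k_peak])                      # 0.125, the cycle frequency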

20 Periodogram [Figure: periodograms $P_x(k)$ for quarterly Norwegian private consumption, $\ln(CP_t)$ and $\Delta\ln(CP_t)$.]

21 Infinite Fourier transform I The DFT has properties that are very similar to the Fourier transforms of more general functions $a_t$ defined over $t = 0, \pm 1, \pm 2, \ldots$. If $\sum_{s=-\infty}^{\infty}|a_s| < \infty$, the infinite Fourier transform, IFT, of $\{a_s\}$ is defined as
$$A(v) = \sum_{t=-\infty}^{\infty} a_t\exp(-2\pi i v t), \quad (19)$$
with the inverse
$$a_t = \int_{-1/2}^{1/2} A(v)\exp(2\pi i v t)\,dv. \quad (20)$$
Here $v$ is written without a subscript, since $v$ is a continuous frequency.

22 Population power spectral density I A direct application of the IFT gives the spectral representation of the autocovariance function $R_x(m)$ of a stationary time series $x_t$:
$$R_x(m) = E[(x_{t+m} - \mu)(x_t - \mu)], \quad (21)$$
where $\mu = E[x_t]$. Stationarity means that
$$\sum_{m=-\infty}^{\infty}|R_x(m)| < \infty. \quad (22)$$
From (19), the IFT of $R_x(m)$ is
$$f_x(v) = \sum_{m=-\infty}^{\infty} R_x(m)\exp(-2\pi i v m) \quad (23)$$

23 Population power spectral density II and
$$R_x(m) = \int_{-1/2}^{1/2} f_x(v)\exp(2\pi i v m)\,dv, \quad (24)$$
where $f_x(v)$ is called the population power spectral density, PSD. The PSD $f_x(v)$ is unique and real if $x_t$ is real; $f_x(v)$ is also positive and symmetric. We therefore have, for $x_t$ real,
$$f_x(v) = f_x(-v),$$
$$R_x(m) = 2\int_{0}^{1/2} f_x(v)\cos(2\pi v m)\,dv, \quad (25)$$
saying that both $R_x(m)$ and $f_x(v)$ are completely described by the frequencies in the interval $0 \le v \le 1/2$.

24 Population power spectral density III Note that, by setting $m = 0$ in (24), the variance of $x_t$ can be written as
$$\mathrm{Var}[x_t] = R_x(0) = \int_{-1/2}^{1/2} f_x(v)\,dv, \quad (26)$$
showing that $f_x(v)\,dv$ is the contribution to the variance from each frequency. If $x_t$ is white noise, we have
$$R_x(m) = \begin{cases}\sigma^2, & m = 0\\ 0, & m = \pm 1, \pm 2, \ldots\end{cases}$$
and (23) gives
$$f_x(v) = \sigma^2, \quad -1/2 \le v \le 1/2:$$
the PSD is constant and equal at all frequencies.
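
As a concrete check of (23) and (26), an illustration using the well-known MA(1) autocovariances $R_x(0) = \sigma^2(1 + \theta^2)$, $R_x(\pm 1) = \sigma^2\theta$ and zero otherwise:

import numpy as np

sigma2, theta = 1.0, 0.6
v = (np.arange(2000) + 0.5) / 2000 - 0.5                   # midpoint grid on [-1/2, 1/2]

# PSD from (23): only the m = 0, +1, -1 terms are non-zero for an MA(1)
f = sigma2 * (1 + theta**2) + 2 * sigma2 * theta * np.cos(2 * np.pi * v)

print(f.mean())                       # the integral (26) over an interval of length 1: approx 1.36
print(sigma2 * (1 + theta**2))        # the MA(1) variance: 1.36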

25 The PSD of ARMA processes I The lecture note shows that the power spectral density of $\{y_t\}$ is
$$f_y(v) = |A(v)|^2 f_x(v) = |a(\exp(-2\pi i v))|^2 f_x(v) \quad (27)$$
when
$$y_t = \sum_{s=-\infty}^{\infty} a_s x_{t-s} = a(L)x_t, \quad (28)$$
where $\sum_{s=-\infty}^{\infty}|a_s| < \infty$, $a(L) = \sum_{s=-\infty}^{\infty} a_s L^s$ is a filter and $\{x_t\}$ is weakly stationary.
$$A(v) = a(\exp(-2\pi i v)) \quad (29)$$
is called the frequency response function.

26 The PSD of ARMA processes II The relationship $f_y(v) = |A(v)|^2 f_x(v)$ shows that the power spectrum of the input series is changed by filtering, and that the effect of the change is described by a multiplication by the squared magnitude of the frequency response function.

27 The PSD of ARMA processes III Armed with (27) it is possible to find the PSD of any $y_t \sim ARMA(p, q)$:
$$\phi(L)y_t = \theta(L)\varepsilon_t, \quad \varepsilon_t \sim UIN(0, \sigma^2), \quad (30)$$
with $\phi(L) = 1 - \phi_1 L - \phi_2 L^2 - \ldots - \phi_p L^p$ and $\theta(L) = 1 + \theta_1 L + \theta_2 L^2 + \ldots + \theta_q L^q$. The lecture note shows that
$$f_{y,ARMA(p,q)}(v) = \frac{|\theta(\exp(-2\pi i v))|^2}{|\phi(\exp(-2\pi i v))|^2}\,\sigma^2. \quad (31)$$
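
A minimal sketch of (31), evaluating $|\theta(e^{-2\pi i v})|^2/|\phi(e^{-2\pi i v})|^2\,\sigma^2$ on a grid of frequencies (the function name and defaults are illustrative, not from the lecture note):

import numpy as np

def arma_psd(phi, theta, sigma2=1.0, n=501):
    # PSD of a stationary ARMA(p, q), equation (31), on v in [0, 1/2]
    v = np.linspace(0.0, 0.5, n)
    z = np.exp(-2j * np.pi * v)
    phi_poly = 1.0 - sum(p * z**(i + 1) for i, p in enumerate(phi))
    theta_poly = 1.0 + sum(t * z**(i + 1) for i, t in enumerate(theta))
    return v, sigma2 * np.abs(theta_poly)**2 / np.abs(phi_poly)**2

# AR(1) with phi_1 = 0.8: the spectrum peaks at v = 0
v, f = arma_psd(phi=[0.8], theta=[])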

28 PSD of AR(1) The lecture note shows that
$$f_{y,ARMA(1,0)}(v) = \frac{\sigma^2}{1 - 2\phi_1\cos(2\pi v) + \phi_1^2}. \quad (32)$$
For $0 \le v \le 1/2$ and $\phi_1 > 0$, the denominator $1 - 2\phi_1\cos(2\pi v) + \phi_1^2$ is minimized at $v = 0$: the PSD has a peak at $v = 0$ and declines with increasing $v$ until $v = 1/2$. If $\phi_1 < 0$, the minimum is at $v = 1/2$ and the spectral density is increasing in $v$.
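
As a worked illustration of (32), take $\phi_1 = 0.8$: $f(0) = \sigma^2/(1 - 0.8)^2 = 25\sigma^2$, while $f(1/2) = \sigma^2/(1 + 0.8)^2 \approx 0.31\sigma^2$, so the spectrum is heavily concentrated at the low frequencies. The same numbers come out of the arma_psd sketch above with phi=[0.8].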

29 PSD of ARMA(2,1) The process $y_t \sim ARMA(2, 1)$,
$$y_t - \phi_1 y_{t-1} - \phi_2 y_{t-2} = \varepsilon_t + \theta_1\varepsilon_{t-1},$$
has the PSD
$$f_{y,ARMA(2,1)}(v) = \frac{\sigma^2\,(1 + 2\theta_1\cos(2\pi v) + \theta_1^2)}{1 + \phi_1^2 + \phi_2^2 - 2\phi_1(1 - \phi_2)\cos(2\pi v) - 2\phi_2\cos(4\pi v)}.$$
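
The closed form above can be cross-checked against the general arma_psd sketch given earlier (illustrative parameter values; arma_psd is assumed to be in scope):

import numpy as np

phi1, phi2, theta1, sigma2 = 1.2, -0.5, 0.4, 1.0
v, f_general = arma_psd(phi=[phi1, phi2], theta=[theta1], sigma2=sigma2)   # equation (31)

f_closed = sigma2 * (1 + 2 * theta1 * np.cos(2 * np.pi * v) + theta1**2) / (
    1 + phi1**2 + phi2**2
    - 2 * phi1 * (1 - phi2) * np.cos(2 * np.pi * v)
    - 2 * phi2 * np.cos(4 * np.pi * v))

print(np.allclose(f_general, f_closed))   # True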

30 Power-shift and phase-shift of a filter I Let $x_t$ denote a stationary time series and let $y_t = \sum_{s=-\infty}^{\infty} a_s x_{t-s}$, where $\sum_{s=-\infty}^{\infty}|a_s| < \infty$. The IFT of the filter $\{a_s\}$ is, as we have seen,
$$A(v) = \sum_{s=-\infty}^{\infty} a_s\exp(-2\pi i v s).$$
$A(v)$ is the frequency response function, and the filter $a_s$ ($s = 0, \pm 1, \pm 2, \ldots$) is often called the impulse-response function in the literature.

31 Power-shift and phase-shift of a filter II Since $A(v)$ often is complex, it is useful to write $A(v)$ in polar-coordinate form:
$$A(v) = |A(v)|\exp(i\kappa(v)).$$
$|A(v)|$ is called the power-shift and $\kappa(v)$ is called the phase-shift. It can be shown that symmetric filters have no phase-shifting effect, but that one-sided filters $a_s$ ($s = 0, 1, 2, \ldots$) do have such an effect. We concentrate on power-shifts.

32 Power-shift and phase-shift of a filter III From (27) we have
$$f_y(v) = |A(v)|^2 f_x(v). \quad (33)$$
Filters can amplify or weaken certain frequencies in the input series $x_t$. Filters are often classified as low-pass or high-pass depending on whether low or high frequencies are passed (amplified) by the filter.

33 Effect of differencing I Let $y_t = \Delta x_t$, which corresponds to the filter $a_0 = 1$, $a_1 = -1$, $a_s = 0$ for other values of $s$. Then
$$|A(v)|^2 = 2(1 - \cos(2\pi v)).$$
A plot of $|A(v)|^2$ starts at zero and increases in $v$. If $x_t$ has a root close to 1, corresponding to the frequency $v = 0$, this root will be removed from the filtered series: the differenced series will be more stationary than the level series itself.
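
A short sketch of the power-shift of the difference filter, showing that Δ removes all power at $v = 0$ and amplifies the highest frequencies (the gain reaches 4 at $v = 1/2$):

import numpy as np

v = np.linspace(0.0, 0.5, 6)
gain = 2 * (1 - np.cos(2 * np.pi * v))   # |A(v)|^2 for the filter 1 - L
print(np.round(gain, 3))                 # starts at 0 for v = 0 and rises to 4 at v = 1/2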

34 The spectrum of ARIMA series I We have found the PSD of an AR(1) process, cf. equation (32). When $\phi_1 = 1$ that PSD becomes
$$f_{y,RW}(v) = \frac{\sigma^2}{2(1 - \cos(2\pi v))}, \quad (34)$$
which is infinite at the zero frequency and declines sharply with increasing frequency $v$: the simple random walk is dominated by the long waves. The lecture note shows that if $\Delta y_t \sim ARMA(p, q)$, so that $y_t \sim ARIMA(p, 1, q)$,

35 The spectrum of ARIMA series II we have
$$f_{y,ARIMA(p,1,q)}(v) = f_{y,RW}(v)\,f_{y,ARMA(p,q)}(v). \quad (35)$$
Since $f_{y,ARMA(p,q)}(v)$ is finite at all frequencies, the PSD of the ARIMA(p, 1, q) will be dominated by the random-walk component $f_{y,RW}(v)$, which is infinite at the zero frequency. This defines what Granger called the typical spectral shape of economic time series.
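
A small simulation (an illustration only) of the typical spectral shape: the periodogram of a random walk is dominated by the lowest frequencies:

import numpy as np

rng = np.random.default_rng(3)
T = 400
y = np.cumsum(rng.normal(size=T))                      # a simple random walk

P = np.abs(np.fft.fft(y - y.mean()) / np.sqrt(T))**2   # periodogram as in (18)
v = np.arange(T) / T
low = P[1:T // 2][v[1:T // 2] < 0.05].sum()            # power at frequencies below v = 0.05
print(low / P[1:T // 2].sum())                         # typically well above 0.9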

36 Seasonally integrated series I Let $y_t$ be generated by
$$y_t = -y_{t-1} - y_{t-2} - y_{t-3} + \varepsilon_t, \quad \varepsilon_t \sim UIN(0, \sigma^2), \quad (36)$$
which we can write as
$$S(L)y_t = \varepsilon_t, \quad (37)$$
where $S(L) = 1 + L + L^2 + L^3$. We can interpret $S(L)$ as a filter.

37 Seasonally integrated series II The lecture note shows that the PSD of $y_t$ in this case is
$$f_y(v) = \frac{\sigma^2}{|S(\exp(-2\pi i v))|^2} = \frac{\sigma^2}{4 + 6\cos(2\pi v) + 4\cos(4\pi v) + 2\cos(6\pi v)}, \quad (38)$$
which is infinite at $v = \{0.25,\ 0.5\}$ and small elsewhere.

38 Seasonal random walk I If the generating equation is
$$y_t = y_{t-4} + \varepsilon_t, \quad (39)$$
the power spectrum is
$$f_y(v) = \frac{\sigma^2}{2(1 - \cos(8\pi v))}, \quad (40)$$
which is infinite at $v = 0, 1/4, 1/2$. The PSD of $\Delta_4 y_t$ is flat in this case (white noise by construction), as a result of the power-shift of the filter $1 - L^4$, which is zero at the same frequencies.
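
A check (illustrative) that the power-shift of the filter $1 - L^4$ is zero exactly at $v = 0, 1/4, 1/2$, so that $\Delta_4$ applied to (39) gives a flat, white-noise spectrum:

import numpy as np

v = np.array([0.0, 0.125, 0.25, 0.375, 0.5])
gain = np.abs(1 - np.exp(-2j * np.pi * 4 * v))**2   # |A(v)|^2 for the filter 1 - L^4
print(np.round(gain, 10))                           # [0, 4, 0, 4, 0]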

39 Seasonal random walk II If we denote a unit root by $z_j$ and the corresponding frequency by $v_j$, we have for (39):
$$\{z_j, v_j\} = \{1, 0;\ i, 1/4;\ -1, 1/2;\ -i, 3/4\}.$$
The seasonally integrated process (36) has the roots
$$\{z_j, v_j\} = \{i, 1/4;\ -1, 1/2;\ -i, 3/4\},$$
since $S(L) = (1 + L)(1 + L^2)$.

40 Estimation I Non-parametric: the periodogram $P_x(v_k)$ can be used directly to estimate the power spectral density, but it is crude relative to best-practice estimation of the PSD. Parametric: estimate a well-specified ARIMA model and use the estimates to calculate the empirical PSD analytically.
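
A minimal sketch of the parametric route (illustrative only): estimate an AR(1) by OLS and plug the estimates into the arma_psd sketch given earlier (assumed to be in scope):

import numpy as np

rng = np.random.default_rng(4)
T = 500
e = rng.normal(size=T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.7 * y[t - 1] + e[t]                            # simulated AR(1), phi_1 = 0.7

phi_hat = np.sum(y[1:] * y[:-1]) / np.sum(y[:-1]**2)        # OLS estimate of phi_1
sigma2_hat = np.mean((y[1:] - phi_hat * y[:-1])**2)         # residual variance

v, f_hat = arma_psd(phi=[phi_hat], theta=[], sigma2=sigma2_hat)   # parametric PSD estimate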
