Lecture 1: Fundamental concepts in Time Series Analysis (part 2)


Florian Pelgrin, University of Lausanne, École des HEC, Department of mathematics (IMEA-Nice), Sept. 2011 - Jan. 2012

6. Stationarity, stability, and invertibility

Consider again a situation where the value of a time series at time $t$, $X_t$, is a linear function of a constant term, the last $p$ values of $X_t$, and the contemporaneous and last $q$ values of a white noise process, denoted by $\epsilon_t$:

$$X_t = \mu + \sum_{k=1}^{p} \phi_k X_{t-k} + \sum_{j=0}^{q} \theta_j \epsilon_{t-j}.$$

This process can be rewritten as

$$\underbrace{\Phi(L) X_t}_{\text{autoregressive part}} = \mu + \underbrace{\Theta(L)\epsilon_t}_{\text{moving average part}}$$

with $\Phi(L) = 1 - \phi_1 L - \phi_2 L^2 - \cdots - \phi_p L^p$ and $\Theta(L) = 1 + \theta_1 L + \cdots + \theta_q L^q$.
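For example, with $p = q = 1$, the lag-operator form reads

$$(1 - \phi_1 L) X_t = \mu + (1 + \theta_1 L)\epsilon_t, \quad \text{i.e.} \quad X_t = \mu + \phi_1 X_{t-1} + \epsilon_t + \theta_1 \epsilon_{t-1}.$$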

Which conditions?

Stationarity conditions regard the autoregressive part of the previous (linear) time series model, the ARMA(p,q) model. Stability conditions also regard the autoregressive part; they are generally required to avoid explosive solutions of the stochastic difference equation $\Phi(L)X_t = \mu + \Theta(L)\epsilon_t$. Invertibility conditions regard the moving average part.

Implications of the stability (stationarity) conditions

If the stability conditions hold, the stationarity conditions are satisfied (the converse is not true) and $(X_t)$ is weakly stationary. If the stochastic process $(X_t)$ is stable (and thus weakly stationary), then $(X_t)$ has an infinite moving average representation (MA($\infty$) representation):

$$X_t = \Phi^{-1}(L)\left(\mu + \Theta(L)\epsilon_t\right) = \Phi^{-1}(1)\mu + C(L)\epsilon_t = \frac{\mu}{1 - \sum_{k=1}^{p}\phi_k} + \sum_{i=0}^{\infty} c_i \epsilon_{t-i}$$

where the coefficients $c_i$ satisfy $\sum_{i=0}^{\infty} |c_i| < \infty$.

This representation shows the impact of past shocks, $\epsilon_{t-i}$, on the current value of $X_t$:

$$\frac{\partial X_t}{\partial \epsilon_{t-i}} = c_i.$$

This is an application of Wold's decomposition. Omitting the constant term, $X_t$ and $\epsilon_t$ are linked by a linear filter in which $C(L) = \Phi^{-1}(L)\Theta(L)$ is called the transfer function. The coefficients $(c_i)$ are also referred to as the impulse response function coefficients.
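As an illustration, the $c_i$ can be computed recursively from the identity $\Phi(L)C(L) = \Theta(L)$, which gives $c_i = \theta_i + \sum_{k=1}^{\min(i,p)} \phi_k\, c_{i-k}$ with $\theta_0 = 1$ and $\theta_i = 0$ for $i > q$. A minimal Python sketch, assuming only numpy (the function name is illustrative, not from any particular library):

import numpy as np

def arma_impulse_response(phi, theta, n=20):
    # c_i = theta_i + sum_{k=1}^{min(i,p)} phi_k * c_{i-k}, with theta_0 = 1
    p, q = len(phi), len(theta)
    c = np.zeros(n)
    for i in range(n):
        c[i] = 1.0 if i == 0 else (theta[i - 1] if i <= q else 0.0)
        for k in range(1, min(i, p) + 1):
            c[i] += phi[k - 1] * c[i - k]
    return c

# ARMA(1,1) with phi_1 = 0.5, theta_1 = 0.3: c_0 = 1 and c_i = 0.8 * 0.5^(i-1) for i >= 1
print(arma_impulse_response([0.5], [0.3], n=5))  # [1.0, 0.8, 0.4, 0.2, 0.1]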

If $(X_t)_{t \in \mathbb{Z}}$ is weakly stationary but not stable, then $(X_t)$ has the following (two-sided) representation:

$$X_t = \Phi^{-1}(L)\left(\mu + \Theta(L)\epsilon_t\right) = \Phi^{-1}(1)\mu + C(L)\epsilon_t = \frac{\mu}{1 - \sum_{k=1}^{p}\phi_k} + \sum_{i=-\infty}^{\infty} c_i \epsilon_{t-i}$$

where $(c_i)$ is a sequence of constants with $\sum_{i=-\infty}^{\infty} |c_i| < \infty$. The latter condition ensures that $(X_t)_{t \in \mathbb{Z}}$ is weakly stationary.

Determination of the stability and stationarity conditions

Definition. Consider the stochastic process defined by $\Phi(L)X_t = \mu + \Theta(L)\epsilon_t$ with $\Phi(L) = 1 - \phi_1 L - \phi_2 L^2 - \cdots - \phi_p L^p$, $\Theta(L) = 1 + \theta_1 L + \cdots + \theta_q L^q$, and $(\epsilon_t)$ a white noise process. This process is called stable if the modulus of every root $\lambda_i$, $i = 1, \ldots, p$, of the (reverse) characteristic equation

$$\Phi(\lambda) = 0 \iff 1 - \phi_1 \lambda - \phi_2 \lambda^2 - \cdots - \phi_p \lambda^p = 0$$

is greater than one, i.e. $|\lambda_i| > 1$ for all $i = 1, \ldots, p$.

Implications of the invertibility conditions

Invertibility is the counterpart to stationarity for the moving average part of the process. If the stochastic process $(X_t)$ is invertible, then $(X_t)$ has an infinite autoregressive representation (AR($\infty$) representation):

$$\Theta^{-1}(L)\Phi(L)X_t = \Theta^{-1}(L)\mu + \epsilon_t, \quad \text{or} \quad X_t = \frac{\mu}{1 + \sum_{k=1}^{q}\theta_k} + \sum_{i=1}^{\infty} d_i X_{t-i} + \epsilon_t$$

where the coefficients $d_i$ satisfy $\sum_{i=1}^{\infty} |d_i| < \infty$.

The AR($\infty$) representation shows the dependence of the current value $X_t$ on the past values $X_{t-i}$. The coefficients are referred to as the d-weights of an ARMA model. If the stochastic process $(X_t)$ is not invertible, a representation of the following form (possibly involving future values of the process) may still exist, omitting the constant term, as long as no (characteristic) root of $\Theta(\lambda)$ has modulus equal to one:

$$X_t = \sum_{i \neq 0} d_i X_{t-i} + \epsilon_t.$$

Determination of the invertibility conditions

Definition. Consider the stochastic process defined by $\Phi(L)X_t = \mu + \Theta(L)\epsilon_t$ with $\Phi(L) = 1 - \phi_1 L - \phi_2 L^2 - \cdots - \phi_p L^p$, $\Theta(L) = 1 + \theta_1 L + \cdots + \theta_q L^q$, and $(\epsilon_t)$ a white noise process. This process is called invertible if the modulus of every root $\lambda_i$, $i = 1, \ldots, q$, of the (reverse) characteristic equation

$$\Theta(\lambda) = 0 \iff 1 + \theta_1 \lambda + \theta_2 \lambda^2 + \cdots + \theta_q \lambda^q = 0$$

is greater than one, i.e. $|\lambda_i| > 1$ for all $i = 1, \ldots, q$.
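In practice, both conditions can be checked numerically by inspecting the moduli of the roots of the two characteristic polynomials. A sketch, assuming numpy (the function names are illustrative):

import numpy as np

def is_stable(phi):
    # Roots of Phi(lambda) = 1 - phi_1*lambda - ... - phi_p*lambda^p
    # (np.roots expects coefficients from the highest degree down)
    roots = np.roots(np.r_[-np.asarray(phi)[::-1], 1.0])
    return bool(np.all(np.abs(roots) > 1.0))

def is_invertible(theta):
    # Roots of Theta(lambda) = 1 + theta_1*lambda + ... + theta_q*lambda^q
    roots = np.roots(np.r_[np.asarray(theta)[::-1], 1.0])
    return bool(np.all(np.abs(roots) > 1.0))

print(is_stable([0.5]), is_stable([1.2]))  # True (root = 2), False (root = 1/1.2)
print(is_invertible([0.3]))                # True (root = -1/0.3)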

Three equivalent representations

Definition. A stable (and thus weakly stationary) and invertible stochastic process $\Phi(L)X_t = \mu + \Theta(L)\epsilon_t$ has two other equivalent representation forms:

1. An infinite moving average representation:

$$X_t = \frac{\mu}{1 - \sum_{k=1}^{p}\phi_k} + \sum_{i=0}^{\infty} c_i \epsilon_{t-i}$$

2. An infinite autoregressive representation:

$$X_t = \frac{\mu}{1 + \sum_{k=1}^{q}\theta_k} + \sum_{i=1}^{\infty} d_i X_{t-i} + \epsilon_t$$

Each of these representations can shed light on the model from a different perspective. For instance, the analysis of stationarity properties, the estimation of parameters, and the computation of forecasts can each rely on a different representation form (see below).

Examples

1. An autoregressive process of order 1:

$$(1 - \rho L)X_t = \epsilon_t$$

where $\epsilon_t \sim WN(0, \sigma^2_\epsilon)$ for all $t$. The stability and (weak) stationarity of $(X_t)$ depend on the (reverse) characteristic root of $\Phi(\lambda) = 1 - \rho\lambda$:

$$\Phi(\lambda) = 0 \iff \lambda = \frac{1}{\rho}.$$

The stability condition writes $|\lambda| > 1 \iff |\rho| < 1$. The stationarity condition writes $|\lambda| \neq 1 \iff |\rho| \neq 1$.

Therefore:

1. If $|\rho| < 1$, then $(X_t)$ is stable and weakly stationary:

$$(1 - \rho L)^{-1} = \sum_{k=0}^{\infty} \rho^k L^k \quad \text{and} \quad X_t = \sum_{k=0}^{\infty} \rho^k \epsilon_{t-k}.$$

2. If $|\rho| > 1$, then a non-causal stationary solution $(X_t)$ exists (where $F = L^{-1}$ denotes the forward operator):

$$(1 - \rho L)^{-1} = -\sum_{k=1}^{\infty} \rho^{-k} F^k \quad \text{and} \quad X_t = -\sum_{k=1}^{\infty} \rho^{-k} \epsilon_{t+k}.$$

3. If $|\rho| = 1$, then $(X_t)$ is neither stable nor stationary: $(1 - \rho L)$ cannot be inverted!
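The stable case is easy to check by simulation; a minimal sketch assuming numpy (for a stable AR(1), the sample variance should be close to $\sigma^2_\epsilon/(1-\rho^2)$):

import numpy as np

rng = np.random.default_rng(0)
T, rho = 10_000, 0.5
eps = rng.normal(0.0, 1.0, T)

# Simulate X_t = rho * X_{t-1} + eps_t, starting from X_0 = 0 (burn-in omitted for brevity)
x = np.zeros(T)
for t in range(1, T):
    x[t] = rho * x[t - 1] + eps[t]

print(x.var(), 1.0 / (1.0 - rho**2))  # both close to 1.333...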

2. A moving average process of order 1:

$$X_t = \epsilon_t - \theta \epsilon_{t-1}$$

where $\epsilon_t \sim WN(0, \sigma^2_\epsilon)$ for all $t$. $(X_t)$ is weakly stationary irrespective of the characteristic root of $\Theta(\lambda) = 1 - \theta\lambda$ (why?). The invertibility of $(X_t)$ depends on the (reverse) characteristic root of $\Theta(\lambda) = 1 - \theta\lambda$:

$$\Theta(\lambda) = 0 \iff \lambda = \frac{1}{\theta}.$$

The invertibility condition writes $|\lambda| > 1 \iff |\theta| < 1$, and then

$$X_t = -\sum_{k=1}^{\infty} \theta^k X_{t-k} + \epsilon_t.$$

7. Identification tools

We need some tools to characterize the main properties of a time series. Among others:
- the autocovariance function,
- the autocorrelation function (ACF),
- the sample autocorrelation function (SACF),
- the partial autocorrelation function (PACF),
- the sample partial autocorrelation function (SPACF),
- the spectral density, etc.

Keep in mind that one seeks to identify, estimate, and forecast the following ARMA(p,q) model:

$$X_t = \mu + \sum_{i=1}^{p} \phi_i X_{t-i} + \sum_{k=0}^{q} \theta_k \epsilon_{t-k}$$

where $\theta_0 = 1$, $\epsilon_t \sim WN(0, \sigma^2_\epsilon)$, $p$ and $q$ are unknown orders to identify, and $(\mu, \phi_1, \ldots, \phi_p, \theta_1, \ldots, \theta_q, \sigma^2_\epsilon)$ are unknown parameters to estimate (their number depends on $p$ and $q$). The ACF and PACF are used to identify the appropriate time series model:

1. The orders $p$ and $q$ can be identified by using the (sample) autocorrelation and (sample) partial autocorrelation functions, as sketched below.
2. The corresponding parameters can then be estimated using several statistical procedures.
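As a sketch of step 1, the sample ACF and PACF can be computed with statsmodels (assumed available; here x is, e.g., the AR(1) series simulated above):

from statsmodels.tsa.stattools import acf, pacf

sample_acf = acf(x, nlags=20)    # should die out geometrically for an AR(1)
sample_pacf = pacf(x, nlags=20)  # should cut off after lag 1 for an AR(1)
print(sample_acf[:4], sample_pacf[:4])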

Autocovariances: the autocovariance function

Definition. The autocovariance function of a stationary stochastic process $(X_t)_{t \in \mathbb{Z}}$ is defined to be:

$$\gamma_X : \mathbb{Z} \to \mathbb{R}, \quad h \mapsto \gamma_X(h) = \mathrm{Cov}(X_t, X_{t-h}),$$

with $\gamma_X(h) = \gamma_X(-h)$.

Definition. An estimator of $\gamma_X(h)$ (for $h < T$) is defined to be:

$$\hat{\gamma}_X(h) = \frac{1}{T} \sum_{t=1}^{T-h} (X_t - \bar{X}_T)(X_{t+h} - \bar{X}_T)$$

or

$$\hat{\gamma}_X(h) = \frac{1}{T-h} \sum_{t=h+1}^{T} (X_t - \bar{X}_T)(X_{t-h} - \bar{X}_T).$$
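A direct numpy transcription of the first (divisor-$T$) estimator, as a sketch:

import numpy as np

def sample_autocovariance(x, h):
    # gamma_hat(h) = (1/T) * sum_{t=1}^{T-h} (x_t - xbar)(x_{t+h} - xbar)
    x = np.asarray(x, dtype=float)
    T, xbar = len(x), np.mean(x)
    return np.sum((x[: T - h] - xbar) * (x[h:] - xbar)) / T

# sample_autocovariance(x, 0) equals the (biased) sample variance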

Proposition. If $(X_t)_{t \in \mathbb{Z}}$ is a second-order stationary process with $X_t = m + \sum_{j \in \mathbb{Z}} a_j \epsilon_{t-j}$, where the $\epsilon_t$'s are i.i.d. with mean zero and variance $\sigma^2_\epsilon$, $\sum_{j \in \mathbb{Z}} |a_j| < \infty$, and $\mathbb{E}(\epsilon_t^4) < \infty$, then (i) $\hat{\gamma}_X(h)$ is an almost surely convergent and asymptotically unbiased estimator, and (ii) the asymptotic distribution is given by:

$$\sqrt{T}\begin{pmatrix} \hat{\gamma}_X(0) - \gamma_X(0) \\ \hat{\gamma}_X(1) - \gamma_X(1) \\ \vdots \\ \hat{\gamma}_X(h) - \gamma_X(h) \end{pmatrix} \xrightarrow{d} \mathcal{N}(0, \Omega_\gamma)$$

where $\Omega_\gamma = [\Omega_{j,k}]_{0 \leq j,k \leq h}$ is such that:

$$\Omega_{j,k} = \sum_{l=-\infty}^{+\infty} \gamma_X(l)\left(\gamma_X(l - j + k) + \gamma_X(l - j - k)\right).$$

Autocorrelations: the autocorrelation function

Definition. The autocorrelation function of a stationary stochastic process $(X_t)_{t \in \mathbb{Z}}$ is defined to be:

$$\rho_X(h) = \frac{\gamma_X(h)}{\gamma_X(0)} = \mathrm{Corr}(X_t, X_{t-h}), \quad h \in \mathbb{Z}.$$

The autocorrelation function is obtained after re-scaling the autocovariance function by the variance $\gamma_X(0) = \mathbb{V}(X_t)$.

Definition. The autocorrelation function of a (stationary) stochastic process $(X_t)$ satisfies the following properties:

1. $\rho_X(-h) = \rho_X(h)$ for all $h$;
2. $\rho_X(0) = 1$;
3. the range of $\rho_X$ is $[-1, 1]$.

The autocorrelation (respectively, autocovariance) function of a moving average process of order $q$, MA(q), is always zero for orders higher than $q$ ($|h| > q$): a MA(q) process has no memory beyond $q$ periods. The autocorrelation (respectively, autocovariance) function of a stationary AR(p) process exhibits exponential decay towards zero (but does not vanish for lags greater than $p$). The autocorrelation (respectively, autocovariance) function of a stationary ARMA(p,q) process exhibits exponential decay towards zero: it does not cut off but gradually dies out as $h$ increases. The autocorrelation function of a nonstationary process decreases very slowly, even at very high lags, long after the autocorrelations of stationary processes have declined to (almost) zero.

The sample autocorrelation function

Definition. Given a sample of $T$ observations, $x_1, \ldots, x_T$, the sample autocorrelation function, denoted by $\hat{\rho}_X(h)$, is computed as:

$$\hat{\rho}_X(h) = \frac{\sum_{t=h+1}^{T} (x_t - \hat{\mu})(x_{t-h} - \hat{\mu})}{\sum_{t=1}^{T} (x_t - \hat{\mu})^2}$$

where $\hat{\mu}$ is the sample mean, $\hat{\mu} = \frac{1}{T}\sum_{t=1}^{T} x_t$.

Proposition. If $(X_t)_{t \in \mathbb{Z}}$ is a second-order stationary process with $X_t = m + \sum_{j \in \mathbb{Z}} a_j \epsilon_{t-j}$, where the $\epsilon_t$'s are i.i.d. with mean zero and variance $\sigma^2_\epsilon$, $\sum_{j \in \mathbb{Z}} |a_j| < \infty$, and $\mathbb{E}(\epsilon_t^4) < \infty$, then

$$\sqrt{T}\begin{pmatrix} \hat{\rho}_X(1) - \rho_X(1) \\ \vdots \\ \hat{\rho}_X(h) - \rho_X(h) \end{pmatrix} \xrightarrow{d} \mathcal{N}(0, \Omega_\rho).$$

Partial autocorrelation: the partial autocorrelation function

The partial autocorrelation function is another tool for identifying the properties of an ARMA process. It is particularly useful to identify pure autoregressive processes (AR(p)). A partial correlation coefficient measures the correlation between two random variables at different lags after adjusting for the correlation this pair may have with the intervening lags: the PACF thus represents a sequence of conditional correlations. A correlation coefficient between two random variables at different lags does not adjust for the influence of the intervening lags: the ACF thus represents a sequence of unconditional correlations.

Definition. The partial autocorrelation function of a stationary stochastic process $(X_t)_{t \in \mathbb{Z}}$ is defined to be:

$$a_X(h) = \mathrm{Corr}(X_t, X_{t-h} \mid X_{t-1}, \ldots, X_{t-h+1}), \quad h > 0.$$

Definition. The partial autocorrelation function of order $h$ is defined to be:

$$a_X(h) = \mathrm{Corr}(X_t^*, X_{t-h}^*) = \frac{\mathrm{Cov}(X_t^*, X_{t-h}^*)}{\left[\mathbb{V}(X_t^*)\,\mathbb{V}(X_{t-h}^*)\right]^{1/2}}$$

where $EL(\cdot \mid \cdot)$ denotes the linear projection and

$$X_t^* = X_t - EL(X_t \mid X_{t-1}, \ldots, X_{t-h+1}), \quad X_{t-h}^* = X_{t-h} - EL(X_{t-h} \mid X_{t-1}, \ldots, X_{t-h+1}).$$

Definition. The partial autocorrelation function of a second-order stationary stochastic process $(X_t)_{t \in \mathbb{Z}}$ satisfies:

$$a_X(h) = \frac{\det R^*(h)}{\det R(h)}$$

where $R(h)$ is the autocorrelation matrix of order $h$, defined by:

$$R(h) = \begin{pmatrix} \rho_X(0) & \rho_X(1) & \cdots & \rho_X(h-1) \\ \rho_X(1) & \rho_X(0) & \cdots & \rho_X(h-2) \\ \vdots & \vdots & \ddots & \vdots \\ \rho_X(h-1) & \rho_X(h-2) & \cdots & \rho_X(0) \end{pmatrix}$$

and $R^*(h)$ is the matrix obtained after replacing the last column of $R(h)$ by $\rho = (\rho_X(1), \ldots, \rho_X(h))'$.

Definition. The partial autocorrelation function can be viewed as the sequence of the $h$-th autoregressive coefficients in $h$-th order autoregressions. Let $a_{hl}$ denote the $l$-th autoregressive coefficient of an AR(h) process:

$$X_t = a_{h1} X_{t-1} + a_{h2} X_{t-2} + \cdots + a_{hh} X_{t-h} + \epsilon_t.$$

Then, for $h = 1, 2, \ldots$:

$$a_X(h) = a_{hh}.$$

Remark: the theoretical partial autocorrelation function of an AR(p) model is different from zero for the first $p$ terms and exactly zero for higher-order terms (why?).
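This characterization suggests a simple estimator: regress $X_t$ on its first $h$ lags and keep the last coefficient. A sketch with numpy least squares (the function name is illustrative):

import numpy as np

def pacf_ols(x, max_h):
    # a_hat(h) = last OLS coefficient in the regression of x_t on x_{t-1}, ..., x_{t-h}
    x = np.asarray(x, dtype=float) - np.mean(x)
    T, out = len(x), []
    for h in range(1, max_h + 1):
        Y = x[h:]  # rows t = h, ..., T-1
        Z = np.column_stack([x[h - l : T - l] for l in range(1, h + 1)])  # lags 1..h
        beta, *_ = np.linalg.lstsq(Z, Y, rcond=None)
        out.append(beta[-1])  # a_hat(h) = a_hh
    return np.array(out)

# For the simulated AR(1) with rho = 0.5, pacf_ols(x, 3) is close to [0.5, 0, 0]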

The sample partial autocorrelation function

The sample partial autocorrelation function can be obtained by different methods:

1. Find the ordinary least squares (or maximum likelihood) estimates of $a_{hh}$.
2. Use the recursive equations of the autocorrelation function (Yule-Walker equations) after replacing the autocorrelation coefficients by their estimates (see below).
3. Etc.

Proposition. If $(X_t)_{t \in \mathbb{Z}}$ is a second-order stationary process with $X_t = m + \sum_{j \in \mathbb{Z}} a_j \epsilon_{t-j}$, where the $\epsilon_t$'s are i.i.d. with mean zero and variance $\sigma^2_\epsilon$, $\sum_{j \in \mathbb{Z}} |a_j| < \infty$, and $\mathbb{E}(\epsilon_t^4) < \infty$, then

$$\sqrt{T}\begin{pmatrix} \hat{a}_X(1) - a_X(1) \\ \vdots \\ \hat{a}_X(h) - a_X(h) \end{pmatrix} \xrightarrow{d} \mathcal{N}(0, \Omega_a).$$

Spectral analysis

Definition (Fourier transform of the autocovariance function). Let $(X_t)$ be a real-valued stationary process with an absolutely summable autocovariance sequence. The Fourier transform of the autocovariance function $(\gamma_X(h))$ exists and is given by:

$$f_X(\omega) = \frac{1}{2\pi} \sum_{h=-\infty}^{+\infty} \gamma_X(h) \exp(-i\omega h) = \frac{1}{2\pi} \sum_{h=-\infty}^{+\infty} \gamma_X(h) \cos(\omega h) = \frac{1}{2\pi}\gamma_X(0) + \frac{1}{\pi} \sum_{h=1}^{+\infty} \gamma_X(h) \cos(\omega h), \quad \omega \in [-\pi, \pi].$$
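A truncated version of this sum yields a naive spectral estimate from the sample autocovariances (e.g., those computed by sample_autocovariance above); in practice the terms would be weighted by a smoothing kernel. A sketch:

import numpy as np

def spectral_from_autocov(gammas, omegas):
    # f_hat(w) = (1/(2*pi)) * (gamma_0 + 2 * sum_{h=1}^{H} gamma_h * cos(w*h))
    gammas, omegas = np.asarray(gammas), np.asarray(omegas)
    h = np.arange(1, len(gammas))
    return (gammas[0] + 2.0 * np.cos(np.outer(omegas, h)) @ gammas[1:]) / (2.0 * np.pi)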

Properties:

1. The spectral density satisfies: $f_X(\omega)$ is continuous; $f_X(\omega)$ is real-valued (i.e. $\overline{f_X(\omega)} = f_X(\omega)$); $f_X(\omega)$ is a nonnegative function (since the autocovariance function is positive semidefinite).
2. $f_X(\omega) = f_X(\omega + 2\pi)$: $f_X$ is periodic with period $2\pi$.
3. $f_X(\omega) = f_X(-\omega)$ for all $\omega$: $f_X$ is a symmetric (even) function.

Properties (cont'd):

4. The variance satisfies:

$$\mathbb{V}(X_t) = \gamma_X(0) = \int_{-\pi}^{\pi} f_X(\omega)\, d\omega.$$

The spectrum $f_X(\omega)$ may be interpreted as a decomposition of the variance of the process. The term $f_X(\omega)d\omega$ is the contribution to the variance attributable to the components of the process with frequencies in the interval $(\omega, \omega + d\omega)$. A peak in the spectrum indicates an important contribution to the variance at the corresponding frequencies.

Deriving the spectral density function

Definition (Autocovariance generating function). For a given sequence of autocovariances $\gamma_X(h)$, $h = 0, \pm 1, \pm 2, \ldots$, the autocovariance generating function is defined to be:

$$\gamma_X(L) = \sum_{h=-\infty}^{+\infty} \gamma_X(h) L^h$$

where $L$ is the lag operator.

Proposition. Let $(X_t)_{t \in \mathbb{Z}}$ denote a second-order stationary process with $X_t = m + \sum_{j \in \mathbb{Z}} \theta_j \epsilon_{t-j}$, where the $\epsilon_t$'s are i.i.d. with mean zero and variance $\sigma^2_\epsilon$. The autocovariance generating function is given by:

$$\gamma_X(L) = \sigma^2_\epsilon\, \Theta(L)\Theta(L^{-1}).$$

Definition. Let $\gamma_X(L)$ and $f_X(\omega)$ denote respectively the autocovariance generating function and the spectrum:

$$\gamma_X(L) = \sum_{h=-\infty}^{+\infty} \gamma_X(h) L^h, \quad f_X(\omega) = \frac{1}{2\pi} \sum_{h=-\infty}^{+\infty} \gamma_X(h) \exp(-i\omega h), \quad \omega \in [-\pi, \pi].$$

Then

$$f_X(\omega) = \frac{1}{2\pi}\, \gamma_X(\exp(-i\omega)).$$

Application: consider a general stationary ARMA(p,q) model $\Phi_p(L)X_t = \mu + \Theta_q(L)\epsilon_t$ with $\Phi_p(L) = 1 - \sum_{j=1}^{p} \phi_j L^j$ and $\Theta_q(L) = 1 + \sum_{j=1}^{q} \theta_j L^j$ sharing no common factor and having all their roots outside the unit circle. Then

$$\gamma_X(L) = \sigma^2_\epsilon\, \frac{\Theta_q(L)\Theta_q(L^{-1})}{\Phi_p(L)\Phi_p(L^{-1})}$$

and

$$f_X(\omega) = \frac{1}{2\pi}\gamma_X(\exp(-i\omega)) = \frac{\sigma^2_\epsilon}{2\pi}\, \frac{\Theta_q(\exp(-i\omega))\Theta_q(\exp(i\omega))}{\Phi_p(\exp(-i\omega))\Phi_p(\exp(i\omega))} = \frac{\sigma^2_\epsilon}{2\pi}\, \frac{|\Theta_q(\exp(-i\omega))|^2}{|\Phi_p(\exp(-i\omega))|^2}.$$
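The closed form is straightforward to evaluate numerically. A sketch in pure numpy (the function name is illustrative), with a sanity check that the spectrum of an AR(1) integrates to $\gamma_X(0) = \sigma^2_\epsilon/(1 - \phi_1^2)$:

import numpy as np

def arma_spectral_density(phi, theta, sigma2, omegas):
    # f_X(w) = sigma2/(2*pi) * |Theta(e^{-iw})|^2 / |Phi(e^{-iw})|^2
    z = np.exp(-1j * np.asarray(omegas))
    phi_z = 1.0 - sum(p * z ** (k + 1) for k, p in enumerate(phi))
    theta_z = 1.0 + sum(t * z ** (k + 1) for k, t in enumerate(theta))
    return sigma2 / (2.0 * np.pi) * np.abs(theta_z) ** 2 / np.abs(phi_z) ** 2

w = np.linspace(-np.pi, np.pi, 100_001)
f = arma_spectral_density([0.5], [], 1.0, w)
print(f.mean() * 2.0 * np.pi, 1.0 / (1.0 - 0.25))  # both close to 1.333...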

Summary

- Strong and weak stationarity; stationarity/nonstationarity
- White noise, trend-stationary processes, difference-stationary processes (random walk)
- The lag operator and the writing of time series models
- AR($\infty$) and MA($\infty$) representations
- Autocovariance, (sample) autocorrelation, and (sample) partial autocorrelation functions
- Spectral density
- The class of ARMA(p,q) processes