Exponential decay rate of partial autocorrelation coefficients of ARMA and short-memory processes

arXiv:1511.07091v2 [math.ST] 4 Jan 2016

Akimichi Takemura
Graduate School of Information Science and Technology, University of Tokyo

January, 2016

Abstract

We present a short proof of the fact that the exponential decay rate of the partial autocorrelation coefficients of a short-memory process, in particular an ARMA process, is equal to the exponential decay rate of the coefficients of its infinite autoregressive representation.

1 Introduction

The autocorrelation coefficients and the partial autocorrelation coefficients are basic tools for model selection in time series analysis based on ARMA models. For AR models, by the Yule-Walker equations, the autocorrelation coefficients satisfy a linear difference equation with constant coefficients, and hence they decay to zero exponentially at the rate of the reciprocal of the smallest absolute value of the roots of the characteristic polynomial of the AR model. This also holds for ARMA models, because their autocorrelations satisfy the same difference equation, determined by the AR part, except for some initial values.

On the other hand, standard textbooks on time series analysis do not seem to give a clear statement and proof concerning the exponential decay rate of the partial autocorrelation coefficients of MA and ARMA models. For example, Section 3.4.2 of [2] states the following, without a clear indication of the decay rate:

    Hence, the partial autocorrelation function of a mixed process is infinite in extent. It behaves eventually like the partial autocorrelation function of a pure moving average process, being dominated by a mixture of damped exponentials and/or damped sine waves, depending on the order of the moving average and the values of the parameters it contains.

Section 3.4 of [3] states the following about MA(q) processes, again without a clear indication of the decay rate:

    In contrast with the partial autocorrelation function of an AR(p) process, that of an MA(q) process does not vanish for large lags. It is however bounded in absolute value by a geometrically decreasing function.
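Although this example does not appear in the paper, the invertible MA(1) model gives a standard closed-form illustration of the decay rates discussed above (the formulas below are classical; see, e.g., [3]). For $X_t = \epsilon_t + \theta \epsilon_{t-1}$ with white noise $\{\epsilon_t\}$ and $|\theta| < 1$, the coefficients of the infinite autoregressive representation and the partial autocorrelations are

$$\pi_j = -(-\theta)^j, \qquad \phi_{nn} = -\frac{(-\theta)^n (1-\theta^2)}{1-\theta^{2(n+1)}},$$

so that $|\pi_n|^{1/n} = |\theta|$ and $|\phi_{nn}|^{1/n} \to |\theta|$ as $n \to \infty$. Both sequences decay exponentially at the rate $|\theta|$, the reciprocal of the absolute value of the root $-1/\theta$ of the MA polynomial $1 + \theta z$; Theorem 2.1 below establishes this coincidence of rates in general.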

The purpose of this paper is to give a clear statement of the decay rate, together with a short proof. Because of the duality between AR and MA models, it is intuitively obvious that the partial autocorrelation coefficients of an ARMA model decay to zero at the rate of the reciprocal of the smallest absolute value of the roots of the characteristic polynomial of the MA part of the model. Note that this rate is also the decay rate of the coefficients of the AR($\infty$) representation.

In the literature the sharpest results on the asymptotic behavior of partial autocorrelation functions have been given by Akihiko Inoue and his collaborators (e.g. [4], [6], [5], [1]). They give detailed and deep results on the polynomial decay rate of the partial autocorrelation coefficients for long-memory processes. Concerning ARMA processes, the clearest result seems to be the one given by Inoue in Section 7 of [5]. However, his result is one-sided, giving an upper bound on the exponential rate, whereas Theorem 2.1 of this paper gives an equality.

2 Main result and its proof

We consider a zero-mean causal and invertible weakly stationary process $\{X_t\}_{t \in \mathbb{Z}}$ having an AR($\infty$) representation and an MA($\infty$) representation given by

$$X_t = \pi_1 X_{t-1} + \pi_2 X_{t-2} + \cdots + \epsilon_t, \quad \pi(B) X_t = \epsilon_t, \quad \sum_{i=1}^{\infty} |\pi_i| < \infty, \qquad (1)$$

$$X_t = \epsilon_t + \psi_1 \epsilon_{t-1} + \cdots = \psi(B) \epsilon_t, \quad \sum_{i=1}^{\infty} |\psi_i| < \infty. \qquad (2)$$

For an ARMA(p, q) process $\phi(B) X_t = \theta(B) \epsilon_t$, the coefficients $\pi_1, \pi_2, \dots$ decay exponentially at the rate of the reciprocal of the smallest absolute value of the roots of $\theta(B) = 0$, and similarly $\psi_1, \psi_2, \dots$ decay, with $\theta(B)$ replaced by $\phi(B)$. The autocovariance function of $\{X_t\}$ is

$$E(X_t X_{t+k}) = \gamma_k = \gamma_{-k} = \sigma_\epsilon^2 \Big( \psi_k + \sum_{i=1}^{\infty} \psi_{k+i} \psi_i \Big), \quad k \ge 0,$$

where $\sigma_\epsilon^2 = E(\epsilon_t^2)$. Let $H$ denote the Hilbert space spanned by $\{X_t\}$, and for a subset $I \subset \mathbb{Z}$ of integers let $P_I$ denote the orthogonal projector onto the subspace $H_I$ spanned by $\{X_t\}_{t \in I}$. The $k$-th partial autocorrelation is defined as the coefficient $\phi_{kk}$ in

$$P_{[t-k, t-1]} X_t = \phi_{k1} X_{t-1} + \cdots + \phi_{kk} X_{t-k}.$$

We state our theorem, which shows that the radius of convergence is common for the infinite series with coefficients $\{\pi_n\}$ and with coefficients $\{\phi_{nn}\}$.

Theorem 2.1. Let $\{X_t\}_{t \in \mathbb{Z}}$ be a zero-mean causal and invertible weakly stationary process with its AR($\infty$) representation given by (1), and let $\phi_{nn}$ be the $n$-th partial autocorrelation coefficient. Then

$$\limsup_{n \to \infty} |\phi_{nn}|^{1/n} = \limsup_{n \to \infty} |\pi_n|^{1/n}. \qquad (3)$$
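As a numerical sanity check of (3), not part of the paper, the following self-contained Python sketch may be helpful; the model, parameter values, and truncation lengths are illustrative choices. It computes the AR($\infty$) coefficients $\pi_n$ of a causal and invertible ARMA(1,1) model, obtains the autocovariances from the MA($\infty$) representation, runs the Durbin-Levinson recursion to get the partial autocorrelations $\phi_{nn}$, and compares the $n$-th roots $|\phi_{nn}|^{1/n}$ and $|\pi_n|^{1/n}$ with $|\theta| = 0.7$, the reciprocal of the absolute value of the root of the MA polynomial.

import numpy as np

# Illustrative ARMA(1,1) model: (1 - phi*B) X_t = (1 + theta*B) eps_t,
# causal and invertible since |phi| < 1 and |theta| < 1.
phi, theta, sigma2 = 0.5, 0.7, 1.0
N = 400  # truncation length for the AR(inf)/MA(inf) expansions

# AR(inf) coefficients: pi(B) = (1 - phi*B)/(1 + theta*B), so
# pi_1 = phi + theta and pi_j = -theta * pi_{j-1} for j >= 2.
pi = np.zeros(N + 1)
pi[1] = phi + theta
for j in range(2, N + 1):
    pi[j] = -theta * pi[j - 1]

# MA(inf) coefficients: psi(B) = (1 + theta*B)/(1 - phi*B), so
# psi_0 = 1, psi_1 = phi + theta and psi_j = phi * psi_{j-1} for j >= 2.
psi = np.zeros(2 * N + 1)
psi[0], psi[1] = 1.0, phi + theta
for j in range(2, 2 * N + 1):
    psi[j] = phi * psi[j - 1]

# Autocovariances gamma_k = sigma2 * sum_i psi_i * psi_{i+k} (truncated sum).
gamma = np.array([sigma2 * np.dot(psi[:2 * N + 1 - k], psi[k:]) for k in range(N + 1)])

def durbin_levinson(gamma, nmax):
    """Return the partial autocorrelations phi_nn, n = 1, ..., nmax."""
    pacf = np.zeros(nmax + 1)
    coef = np.zeros(nmax + 1)              # current prediction coefficients phi_{n,j}
    coef[1] = gamma[1] / gamma[0]
    pacf[1] = coef[1]
    v = gamma[0] * (1 - coef[1] ** 2)      # one-step prediction error variance
    for n in range(2, nmax + 1):
        k = (gamma[n] - np.dot(coef[1:n], gamma[1:n][::-1])) / v
        coef[1:n] = coef[1:n] - k * coef[1:n][::-1]
        coef[n] = k
        pacf[n] = k
        v *= 1 - k ** 2
    return pacf

nmax = 30
pacf = durbin_levinson(gamma, nmax)
for n in (10, 20, 30):
    print(n, abs(pacf[n]) ** (1 / n), abs(pi[n]) ** (1 / n))
# Both printed n-th roots approach 0.7 = |theta|, the reciprocal of the
# modulus of the MA root -1/theta, as n grows.

Both printed $n$-th roots settle near 0.7 as $n$ grows, in line with Theorem 2.1; truncating the infinite expansions at a few hundred terms is harmless here because the coefficients decay geometrically.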

By our assumptions, both $\{\pi_n\}$ and $\{\phi_{nn}\}$ are bounded, and hence we have

$$\limsup_{n \to \infty} |\phi_{nn}|^{1/n} \le 1, \qquad \limsup_{n \to \infty} |\pi_n|^{1/n} \le 1.$$

Note that (3) only gives the exponential decay rates of $\pi_n$ and $\phi_{nn}$ and does not distinguish polynomial rates, since $(n^k)^{1/n} \to 1$ as $n \to \infty$ for any power of $n$. Akihiko Inoue and his collaborators provided detailed analyses of the polynomial decay rate of $\phi_{nn}$ for the case of long-memory processes (e.g. [4], [6], [5], [1]).

For proving Theorem 2.1 we present two lemmas.

Lemma 2.2. Suppose $\limsup_{n \to \infty} |\pi_n|^{1/n} < 1$. Then $\limsup_{n \to \infty} |\phi_{nn}|^{1/n} \le \limsup_{n \to \infty} |\pi_n|^{1/n}$.

Proof. Let $\limsup_{n \to \infty} |\pi_n|^{1/n} = c_0 < 1$. Then for every $c \in (c_0, 1)$ there exists $n_0$ such that

$$|\pi_n| < c^n, \quad n \ge n_0.$$

We denote the $h$-period ($h \ge 1$) ahead prediction by

$$P_{[t-k, t-1]} X_{t+h-1} = \phi^{(h)}_{k1} X_{t-1} + \cdots + \phi^{(h)}_{kk} X_{t-k} \qquad (\phi^{(1)}_{kj} = \phi_{kj}).$$

Here $\phi^{(h)}_{k1}$ is the partial regression coefficient of $X_{t-1}$ in regressing $X_{t+h-1}$ on $X_{t-1}, \dots, X_{t-k}$. Hence it can be written as

$$\phi^{(h)}_{k1} = \frac{\mathrm{Cov}\big(P^{\perp}_{[t-k, t-2]} X_{t+h-1},\, P^{\perp}_{[t-k, t-2]} X_{t-1}\big)}{\mathrm{Var}\big(P^{\perp}_{[t-k, t-2]} X_{t-1}\big)},$$

where $P^{\perp}_{[t-k, t-2]}$ is the projector onto the orthogonal complement of $H_{[t-k, t-2]}$. Then $|\phi^{(h)}_{k1}|$ is uniformly bounded from above as

$$|\phi^{(h)}_{k1}| \le \frac{\sqrt{\mathrm{Var}(P^{\perp}_{[t-k, t-2]} X_{t+h-1})\,\mathrm{Var}(P^{\perp}_{[t-k, t-2]} X_{t-1})}}{\mathrm{Var}(P^{\perp}_{[t-k, t-2]} X_{t-1})} = \sqrt{\frac{\mathrm{Var}(P^{\perp}_{[t-k, t-2]} X_{t+h-1})}{\mathrm{Var}(P^{\perp}_{[t-k, t-2]} X_{t-1})}} \le \sqrt{\frac{\mathrm{Var}(X_{t+h-1})}{\mathrm{Var}(P^{\perp}_{(-\infty, t-2]} X_{t-1})}} = \sqrt{\frac{\gamma_0}{\sigma_\epsilon^2}}. \qquad (4)$$

In (1) we apply $P_{[t-k, t-1]}$ to $X_t$. Then

$$\phi_{k1} X_{t-1} + \cdots + \phi_{kk} X_{t-k} = P_{[t-k, t-1]} X_t = P_{[t-k, t-1]} P_{(-\infty, t-1]} X_t = P_{[t-k, t-1]} \Big( \sum_{l=1}^{\infty} \pi_l X_{t-l} \Big) = \pi_1 X_{t-1} + \cdots + \pi_k X_{t-k} + \sum_{l=k+1}^{\infty} \pi_l P_{[t-k, t-1]} X_{t-l}. \qquad (5)$$

Now by the time reversibility of the covariance structure of weakly stationary processes we have

$$P_{[t-k, t-1]} X_{t-k-h} = \phi^{(h)}_{k1} X_{t-k} + \cdots + \phi^{(h)}_{kk} X_{t-1}.$$

By substituting this into (5) and considering the coefficient of $X_{t-k}$ we have

$$\phi_{kk} = \pi_k + \sum_{h=1}^{\infty} \pi_{k+h} \phi^{(h)}_{k1},$$

where the right-hand side converges absolutely under our assumptions. Then

$$|\phi_{kk}| \le |\pi_k| + \sum_{h=1}^{\infty} |\pi_{k+h}| \, |\phi^{(h)}_{k1}|.$$

For $k \ge n_0$, in view of (4), the right-hand side is bounded as

$$|\phi_{kk}| \le c^k \Big( 1 + \sum_{h=1}^{\infty} c^h \sqrt{\gamma_0 / \sigma_\epsilon^2} \Big) = c^k \Big( 1 + \frac{c}{1-c} \sqrt{\gamma_0 / \sigma_\epsilon^2} \Big).$$

Then

$$\limsup_{k \to \infty} |\phi_{kk}|^{1/k} \le c.$$

Since $c > c_0$ was arbitrary, we let $c \downarrow c_0$ and obtain $\limsup_{n \to \infty} |\phi_{nn}|^{1/n} \le c_0 = \limsup_{n \to \infty} |\pi_n|^{1/n}$.

Lemma 2.3. Suppose $\limsup_{n \to \infty} |\phi_{nn}|^{1/n} < 1$. Then $\limsup_{n \to \infty} |\pi_n|^{1/n} \le \limsup_{n \to \infty} |\phi_{nn}|^{1/n}$.

Proof. This follows from the Durbin-Levinson algorithm. Consider $j = n$ in

$$\phi_{n+1, j} = \phi_{n, j} - \phi_{n+1, n+1} \phi_{n, n-j+1}, \quad j = 1, 2, \dots, n. \qquad (6)$$

The initial value is

$$\phi_{n+1, n} = \phi_{n, n} - \phi_{n+1, n+1} \phi_{n, 1}.$$

Using (6) with $n$ replaced by $n+1$ and $j = n$, and substituting the initial value, we obtain

$$\phi_{n+2, n} = \phi_{n+1, n} - \phi_{n+2, n+2} \phi_{n+1, 2} = \phi_{n, n} - \phi_{n+1, n+1} \phi_{n, 1} - \phi_{n+2, n+2} \phi_{n+1, 2}.$$

Repeating the substitution, we have

$$\phi_{n+h, n} = \phi_{n, n} - \phi_{n+1, n+1} \phi_{n, 1} - \cdots - \phi_{n+h, n+h} \phi_{n+h-1, h}.$$

As $h \to \infty$, the left-hand side converges to $\pi_n$ (cf. Theorem 7.14 of [8]). Hence

$$\pi_n = \phi_{n, n} - \sum_{h=1}^{\infty} \phi_{n+h, n+h} \phi_{n+h-1, h}$$

and

$$|\pi_n| \le |\phi_{n, n}| + \sum_{h=1}^{\infty} |\phi_{n+h, n+h}| \, |\phi_{n+h-1, h}|.$$

Now arguing as in (4), we see that $|\phi_{n+h-1, h}|$ is uniformly bounded as

$$|\phi_{n+h-1, h}| \le \sqrt{\frac{\gamma_0}{\mathrm{Var}\big(P^{\perp}_{(-\infty, t-1] \cup [t+1, \infty)} X_t\big)}} = \sqrt{\frac{\gamma_0}{\mathrm{Var}\big(P^{\perp}_{(-\infty, -1] \cup [1, \infty)} X_0\big)}}.$$

Here the denominator is positive because, under our assumptions, $\{X_t\}$ is minimal (cf. Theorem 8.11 of [8], [7, Section 2]). The rest of the proof is the same as in the proof of Lemma 2.2.

By the above two lemmas, Theorem 2.1 is proved as follows.

Proof of Theorem 2.1. As noted after Theorem 2.1, both $\limsup_{n \to \infty} |\phi_{nn}|^{1/n}$ and $\limsup_{n \to \infty} |\pi_n|^{1/n}$ are less than or equal to 1. If $\limsup_{n \to \infty} |\phi_{nn}|^{1/n} < 1$ or $\limsup_{n \to \infty} |\pi_n|^{1/n} < 1$, then by the above two lemmas both of them have to be less than one and they have to be equal. The only remaining case is that both are equal to 1.

Acknowledgment

This research is supported by JSPS Grant-in-Aid for Scientific Research No. 25220001. The author is grateful for useful comments by Tomonari Sei.

References

[1] N. H. Bingham, A. Inoue, and Y. Kasahara. An explicit representation of Verblunsky coefficients. Statist. Probab. Lett., 82(2):403-410, 2012.

[2] G. E. P. Box, G. M. Jenkins, G. C. Reinsel, and G. M. Ljung. Time Series Analysis: Forecasting and Control. Wiley Series in Probability and Statistics. John Wiley & Sons, Inc., Hoboken, NJ, fifth edition, 2015.

[3] P. J. Brockwell and R. A. Davis. Time Series: Theory and Methods. Springer Series in Statistics. Springer-Verlag, New York, second edition, 1991.

[4] A. Inoue. Asymptotic behavior for partial autocorrelation functions of fractional ARIMA processes. Ann. Appl. Probab., 12(4):1471-1491, 2002.

[5] A. Inoue. AR and MA representation of partial autocorrelation functions, with applications. Probab. Theory Related Fields, 140(3-4):523-551, 2008.

[6] A. Inoue and Y. Kasahara. Explicit representation of finite predictor coefficients and its applications. Ann. Statist., 34(2):973-993, 2006.

[7] N. P. Jewell and P. Bloomfield. Canonical correlations of past and future for time series: definitions and theory. Ann. Statist., 11(3):837-847, 1983.

[8] M. Pourahmadi. Foundations of Time Series Analysis and Prediction Theory. Wiley Series in Probability and Statistics: Applied Probability and Statistics. Wiley-Interscience, New York, 2001.