Seasonal Models and Seasonal Adjustment


LECTURE 10
Seasonal Models and Seasonal Adjustment

D.S.G. POLLOCK: TIME SERIES AND FORECASTING

So far, we have relied upon the method of trigonometrical regression for building models which can be used for forecasting seasonal economic time series. It has proved necessary, invariably, to perform the preliminary task of eliminating a trend from the data before determining the seasonal pattern from the residuals. In most of the cases which we have analysed, the trend has been modelled quite successfully by a simple analytic function such as a quadratic. However, it is not always possible to find an analytic function which serves the purpose. In some cases, a stochastic trend seems to be more appropriate. Such a trend is generated by an autoregressive operator with unit roots.

Once a stochastic unit-root model has been adopted for the trend, it seems natural to model the pattern of seasonal fluctuations in the same manner, by using autoregressive operators with complex-valued roots of unit modulus.

The General Multiplicative Seasonal Model

Let

(1)    z(t) = ∇^d y(t)

be a de-trended series which exhibits seasonal behaviour with a periodicity of s periods. Imagine, for the sake of argument, that the period between successive observations is one month, which means that the seasons have a cycle of s = 12 months. Once the trend has been extracted from the original series y(t) by differencing, we would expect to find a strong relationship between the values of observations taken in the same month of successive years. In the simplest circumstances, we might find that the difference between z_t and z_{t-12} is a small random quantity. If the sequence of the twelve-period differences were white noise, then we should have a relationship of the form

(2)    z(t) = z(t-12) + ε(t)    or, equivalently,    ∇_12 z(t) = ε(t).

This is ostensibly an autoregressive model with an operator in the form of ∇_12 = 1 - L^12. However, it is interesting to note in passing that, if z(t) were

generated by a regression model in the form of

(3)    z(t) = Σ_{j=0}^{6} ρ_j cos(ω_j t - θ_j) + η(t),

where ω_j = πj/6 = 30j°, then we should have

(4)    (1 - L^12) z(t) = η(t) - η(t-12) = ζ(t);

and, if the disturbance sequence η(t) were white noise, then the residual term ζ(t) = η(t) - η(t-12) would show the following pattern of correlation:

(5)    C(ζ_t, ζ_{t-j}) = 2σ²  if j = 0;   -σ²  if j = 12;   0  otherwise.

It can be imagined that a more complicated relationship stretches over the years which connects the months of the calendar. By a simple analogy with the ordinary ARMA model, we can devise a model of the form

(6)    Φ(L^12) ∇_12^D z(t) = Θ(L^12) η(t),

where Φ(z) is a polynomial of degree P and Θ(z) is a polynomial of degree Q. In effect, this model is applied to twelve separate time series, one for each month of the year, whose observations are separated by yearly intervals. If η(t) were a white-noise sequence of independently and identically distributed random variables, then there would be no connection between the twelve time series. If there is a connection between successive months within the year, then there should be a pattern of serial correlation amongst the elements of the disturbance process η(t). One might propose to model this pattern using a second ARMA model of the form

(7)    α(L) η(t) = μ(L) ε(t),

where α(z) is a polynomial of degree p and μ(z) is a polynomial of degree q. The various components of our analysis can now be assembled. By combining equations (1), (6) and (7), we can derive the following general model for the sequence y(t):

(8)    Φ(L^12) α(L) ∇_12^D ∇^d y(t) = Θ(L^12) μ(L) ε(t).

A model of this sort has been described by Box and Jenkins as the general multiplicative seasonal model. To denote such a model in a summary fashion,

they describe it as an ARIMA (P, D, Q) × (p, d, q) model. Although, in the general version of the model, the seasonal difference operator ∇_12 is raised to the power D, it is unusual to find values other than D = 0, 1.

Factorisation of the Seasonal Difference Operator

The equation under (8) should be regarded as a portmanteau in which a collection of simplified models can be placed. The profusion of symbols in equation (8) tends to suggest a model which is too complicated to be of practical use. Moreover, even with ∇_12 in place of ∇_12^D, there is a redundancy in the notation to which we should draw attention. This redundancy arises from the fact that the seasonal difference operator ∇_12 already contains the operator ∇ = 1 - L as one of its factors. Therefore, unless this factor is eliminated, there is a danger that the original sequence y(t) will be subjected, inadvertently, to one more differencing operation than is intended.

The twelve factors of the operator ∇_12 = 1 - L^12 contain the so-called twelfth-order roots of unity, which are the solutions of the algebraic equation 1 = z^12. The factorisation may be demonstrated in three stages. To begin, it is easy to see that

(9)    1 - L^12 = (1 - L)(1 + L + L^2 + ... + L^11)
               = (1 - L)(1 + L^2 + L^4 + ... + L^10)(1 + L).

The next step is to recognise that

(10)    1 + L^2 + L^4 + ... + L^10 = (1 - √3 L + L^2)(1 - L + L^2)(1 + L^2)(1 + L + L^2)(1 + √3 L + L^2).

Finally, it can be seen that the generic quadratic factor has the form of

(11)    1 - 2 cos(ω_j) L + L^2 = (1 - e^{iω_j} L)(1 - e^{-iω_j} L),

where ω_j = πj/6 = 30j°. Figure 1 shows the disposition of the twelfth roots of unity around the unit circle in the complex plane. A cursory inspection of equation (9) indicates that the first-order difference operator ∇ = 1 - L is indeed one of the factors of ∇_12 = 1 - L^12.
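The three-stage factorisation in (9)-(11) can be checked by multiplying the twelve factors back together. The sketch below is my own illustration, not part of the lecture; it assumes numpy, with coefficient vectors listed from the constant term upwards.

```python
# Sketch: multiply the factors of 1 - L^12 and confirm the product.
import numpy as np
from numpy.polynomial import polynomial as P

factors = [
    np.array([1.0, -1.0]),   # (1 - L), for the root z = 1
    np.array([1.0, 1.0]),    # (1 + L), for the root z = -1
]
for j in range(1, 6):        # quadratic factors for the complex roots
    w = np.pi * j / 6        # omega_j = pi * j / 6
    factors.append(np.array([1.0, -2.0 * np.cos(w), 1.0]))

product = np.array([1.0])
for f in factors:
    product = P.polymul(product, f)

target = np.zeros(13)
target[0], target[12] = 1.0, -1.0      # coefficients of 1 - L^12
print(np.allclose(product, target))    # True
```

Note that j = 1 gives cos(π/6) = √3/2, so its quadratic is 1 - √3 L + L², matching the first factor in (10).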
Therefore, if the sequence y(t) has been reduced to stationarity already by the application of d first-order differencing operations, then its subsequent differencing via the operator ∇_12 is unnecessary and is liable to destroy some of the characteristics of the sequence which ought to be captured by the ARIMA model. The factorisation of the seasonal difference operator also helps to explain how the seasonal ARMA model can give rise to seemingly regular cycles of the appropriate duration.
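The point about redundant differencing can be seen numerically: one ordinary difference together with one seasonal difference already removes both a linear trend and a fixed annual cycle. The following sketch uses a toy monthly series of my own construction (numpy assumed); nothing in it comes from the lecture itself.

```python
# Illustrative sketch: applying Nabla (first difference) and then
# Nabla_12 (seasonal difference) to a toy monthly series.
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(240)  # twenty years of monthly observations
# Hypothetical series: linear trend + 12-period seasonal cycle + noise.
y = 0.1 * t + np.sin(2 * np.pi * t / 12) + 0.1 * rng.standard_normal(t.size)

dy = np.diff(y)           # z(t) = Nabla y(t): removes the linear trend
z12 = dy[12:] - dy[:-12]  # Nabla_12 z(t): removes the seasonal cycle

# What remains is a stationary combination of the noise terms.
print(len(z12))  # 227 observations survive the two operations
```

Applying ∇_12 a second time here would difference the noise again, which is exactly the over-differencing the text warns against.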

[Figure 1. The 12th roots of unity inscribed in the unit circle.]

Consider a simple second-order autoregressive model with complex-valued roots of unit modulus:

(12)    {1 - 2 cos(ω_j) L + L^2} y_j(t) = ε_j(t).

Such a model can give rise to quite regular cycles whose average duration is 2π/ω_j periods. The graph of the sequence generated by a model with ω_j = ω_1 = π/6 = 30° is given in Figure 2.

Now consider generating the full set of stochastic sequences y_j(t) for j = 1, ..., 5. Also included in this set should be the sequences y_0(t) and y_6(t) generated by the first-order equations

(13)    (1 - L) y_0(t) = ε_0(t)    and    (1 + L) y_6(t) = ε_6(t).

These sequences, which resemble trigonometrical functions, will be harmonically related in the manner of the trigonometrical functions comprised by equation (3), which also provides a model for a seasonal time series. It follows that a good representation of a seasonal economic time series can be obtained by taking a weighted combination of the stochastic sequences.

For simplicity, imagine that the white-noise sequences ε_j(t); j = 0, ..., 6 are mutually independent and that their variances can take a variety of values. Then the sum of the stochastic sequences will be given by

(14)    y(t) = Σ_{j=0}^{6} y_j(t)
             = ε_0(t)/(1 - L) + Σ_{j=1}^{5} ε_j(t)/(1 - 2 cos(ω_j) L + L^2) + ε_6(t)/(1 + L).
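Equations (12)-(14) can be illustrated by simulating each component y_j(t) recursively and summing them. This is a sketch only: the noise variances are arbitrary values of my own choosing, since the lecture specifies none.

```python
# Sketch: a seasonal series built from harmonically related unit-root
# components, as in equations (12)-(14).
import numpy as np

rng = np.random.default_rng(2)
n = 120
y = np.zeros(n)

# j = 0: (1 - L) y_0(t) = e_0(t) -- a random walk.
y += np.cumsum(0.1 * rng.standard_normal(n))

# j = 6: (1 + L) y_6(t) = e_6(t) -- an alternating two-period cycle.
e6 = 0.1 * rng.standard_normal(n)
y6 = np.zeros(n)
for t in range(n):
    y6[t] = e6[t] - (y6[t - 1] if t > 0 else 0.0)
y += y6

# j = 1, ..., 5: {1 - 2 cos(w_j) L + L^2} y_j(t) = e_j(t).
for j in range(1, 6):
    w = np.pi * j / 6
    ej = 0.1 * rng.standard_normal(n)
    yj = np.zeros(n)
    for t in range(n):
        yj[t] = ej[t]
        if t >= 1:
            yj[t] += 2.0 * np.cos(w) * yj[t - 1]
        if t >= 2:
            yj[t] -= yj[t - 2]
    y += yj

print(y.shape)  # (120,)
```

Each AR(2) recursion implements yj(t) = 2 cos(ω_j) yj(t-1) - yj(t-2) + ε_j(t), which is equation (12) rearranged.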

[Figure 2. The graph of 84 observations on a simulated series generated by the AR(2) process (1 - 1.732 L + L^2) y(t) = ε(t).]

The terms on the RHS of this expression can be combined. Their common denominator is simply the operator ∇_12 = 1 - L^12. The numerator is a sum of 7 mutually independent moving-average processes, each with an order of 10 or 11. This amounts to an MA(11) process, which can be denoted by η(t) = θ(L) ε(t). Thus the combination of the harmonically related unit-root AR(2) processes gives rise to a seasonal process in the form of

(15)    y(t) = θ(L) ε(t) / (1 - L^12)    or, equivalently,    ∇_12 y(t) = θ(L) ε(t).

The equation of this model is contained within the portmanteau equation of the general multiplicative model given under (8). However, although it represents a simplification of the general model, it still contains a number of parameters which is liable to prove excessive. A typical model, which contains only a few parameters, is the ARIMA (0, 1, 1) × (0, 1, 1) model which Box and Jenkins fitted to the logarithms of the AIRPASS data. If z(t) = ∇y(t) denotes the first difference of the logarithms of the data series, then the AIRPASS model takes the form of

(16)    (1 - L^12) z(t) = (1 - θL^12)(1 - μL) ε(t).
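Model (16) can be simulated directly. In this sketch, θ and μ are arbitrary values of my own choosing; the moving-average operator (1 - θL^12)(1 - μL) is multiplied out as 1 - μL - θL^12 + θμL^13.

```python
# Sketch: simulating the 'airline' model
# (1 - L^12) z(t) = (1 - theta L^12)(1 - mu L) e(t), with z = Nabla y.
import numpy as np

rng = np.random.default_rng(3)
n, theta, mu = 240, 0.5, 0.4
e = rng.standard_normal(n)

# Right-hand side: the multiplied-out moving-average operator.
rhs = e.copy()
rhs[1:] -= mu * e[:-1]
rhs[12:] -= theta * e[:-12]
rhs[13:] += theta * mu * e[:-13]

# Invert the seasonal difference: z(t) = z(t - 12) + rhs(t).
z = rhs.copy()
for t in range(12, n):
    z[t] += z[t - 12]

y = np.cumsum(z)   # undo the first difference, since z = Nabla y
print(y.shape)  # (240,)
```

The two loops mirror the two unit-root operators: the recursion inverts (1 - L^12), and the cumulative sum inverts (1 - L).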

Forecasting with Unit-Root Seasonal Models

Although their appearances are superficially similar, the seasonal economic series and the series generated by equations such as (16) are, fundamentally, of very different natures. In the case of the series generated by a unit-root stochastic difference equation, there is no bound, in the long run, on the amplitude of the cycles. Also, there is a tendency for the phases of the cycles to drift without limit. If the latter were a feature of the monthly time series of consumer expenditures, for example, then we could not expect the annual boom in sales to occur at a definite time of the year. In fact, it occurs invariably at Christmas time.

The advantage of unit-root seasonal models does not lie in the realism with which they describe the processes which generate the economic data series. For that purpose, the trigonometrical model seems more appropriate. Their advantage lies, instead, in their ability to forecast the seasonal series.

The simplest of the seasonal unit-root models is the one which is specified by equation (2). This is a twelfth-order difference equation with a white-noise forcing function. In generating forecasts from the model, we need only replace the elements of ε(t) which lie in the future by their zero-valued expectations. Then the forecasts may be obtained iteratively from a homogeneous difference equation in which the initial conditions are simply the values of y(t) observed over the preceding twelve months. In effect, we observe the most recent annual cycle and we extrapolate its form exactly, year-in year-out, into the indefinite future.

A somewhat different forecasting rule is associated with the model defined by the equation

(17)    (1 - L^12) y(t) = (1 - θL^12) ε(t).

This equation is analogous to the simple IMA(1, 1) equation in the form of

(18)    (1 - L) y(t) = (1 - θL) ε(t),

which was considered at the beginning of the course.
The latter equation was obtained by combining a first-order random walk with a white-noise error of observation. The two equations, whose combination gives rise to (18), are

(19)    ξ(t) = ξ(t-1) + ν(t),
        y(t) = ξ(t) + η(t),

wherein ν(t) and η(t) are generated by two mutually independent white-noise processes.
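The iterative forecasting rule for equation (2) — replace future disturbances by their zero expectations and run the homogeneous difference equation forward from the last twelve observations — can be sketched as follows. The function name and toy data are my own, introduced only for illustration.

```python
# Sketch: forecasting the seasonal random walk y(t) = y(t-12) + e(t)
# by repeating the most recent annual cycle indefinitely.
import numpy as np

def seasonal_rw_forecast(y, steps, s=12):
    """Iterate y_hat(t) = y_hat(t - s) forward from the observed series."""
    history = list(y)
    for _ in range(steps):
        history.append(history[-s])   # the value from one year earlier
    return np.array(history[len(y):])

y = np.arange(36) % 12                # three identical annual cycles
fc = seasonal_rw_forecast(y, steps=24)
print(np.array_equal(fc[:12], fc[12:]))  # True: each forecast year repeats
```

As the text says, the most recent annual cycle is extrapolated exactly, year-in year-out, into the indefinite future.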

[Figure 3. The sample trajectory and the forecast function of the nonstationary 12th-order process y(t) = y(t-12) + ε(t).]

Equation (17), which represents the seasonal model which was used by Box and Jenkins, is generated by combining the following equations, which are analogous to those under (19):

(20)    ξ(t) = ξ(t-12) + ν(t),
        y(t) = ξ(t) + η(t).

Here ν(t) and η(t) continue to represent a pair of independent white-noise processes.

The procedure for forecasting the IMA model consisted of extrapolating into the indefinite future a constant value ŷ_{t+1|t}, which represents the one-step-ahead forecast made at time t. The forecast itself was obtained from a geometrically weighted combination of all past values of the sequence y(t), which represent erroneous observations on the random-walk process ξ(t).

The forecasts for the seasonal model of (17) are obtained by extrapolating a so-called annual reference cycle into the future, so that it applies in every successive year. The reference cycle is constructed by taking a geometrically weighted combination of all past annual cycles. The analogy with the IMA model is perfect!

It is interesting to compare the forecast function of a stochastic unit-root seasonal model of (17) with the forecast function of the corresponding trigonometrical model represented by (3). In the latter case, the forecast function

depends upon a reference cycle which is the average of all of the annual cycles which are represented by the data set from which the regression parameters have been computed. The stochastic model seems to have the advantage that, in forming its average of previous annual cycles, it gives more weight to recent years. However, it is not difficult to contrive a regression model which has the same feature.
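A geometrically weighted combination of past annual cycles can be sketched as exponential smoothing applied to each calendar month across the years. This is my own interpretive sketch of the reference cycle described above, with a hypothetical smoothing weight theta; the lecture does not prescribe this exact recursion.

```python
# Sketch: a reference cycle formed as a geometrically weighted
# combination of past annual cycles (month-by-month smoothing).
import numpy as np

def reference_cycle(y, s=12, theta=0.5):
    """Smooth each calendar month across the years with weight theta."""
    y = np.asarray(y, dtype=float)
    cycle = y[:s].copy()                       # start from the first year
    for start in range(s, len(y) - s + 1, s):
        cycle = theta * cycle + (1.0 - theta) * y[start:start + s]
    return cycle

y = np.tile(np.arange(12.0), 5)                # five identical annual cycles
print(np.allclose(reference_cycle(y), np.arange(12.0)))  # True
```

Because recent years carry the weight (1 - theta), (1 - theta)theta, (1 - theta)theta², ..., the cycle gives more weight to recent years, which is exactly the advantage claimed for the stochastic model over the plain regression average.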