IDENTIFICATION OF PERIODIC AUTOREGRESSIVE MOVING-AVERAGE TIME SERIES MODELS WITH R


Journal of Mathematics and Statistics 10 (3): 358-367, 2014
ISSN: 1549-3644
doi:10.3844/jmssp.2014.358.367 Published Online 10 (3) 2014 (http://www.thescipub.com/jmss.toc)

IDENTIFICATION OF PERIODIC AUTOREGRESSIVE MOVING-AVERAGE TIME SERIES MODELS WITH R

1 Hazem I. El Shekh Ahmed, 2 Raid B. Salha and 3 Diab I. AL-Awar
1 Department of Mathematics, Al-Quds Open University, Gaza, Palestine
2 Department of Mathematics, Islamic University, Gaza, Palestine
3 Department of Mathematics, Al-Aqsa University, Gaza, Palestine
Corresponding Author: Hazem I. El Shekh Ahmed, Department of Mathematics, Al-Quds Open University, Gaza, Palestine

ABSTRACT

Periodic autoregressive moving average (PARMA) processes extend the classical autoregressive moving average (ARMA) process by allowing the parameters to vary with seasons. Model identification is the identification of a possible model based on an available realization, i.e., determining the type of the model with appropriate orders. The Periodic Autocorrelation Function (PeACF) and the Periodic Partial Autocorrelation Function (PePACF) serve as useful indicators of the correlation, or of the dependence between the values of the series, so they play an important role in model identification. The identification is based on the cut-off property of the PeACF. We derive an explicit expression for the asymptotic variance of the sample PeACF, to be used in establishing its bands. This study therefore obtains a new structure for the periodic autocorrelation function which depends directly on the variance derived for establishing its bands for the PMA process over the cut-off region. We study the theoretical side and apply some simulated examples with R, which agree well with the theoretical results.

Keywords: Periodic Models, PARMA Model, Identification, Periodic Autocorrelation Function

1. INTRODUCTION

Many meteorological time series variables (such as rainfall and global temperature) have been found to be nonstationary. The theory and practice of time series analysis have developed rapidly since the appearance in 1970 of the seminal work of (Box et al., 2013).

We know that time series analysis and modeling is an important tool in many areas of our life, such as water resources. It is used for building mathematical models to generate synthetic hydrologic records, to forecast, to determine likelihoods, to detect trends and shifts, and to interpolate missing data and extend hydrologic records. The statistical characteristics of hydrologic series are important deciding factors in the selection of the type of model. For example, in most cases known in nature, river flows have significant periodic behavior in the mean, standard deviation and skewness. In addition to these periodicities, they show a time correlation structure which may be either constant or periodic; for more details see (Anderson and Vecchia, 1993; Bartlett, 1946).

Many macroeconomic time series display a trend and marked seasonal variation, while many variables in finance and marketing display seasonality but no trend. If there is a trend in the data, then often one is interested in examining the nature of this trend, as this can have implications for forecasting and for subsequent model building (Iqelan, 2011).

The fundamental aim of periodic time series analysis is generally two-fold: To understand and identify the stochastic process that produced the observed series and, in turn, to forecast future values of the series from past values alone.

The common procedure in modeling such periodic river flow series is first to standardize or filter the series and then fit an appropriate stationary model to the reduced series; for more details see (Hipel and McLeod, 1994).

Most estimation techniques depend on the assumption that the series is stationary. A special class of nonstationary time series has been defined by (Gladyshev, 1961), called periodically correlated time series (also known as cyclostationary time series). These time series are nonstationary, but have periodic means and covariances.

The major aim of this study is PARMA model identification, i.e., the determination of the seasonally varying orders of the PARMA model, and the development of a practical computer program which performs model identification for any given actual periodically stationary series. An important class of periodic models useful in such situations consists of PARMA models, which are extensions of the commonly used ARMA models that allow periodic parameters. PARMA models explicitly represent the seasonal fluctuations in mean flow, flow standard deviation and flow autocorrelation, resulting in a more realistic time series model that leads to more reliable simulations of natural river flows; for more details see (Ula and Smadi, 2003; Vecchia, 1985). Since PARMA models are quite new, many questions about them are still unanswered and need further study. It is known that general methods exist for the identification of standard ARMA models of mixed type; however, no such satisfactory method is available for PARMA processes. In the application part of this study, PARMA models may be more suitable for some seasons (Table 1). This study involves only the periodically stationary case and the programs work only for periodically stationary processes. There exist well-known methods, like differencing or filtering, to achieve stationarity of standard ARMA models; such methods for achieving periodic stationarity should also be investigated.

Table 1. Behavior of the PeACF and PePACF for PARMA models. The PeACF has nonzero values up to lag q_s but is zero for lags beyond q_s for pure PMA processes, so the order of the process can be decided from the sample PeACF. Likewise, the PePACF has nonzero values up to lag p_s but becomes zero for lags beyond p_s for pure PAR processes, so the order of the process can be decided from the sample PePACF

          PAR(p_s)                 PMA(q_s)                 PARMA(p_s, q_s)
PeACF     Tails off                Cuts off after lag q_s   Tails off
PePACF    Cuts off after lag p_s   Tails off                Tails off

Of specific importance to this study is to establish an identification of a possible model based on an available realization, i.e., to decide the kind of the model, with correct orders, by using two statistical functions widely used for identifying PARMA time series models: the PeACF and the PePACF. We also summarize some properties of the variance which are needed for the assessment of the cut-off property of the seasonal ACF for a season s which follows an MA(q(s)) process, and then apply this on some simulated examples.

2. STEPS FOR MODEL IDENTIFICATION

In time series analysis, the periodic autocorrelation function (PeACF) and the periodic partial autocorrelation function (PePACF) serve as useful indicators of the correlation, or of the dependence between the values of the series, so they play an important role in model identification (Box et al., 2013). The most crucial steps are to identify and build a model based on the available data (Fig. 1).
This requires a good understanding of the processes, particularly of the characteristics of these processes in terms of their PeACF and PePACF. In practice, the PeACF and PePACF are unknown, so they have to be estimated by the sample PeACF and PePACF. Thus, in model identification our goal is to match patterns in the sample PeACF and PePACF with those of the PARMA models; for more details see (William, 2006).

Fig. 1. Summary of the steps for model identification: collect the data, plot the time series, compute and examine the sample PeACF and the sample PePACF of the series and, finally, test the deterministic terms
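Before detailing the individual steps, the workflow of Fig. 1 can be sketched in R with the pear package used later in this study. The code below is an illustration of ours on a built-in series; it assumes that pear's peacf() and pepacf() accept a series plus a maximum lag, as in the package documentation, and exact argument names may differ.

# Sketch of the Fig. 1 workflow on a built-in series, assuming the
# 'pear' package interface (peacf/pepacf taking a series and a max lag).
library(pear)

z <- log(AirPassengers)         # Step 1: variance-stabilizing transformation
plot(z)                         # ... and a time plot of the series

pa  <- peacf(z, lag.max = 16)   # Steps 2-3: sample PeACF, season by season
ppa <- pepacf(z, lag.max = 16)  # ... and sample PePACF, to read off p_s, q_s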

Step 1: Plot the time series data and choose a proper transformation; the most commonly used are the variance-stabilizing transformations
Step 2: Compute and examine the sample PeACF and the sample PePACF of the original series to further confirm a necessary degree of differencing
Step 3: Compute and examine the sample PeACF and the sample PePACF of the properly transformed series to identify the orders p = max{p_1, ..., p_d} and q = max{q_1, ..., q_d} in PARMA_d(p_1, q_1, p_2, q_2, ..., p_d, q_d)
Step 4: Test the deterministic terms \phi_{s,0} and \theta_{s,0} when d > 0. Alternatively, one can include \phi_{s,0} and \theta_{s,0} initially and discard them at the final model estimation if the preliminary estimation result is not significant

These steps will be applied in the simulation section using the R program.

3. PEACF OF PARMA MODELS

We consider the identification of the seasonally varying orders of PARMA time series models by making use of periodic versions of the ACF and PACF: the periodic autocorrelation function (PeACF) and the periodic partial autocorrelation function (PePACF).

Definition 3.1 (Gladyshev, 1961)

We say that the process {X_t} is a periodically correlated time series if:

\mu_t = \mu_{t+d}, \quad \gamma(s,t) = \gamma(s+d, t+d), \quad s, t \in \mathbb{Z}

where d \in \mathbb{Z}^+, \mu_t = E(X_t) < \infty and \gamma(s,t) = E[(X_s - E X_s)(X_t - E X_t)] < \infty.

We will write the time index parameter t as t(s, r) = s + rd, such that s = 1, ..., d refers to the season and r (r = 0, 1, ..., N-1) refers to the year; d is the number of seasons and N is the number of years, with total sample size n = Nd.

Let {X_{s+rd}} be a zero-mean periodically correlated time series of period d = 2, 3, .... The PARMA_d(p_1, q_1, p_2, q_2, ..., p_d, q_d) process has the representation of Equation 3.1:

X_{s+rd} - \sum_{i=1}^{p_s} \phi_{s,i} X_{s+rd-i} = \varepsilon_{s+rd} - \sum_{i=1}^{q_s} \theta_{s,i} \varepsilon_{s+rd-i}   (3.1)

where {\varepsilon_{s+rd}} is an uncorrelated, normally distributed periodic white noise process with mean zero and periodic variances \sigma^2(s), and \phi_{s,i}, \theta_{s,i} are the autoregressive and moving-average coefficients, respectively.

For the univariate periodically stationary PARMA process {X_{s+rd}}, the Periodic Autocovariance Function (PeACVF) is defined as in Equation 3.2:

\gamma_s(h) = E[X_{s+rd} X_{s+rd-h}] =
  \sum_{i=0}^{q_s-h} \theta_{s,i+h}\, \theta_{s-h,i}\, \sigma^2(s-h-i), if 0 \le h \le q_s;
  0, if h > q_s.   (3.2)

where \theta_{s,0} is defined to be -1 for all seasons s. This is similar to the cut-off property of the ACVF of MA processes.

The PeACF \rho_s(h) for season s at lag h is defined as in Equation 3.3:

\rho_s(h) = \frac{\gamma_s(h)}{\sqrt{\gamma_s(0)\, \gamma_{s-h}(0)}} =
  \frac{\sum_{i=0}^{q_s-h} \theta_{s,i+h}\, \theta_{s-h,i}\, \sigma^2(s-h-i)}
       {\sqrt{\sum_{i=0}^{q_s} \theta_{s,i}^2\, \sigma^2(s-i)\; \sum_{i=0}^{q_{s-h}} \theta_{s-h,i}^2\, \sigma^2(s-h-i)}}, if 0 \le h \le q_s;
  0, if h > q_s.   (3.3)

where \gamma_s(0) is the variance for the s-th season. This is an important property of the PeACF: For a season following a pure MA(q_s) process, \rho_s(h) = 0 for h > q_s, which is called the cut-off property of the periodic autocorrelation function of pure PMA processes. This function does not play the same role for PARMA and PAR models.

Definition 3.2

The PeACF of {X_{s+rd}} for season s is defined as in Equation 3.4:

\rho_s(h) = \frac{E[(X_{s+rd} - \mu_s)(X_{s+rd-h} - \mu_{s-h})]}{\sqrt{\gamma_s(0)\, \gamma_{s-h}(0)}} = \frac{\gamma_s(h)}{\sqrt{\gamma_s(0)\, \gamma_{s-h}(0)}} = E[Z_{s+rd} Z_{s+rd-h}], \quad h \ge 0   (3.4)

where \gamma_s(0) is the variance for the s-th season and {Z_{s+rd}} denotes the periodically standardized time series.
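To make the cut-off property of Equations 3.2-3.3 concrete, the following R sketch evaluates the theoretical PeACF of a pure PMA_d process directly from its coefficients. The helper names and the \theta_{s,0} = -1 convention follow the definitions above; the functions themselves are ours, for illustration, and are not code from the paper.

# Theoretical PeACVF/PeACF of a pure PMA_d(q_s) process, per Eq. 3.2-3.3.
# theta is a d x q matrix with theta[s, i] = theta_{s,i}; sigma2[s] = sigma^2(s).
peacvf_pma <- function(theta, sigma2, s, h) {
  d <- length(sigma2); q <- ncol(theta)
  th <- function(s, i) {                    # theta_{s,0} = -1 convention
    s <- ((s - 1) %% d) + 1
    if (i == 0) -1 else if (i > q) 0 else theta[s, i]
  }
  if (h > q) return(0)                      # cut-off region: exactly zero
  sum(sapply(0:(q - h), function(i)
    th(s, i + h) * th(s - h, i) * sigma2[((s - h - i - 1) %% d) + 1]))
}
peacf_pma <- function(theta, sigma2, s, h)  # Eq. 3.3: normalize by variances
  peacvf_pma(theta, sigma2, s, h) /
    sqrt(peacvf_pma(theta, sigma2, s, 0) * peacvf_pma(theta, sigma2, s - h, 0))

# Example: d = 4 seasons, q_s = 1 for all s; rho_s(h) vanishes for h > 1.
theta <- matrix(c(0.9, 0.5, -0.4, 0.7), nrow = 4)
sapply(1:3, function(h) peacf_pma(theta, rep(1, 4), s = 2, h = h))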

We can define a function called peacf which summarizes the steps of the periodic autocorrelation function (PeACF) to identify the seasonally varying orders of PARMA time series models. This function, which implements formula (3.3), is represented in Algorithm 1.

Algorithm 1. peacf of the PeACF for the PARMA_d(0; q_s) process

Input: \sigma^2(s-h-i), \gamma_s(h), s = 1, ..., d
for s <- 1 to d do
  for i <- 0 to q_s - h do
    \rho_s(h) <- \gamma_s(h) / \sqrt{\gamma_s(0)\, \gamma_{s-h}(0)} if 0 \le h \le q_s; 0 if h > q_s  // cut-off
  end
end
Output: \rho_s(h)

As for the ACF of a stationary ARMA model, it can be shown for a periodically stationary PARMA model that \lim_{h \to \infty} \rho_h(s) = 0, i.e., the PeACF of a periodically stationary series goes to zero as the time lag increases (Ula and Smadi, 2003). The cut-off property of \rho_h(s) for an arbitrary season s following the PMA(q(s)) process of Equation 3.5:

X_{s+rd} = \varepsilon_{s+rd} - \sum_{i=1}^{q_s} \theta_{s,i}\, \varepsilon_{s+rd-i}   (3.5)

is that \rho_h(s) = 0 for all h > q(s). This is analogous to the cut-off property of the ACF of an ordinary MA process.

Definition 3.3

The sequence {\psi_{s,i}} is called absolutely summable (i.e., convergent) if \sum_{i=0}^{\infty} |\psi_{s,i}| < \infty.

Definition 3.4

The periodic autoregressive moving-average process {X_{s+rd}} is said to be a causal function of {\varepsilon_{s+rd}} if, for each season s = 1, ..., d, there exist sequences {\psi_{s,i}} such that Equation 3.6 holds:

X_{s+rd} = \sum_{i=0}^{\infty} \psi_{s,i}\, \varepsilon_{s+rd-i}, \quad s = 1, ..., d; \; r = 0, 1, 2, ...   (3.6)

where \psi_{s,0} = 1 and all the values \psi_{s,i} are d-periodic and satisfy \sum_{i=0}^{\infty} |\psi_{s,i}| < \infty for each season s.

Proposition 3.1

Let {X_{s+rd}} be a periodically correlated time series and let the absolutely summable sequences {\psi_{s,i}} be d-periodic. Then the autocovariance function \gamma_s(h) = E[X_{s+rd} X_{s+rd-h}] is given by Equation 3.7:

\gamma(s, s-h) = \sum_{l=-h}^{\infty} \sum_{r=0}^{d-1} \sum_{q=0}^{\infty} \psi_{s,h-l+r+qd}\, \psi_{s-h,r+qd}\, \gamma(s-h-r+l, s-h-r)   (3.7)

Proof

Let {X_{s+rd}} be a zero-mean periodically correlated time series of period d = 2, 3, .... Then:

E[X_{s+rd} X_{s+rd-h}]
= \lim_{n \to \infty} E\left[\left(\sum_{i=0}^{n} \psi_{s,i}\, \varepsilon_{s+rd-i}\right)\left(\sum_{i=0}^{n} \psi_{s-h,i}\, \varepsilon_{s+rd-h-i}\right)\right]
= \lim_{n \to \infty} \sum_{i,j=0}^{n} \psi_{s,i}\, \psi_{s-h,j}\, \gamma(s-i, s-h-j)
= \sum_{i,j=0}^{\infty} \psi_{s,i}\, \psi_{s-h,j}\, \gamma(s-i, s-h-j)

since the autocovariance function of {\varepsilon_{s+rd}} is bounded by max[\gamma(1,1), ..., \gamma(d,d)] and the elementwise product of absolutely summable sequences is absolutely summable. Now change the index i to l = h + j - i to get:

E[X_{s+rd} X_{s+rd-h}] = \sum_{l=-h}^{\infty} \sum_{j=0}^{\infty} \psi_{s,h+j-l}\, \psi_{s-h,j}\, \gamma(s-h-j+l, s-h-j)
= \sum_{l=-h}^{\infty} \sum_{r=0}^{d-1} \sum_{q=0}^{\infty} \psi_{s,h-l+r+qd}\, \psi_{s-h,r+qd}\, \gamma(\upsilon + l - qd, \upsilon - qd)

where j = r + qd and \upsilon = s - h - r. Therefore, by the d-periodicity of \gamma:

E[X_{s+rd} X_{s+rd-h}] = \sum_{l=-h}^{\infty} \sum_{r=0}^{d-1} \sum_{q=0}^{\infty} \psi_{s,h-l+r+qd}\, \psi_{s-h,r+qd}\, \gamma(\upsilon + l, \upsilon)

Corollary 3.1

If {\varepsilon_{s+rd}} in Proposition 3.1 is periodic white noise, \varepsilon_{s+rd} \sim PWN(0, \sigma^2(s)), then Equation 3.8 holds:

\gamma(s, s-h) = \sum_{r=0}^{\infty} \psi_{s,h+r}\, \psi_{s-h,r}\, \sigma^2(s-h-r)   (3.8)
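Corollary 3.1 is easy to sanity-check numerically. The R sketch below, written by us for illustration, simulates a causal process whose \psi-weights vanish beyond lag 1, so that Equation 3.8 with h = 1 collapses to \gamma(s, s-1) = \psi_{s,1}\, \sigma^2(s-1); all numeric values are arbitrary choices of ours.

# Monte-Carlo check of Corollary 3.1 (Eq. 3.8) for a process with
# psi_{s,0} = 1, psi_{s,1} given below and psi_{s,i} = 0 for i > 1.
set.seed(1)
d <- 4; N <- 50000; n <- N * d
season <- rep(1:d, N)
sigma2 <- c(1.0, 2.0, 0.5, 1.5)       # periodic white-noise variances (illustrative)
psi1   <- c(0.9, -0.5, 0.3, 0.7)      # psi_{s,1} for each season (illustrative)
e <- rnorm(n, sd = sqrt(sigma2[season]))
x <- e + psi1[season] * c(0, e[-n])   # X_t = e_t + psi_{s,1} e_{t-1}
for (s in 1:d) {
  idx <- which(season == s)
  idx <- idx[idx > 1]
  emp <- mean(x[idx] * x[idx - 1])               # sample gamma(s, s-1)
  thy <- psi1[s] * sigma2[((s - 2) %% d) + 1]    # Eq. 3.8 with h = 1
  cat(sprintf("s = %d: empirical %.3f vs theoretical %.3f\n", s, emp, thy))
}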

4. PEPACF OF PARMA MODELS

In time series analysis, the periodic partial autocorrelation function is well adapted to the identification of pure PAR processes. Let {X_{s+rd}} be a zero-mean periodically correlated time series of period d = 2, 3, .... The PAR_d(p_1, ..., p_d) process has the representation of Equation 4.1:

X_{s+rd} = \sum_{i=1}^{p_s} \phi_{s,i} X_{s+rd-i} + \varepsilon_{s+rd}   (4.1)

Definition 4.1

The conditional expectation of X_t given X_s, s < t, is denoted by E(X_t | X_s, s < t) and defined as E(X_t | X_s, s < t) = a_1 X_{t-1} + a_2 X_{t-2} + \cdots. The conditional expectation can be replaced by the linear projection. The best linear predictor of X_t from X_s, s < t, is denoted by pred(X_t | X_s, s < t); for more details see (Iqelan, 2011).

Consider the discrete time series X_{s+rd-h}, X_{s+rd-(h-1)}, ..., X_{s+rd-1}, X_{s+rd}, .... Hence, for any h \ge 2, let \hat{X}^{(h-1)}_{s+rd} = Proj(X_{s+rd} | X_{s+rd-1}, ..., X_{s+rd-(h-1)}) be the best forward linear predictor of X_{s+rd} from the intermediate variables X_{s+rd-1} to X_{s+rd-(h-1)}, where the upper index refers to quantities connected to the predictor of X_{s+rd} based on the h-1 observations right before or right after it. So Equation 4.2:

\hat{X}^{(h-1)}_{s+rd} = \sum_{i=1}^{h-1} a_{h-1,i}\, X_{s+rd-(h-i)}   (4.2)

where a_{h-1,i}, i = 1, ..., h-1, are the forward coefficients of the prediction.

Let \hat{\varepsilon}^{(h-1)}_{s+rd} = X_{s+rd} - \hat{X}^{(h-1)}_{s+rd} be the forward residual of X_{s+rd} and define Equation 4.3:

Var(\hat{\varepsilon}^{(h-1)}_{s+rd}) = (\sigma^f_{s+rd}(h-1))^2   (4.3)

We can also obtain the best backward linear predictor by reversing the time index, say \tilde{X}^{(h-1)}_{s+rd-h} = Proj(X_{s+rd-h} | X_{s+rd-(h-1)}, ..., X_{s+rd-1}), the backward linear predictor of X_{s+rd-h} from X_{s+rd-(h-1)} to X_{s+rd-1}. So Equation 4.4:

\tilde{X}^{(h-1)}_{s+rd-h} = \sum_{i=1}^{h-1} b_{h-1,i}\, X_{s+rd-i}   (4.4)

where b_{h-1,i}, i = 1, ..., h-1, are the backward coefficients of the prediction. Also, let \eta^{(h-1)}_{s+rd-h} = X_{s+rd-h} - \tilde{X}^{(h-1)}_{s+rd-h} be the backward residual of X_{s+rd-h} and define Equation 4.5:

Var(\eta^{(h-1)}_{s+rd-h}) = (\sigma^b_{s+rd-h}(h-1))^2   (4.5)

Definition 4.2 (Iqelan, 2011)

The periodic partial autocorrelation function \beta(s, s-h) is the correlation between X_{s+rd} and X_{s+rd-h} with the effect of the intermediate variables X_{s+rd-1}, ..., X_{s+rd-(h-1)} filtered out, which is defined on \mathbb{Z} \times \mathbb{Z} by Equation 4.6:

\beta(s, s-h) =
  Var(X_{s+rd}), if h = 0;
  Corr(X_{s+rd}, X_{s+rd-1}), if h = 1;
  Corr(X_{s+rd} - \hat{X}^{(h-1)}_{s+rd},\, X_{s+rd-h} - \tilde{X}^{(h-1)}_{s+rd-h}), if h \ge 2.   (4.6)

Notice that \beta(s, s) is set to Var(X_{s+rd}) instead of 1 in the above definition. We can also rewrite \beta(s, s-h) at lag h as a function of one variable, as in Equation 4.7:

\beta_s(h) = \beta(s, s-h) = Corr(\hat{\varepsilon}^{(h-1)}_{s+rd}, \eta^{(h-1)}_{s+rd-h}) = \frac{Cov(\hat{\varepsilon}^{(h-1)}_{s+rd}, \eta^{(h-1)}_{s+rd-h})}{\sigma^f_{s+rd}(h-1)\; \sigma^b_{s+rd-h}(h-1)}   (4.7)

Theorem 4.1

1. If {X_{s+rd}} is a causal periodic autoregression of order p_s at season s, then \beta_s(h) = 0 whenever h > p_s.
2. If {X_{s+rd}} is a periodic series with period d, \beta_s(h) = 0 for all h > p_s and \beta_s(p_s) \ne 0 for each season s, then {X_{s+rd}} is a periodic autoregression of order p_s at season s, 1 \le s \le d.

It was mentioned that, for pure PMA processes, the PeACF is zero for lags beyond q_s. Likewise, for pure PAR processes, the PePACF becomes zero for lags beyond p_s, so if the correct order is p_s for season s, then \beta_s(h) = 0 for all h > p_s. This is the cut-off property of the periodic partial autocorrelation function for PAR processes.
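Definition 4.2 translates directly into a few lines of R: for a given season and lag, regress both endpoints on the h-1 intermediate variables and correlate the residuals. The helper below is a sample-based sketch of ours, ignoring edge effects, and is not the paper's implementation.

# Sample analogue of Eq. 4.6-4.7: beta_s(h) as the correlation of forward
# and backward regression residuals (hypothetical helper for illustration).
pepacf_hat <- function(x, d, s, h) {
  n  <- length(x)
  t0 <- seq(s, n, by = d)                  # times t = s + rd for season s
  t0 <- t0[t0 - h >= 1]                    # keep years where all lags exist
  if (h == 0) return(var(x[t0]))
  if (h == 1) return(cor(x[t0], x[t0 - 1]))
  mid <- sapply(1:(h - 1), function(i) x[t0 - i])  # intermediate variables
  ef  <- resid(lm(x[t0] ~ mid))            # forward residuals, Eq. 4.2-4.3
  eb  <- resid(lm(x[t0 - h] ~ mid))        # backward residuals, Eq. 4.4-4.5
  cor(ef, eb)                              # Eq. 4.7
}

For a PAR_d series, Theorem 4.1 predicts that pepacf_hat(x, d, s, h) fluctuates around zero once h exceeds p_s.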

Actually, the PePACF for PMA models behaves much like the PeACF for PAR models and, likewise, the PePACF for PAR models behaves much like the PeACF for PMA models. We can also define a function called pepacf which summarizes the steps of the periodic partial autocorrelation function (PePACF) to identify the seasonally varying orders of PARMA time series models; it implements formula (4.7) and is represented in Algorithm 2.

Algorithm 2. pepacf of the PePACF for the PARMA_d(p_s; 0) process

Input: \sigma^f_{s+rd}(h-1), \sigma^b_{s+rd-h}(h-1), \hat{\varepsilon}^{(h-1)}_{s+rd}, \eta^{(h-1)}_{s+rd-h}, Cov(\hat{\varepsilon}^{(h-1)}_{s+rd}, \eta^{(h-1)}_{s+rd-h})
for s <- 1 to d do
  \beta_s(h) <- Corr(X_{s+rd} - \hat{X}^{(h-1)}_{s+rd},\, X_{s+rd-h} - \tilde{X}^{(h-1)}_{s+rd-h}) if h \le p_s; 0 if h > p_s  // cut-off
end
Output: \beta_s(h)

5. SAMPLE PERIODIC AUTOCORRELATION FUNCTION

Assume that {X_1, X_2, ..., X_{Nd}} is a series of size Nd from a periodically stationary process {X_{s+rd}}. Then \hat{\gamma}_h(s) is the sample periodic autocovariance function, calculated from Equation 5.1:

\hat{\gamma}_h(s) = \frac{1}{N} \sum_{r=0}^{N-1} (X_{s+rd} - \bar{X}_s)(X_{s+rd-h} - \bar{X}_{s-h})   (5.1)

where the sample mean for season s is \bar{X}_s = \frac{1}{N} \sum_{r=0}^{N-1} X_{s+rd}. The sample periodic autocorrelation function is then given by Equation 5.2:

r_h(s) = \frac{\hat{\gamma}_h(s)}{\sqrt{\hat{\gamma}_0(s)\, \hat{\gamma}_0(s-h)}}, \quad h \ge 0   (5.2)

Since \mu_s and \gamma_h(s) are periodic with period d, so are \bar{X}_s and \hat{\gamma}_h(s).

Let the white noise terms be independent and normal, so that {X_{s+rd}} is a Gaussian PARMA process. Then it is proved in (Pagano, 1978) that the \hat{\gamma}_h(s) are consistent, asymptotically independent, jointly normal, unbiased and efficient estimates which converge almost surely to \gamma_h(s) for all s and h. Also, \bar{X}_s is a consistent and unbiased estimator of \mu_s, and it can be shown that it remains consistent under the periodic stationarity assumption. For stationary processes, the asymptotic joint normality and unbiasedness of the sample ACF r_h(s) with d = s = 1 have been shown by (Bartlett, 1946) and the asymptotic variance-covariance matrix has been specified, in which Equation 5.3:

Var[r_h(s)] \approx \frac{1}{N} \sum_{m=-\infty}^{\infty} \{\rho_m^2 + \rho_{m+h}\rho_{m-h} - 4\rho_h \rho_m \rho_{m-h} + 2\rho_m^2 \rho_h^2\}   (5.3)

where the symbol \approx, wherever it appears, means that the statement is true for large N (Box et al., 2013).

In Vecchia (1985) an approximate solution for the first- and second-order moments of r_h(s) was obtained by pretending that the sample means \bar{X}_s and variances \hat{\gamma}_0(s) in r_h(s) are equal to their population counterparts \mu_s and \gamma_0(s), respectively. This assumption will obviously be well justified only for large samples, due to the consistency of these estimators. In that case, we can take r_h(s) as in Equation 5.4:

r_h(s) \approx \frac{1}{N} \sum_{r=0}^{N-1} Z_{s+rd} Z_{s+rd-h}   (5.4)

It then follows that r_h(s) is asymptotically unbiased, with the asymptotic variance of Equation 5.5:

Var[r_h(s)] \approx \frac{1}{N} \sum_{m=-\infty}^{\infty} [\rho_{md}(s)\rho_{md}(s-h) + \rho_{md+h}(s)\rho_{md-h}(s-h)]   (5.5)

which, by utilizing \rho_s(h) = \rho_{s-h}(-h) and the fact that \rho_{md}(s)\rho_{md}(s-h) + \rho_{md+h}(s)\rho_{md-h}(s-h) is an even function of m, reduces to Equation 5.6:

Var[r_h(s)] \approx \frac{1}{N} \left\{1 + \rho_h^2(s) + 2\sum_{m=1}^{\infty} [\rho_{md}(s)\rho_{md}(s-h) + \rho_{md+h}(s)\rho_{md-h}(s-h)]\right\}   (5.6)
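The estimator of Equations 5.1-5.2 is straightforward to compute directly. The R sketch below is a bare-bones version of ours; the pear package's peacf() computes the same quantity together with plots and bands.

# Direct implementation of Eq. 5.1-5.2 (a sketch, for illustration).
sample_peacf <- function(x, d, s, h) {
  n <- length(x); stopifnot(n %% d == 0); N <- n / d
  xbar <- sapply(1:d, function(k) mean(x[seq(k, n, by = d)]))   # seasonal means
  mu   <- function(k) xbar[((k - 1) %% d) + 1]
  g <- function(k, lag) {                     # gamma_hat_lag(k), Eq. 5.1
    t0 <- seq(((k - 1) %% d) + 1, n, by = d)  # times for season k
    t0 <- t0[t0 - lag >= 1]
    sum((x[t0] - mu(k)) * (x[t0 - lag] - mu(k - lag))) / N
  }
  g(s, h) / sqrt(g(s, 0) * g(s - h, 0))       # Eq. 5.2
}
# e.g., r_2(3) for quarterly data: sample_peacf(x, d = 4, s = 3, h = 2)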

When Equations 5.5 and 5.6 are used, only the first two terms are retained in these equations. The remaining terms, which contain the third- and higher-order autocorrelations, disappear for h > q(1) if the first season follows an MA(q(1)) process. Therefore, for the assessment of cut-off in the seasonal autocorrelation, that is, for checking whether a season follows an MA model or not, we use, for example for the first season, that \rho_h(1) = 0 for h > q(1).

In the following proposition, some properties of the variance Var[r_h(s)], which follow from (5.6), are summarized; these are needed for the assessment of the cut-off property of the seasonal ACF for a season s which follows an MA(q(s)) process.

Proposition 5.1 (Ula and Smadi, 2003)

Let {X_{s+rd}} be a periodically stationary PARMA_d(p(s), q(s)) process and let s be an arbitrary season with p(s) = 0. Then, for a positive integer h, we have the following results for Var[r_h(s)]:

(i) For q(s) < d, Equation 5.7:

Var[r_h(s)] \approx
  \frac{1}{N} [1 + \rho_h^2(s)], if h \le q(s);
  \frac{1}{N}, if h > q(s).   (5.7)

(ii) For rd \le q(s) < (r+1)d, r = 1, 2, ...:

Var[r_h(s)] \approx
  \frac{1}{N} \left(1 + \rho_h^2(s) + 2\sum_{m=1}^{r} \{\rho_{md}(s)\rho_{md}(s-h) + \rho_{md+h}(s)\rho_{md-h}(s-h)\}\right), if h \le q(s) - rd;
  \frac{1}{N} \left(1 + \rho_h^2(s) + 2\sum_{m=1}^{r} \rho_{md}(s)\rho_{md}(s-h) + 2\sum_{m=1}^{j} \rho_{md+h}(s)\rho_{md-h}(s-h)\right), if q(s) - (j+1)d < h \le q(s) - jd, j = 1, ..., r-1;
  \frac{1}{N} \left(1 + \rho_h^2(s) + 2\sum_{m=1}^{r} \rho_{md}(s)\rho_{md}(s-h)\right), if q(s) - d < h \le q(s);
  \frac{1}{N} \left(1 + 2\sum_{m=1}^{r} \rho_{md}(s)\rho_{md}(s-h)\right), if h > q(s).

For more details see Ula and Smadi (2003).

Setting d = s = 1, it can easily be seen from Proposition 5.1 that, for h > q(s) = q, case (i) reduces to the white noise result of Equation 5.8:

Var[r_h(s)] = \frac{1}{N}, \quad h \ge 1   (5.8)

Also, case (ii) reduces to the MA(r) result of Equation 5.9:

Var[r_h(s)] \approx \frac{1}{N} \left(1 + 2\sum_{m=1}^{r} \rho_m^2\right), \quad h > r   (5.9)

These are the well-known formulas for the identification of white noise and MA processes, respectively, in the context of stationary processes (Box et al., 2013). However, note that for h \le q(s) the formulas for Var(r_h) in case (ii) of Proposition 5.1 are rather approximate, as they are based on (5.6).

The identification bands for the MA(q(s)) process utilize the fact that r_h(s) is asymptotically normal with zero mean and the variance of Equations 5.8 and 5.9 over the cut-off region h > q(s). Following the same methodology applied to the sample ACF of a stationary process, for a season s which follows an MA(q(s)) process we start checking values of q(s) successively, starting with q(s) = 0. Then, for large N, as long as q(s) < d, Equation 5.8 implies that r_h(s), for h > q(s), is normally distributed with mean zero and variance 1/N, so the 95% band (-1.96/\sqrt{N}, 1.96/\sqrt{N}) is applied to those autocorrelations. If q(s) \ge d, which is unlikely for moderate or large values of d, Equation 5.9 should be utilized, with \rho_h(s) estimated by r_h(s). Letting \hat{s}_h denote the sample value of \sqrt{Var[r_h(s)]}, the 95% band is (-1.96\hat{s}_h, 1.96\hat{s}_h), which should be applied to r_h(s) for h > q(s). The accuracy of these bands is verified through simulation in the next section.
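As a small illustration, the R helper below (ours, not from the paper) returns the half-width of these 95% bands: with no autocorrelations supplied it gives the white-noise band of Equation 5.8, and with r_1(s), ..., r_q(s) supplied it gives the MA-type band of Equation 5.9.

# 95% identification band half-width from Eq. 5.8-5.9 (a sketch).
peacf_band <- function(N, r = numeric(0)) {
  # r: estimated autocorrelations entering Eq. 5.9; an empty r gives
  # the white-noise band 1.96/sqrt(N) of Eq. 5.8.
  1.96 * sqrt((1 + 2 * sum(r^2)) / N)
}
peacf_band(50)                # white-noise band: about 0.277
peacf_band(50, c(0.6, 0.4))   # wider MA-type band via Eq. 5.9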

6. SIMULATION RESULTS

In this section we illustrate some simulation studies to investigate the practical usefulness of the theoretical results stated earlier and to demonstrate the identification procedure. The R programming language was used in this simulation study, in conjunction with the pear package.

Example 6.1

Consider the model PARMA_4(0, 1; 2, 2; 3, 0; 0, 4):

X_{1+4r} = 0.9\, \varepsilon_{4+4(r-1)} + \varepsilon_{1+4r}
X_{2+4r} = 0.7\, X_{1+4r} + 0.8\, X_{4+4(r-1)} + 0.6\, \varepsilon_{1+4r} + 0.\, \varepsilon_{4+4(r-1)} + \varepsilon_{2+4r}
X_{3+4r} = 0.\, X_{2+4r} - 0.6\, X_{1+4r} + 0.5\, X_{4+4(r-1)} + \varepsilon_{3+4r}
X_{4+4r} = 0.5\, \varepsilon_{3+4r} + 0.7\, \varepsilon_{2+4r} + 0.8\, \varepsilon_{1+4r} + 0.3\, \varepsilon_{4+4(r-1)} + \varepsilon_{4+4r}

which is chosen to be periodically stationary and invertible: the first season follows an MA(1), the second season an ARMA(2,2), the third season an AR(3) and the fourth season an MA(4) model. To carry out our simulation study, R code was written to generate independent replicates of sizes N = 25, N = 50 and N = 75. In all cases, the white noise terms are independently and normally distributed with mean zero and variance equal to one. Figure 2 shows one of the simulated series. The white noise terms are assumed to be independently and normally distributed with zero means (our basic assumption) and unit variances (\sigma_a^2(s) = 1; s = 1, 2, 3, 4). Ten thousand realizations, each of length N years, i.e., 4N values, for N = 25, 50, 75, are simulated from the above 4-period PARMA model. For each realization, the sample PeACF r_h(s), for s = 1, 2, 3, 4 and h = 1, ..., 10, is computed; averages are reported in Table 2. Only the first and fourth seasons here follow a pure MA process, MA(1) and MA(4) respectively. For the first season we expect a cut-off behavior of the sample PeACF for h > 1, and for the fourth season for h > 4. This can be observed from the values in Table 2: only for these seasons do we observe an apparent sudden drop in the values; for more analysis see Fig. 3.

Equation 5.8 then applies, and the relative frequencies (rel. freq.) of r_h(1), h > q = 1, going outside the 95% band (-1.96/\sqrt{N}, 1.96/\sqrt{N}), for N = 25, 50, 75, are given in Table 2 and shown in Fig. 4; they agree well with the theoretical asymptotic value of 5%. The relative frequency is the percentage of autocorrelation values, over all realizations, falling outside the corresponding bands. The asymptotic property also justifies the improvement in the values as N increases.

Fig. 2. Line graph of one simulated series

Fig. 3. The PeACF for the three data sizes (N = 25, N = 50 and N = 75, respectively). In season one the PeACF shows a cut-off after lag 1 and, since the PeACF shows a clear cut-off, this season is said to follow an MA(1) model. Also, in season four the PeACF shows a cut-off after lag 4, so this season is said to follow an MA(4) model
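The model of Example 6.1 can be simulated in a few lines of R. In the sketch below, written by us for illustration, the two coefficients that are illegible in the source (season 2's second MA coefficient and season 3's first AR coefficient, both printed as "0." above) are exposed as arguments c22 and c31 with arbitrary placeholder defaults; the rest follows the equations above.

# Simulation sketch of the Example 6.1 PARMA_4(0,1; 2,2; 3,0; 0,4) model.
# c22 and c31 are placeholders for the two coefficients lost in the source.
simulate_parma4 <- function(N, sd = 1, c22 = 0.1, c31 = 0.1) {
  n <- 4 * N + 8                         # 8 warm-up values to damp start-up
  e <- rnorm(n, sd = sd)
  x <- numeric(n)
  for (t in 9:n) {
    s <- ((t - 1) %% 4) + 1              # season of time index t
    x[t] <- switch(s,
      0.9 * e[t - 1] + e[t],                                     # MA(1)
      0.7 * x[t - 1] + 0.8 * x[t - 2] + 0.6 * e[t - 1] +
        c22 * e[t - 2] + e[t],                                   # ARMA(2,2)
      c31 * x[t - 1] - 0.6 * x[t - 2] + 0.5 * x[t - 3] + e[t],   # AR(3)
      0.5 * e[t - 1] + 0.7 * e[t - 2] + 0.8 * e[t - 3] +
        0.3 * e[t - 4] + e[t])                                   # MA(4)
  }
  ts(x[-(1:8)], frequency = 4)           # retained series starts at season 1
}
x <- simulate_parma4(N = 50)             # one realization with N = 50 years
sample_peacf(x, d = 4, s = 1, h = 2)     # via the Section 5 sketch: season 1,
                                         # h = 2 lies in the cut-off region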

Fig. 4. The 95% confidence intervals for the three data sizes (N = 25, N = 50 and N = 75, respectively). It can be said that the first, second, third and fourth seasons (for N = 25) follow a white noise process without any doubt, since all values fall inside the PeACF bands; only for season one is the 1st lag of the PeACF outside the bands and, since that value is close to the band limits, it may be ignorable. The seasons for N = 50 also follow a white noise process, but for seasons one and four the 1st lag of the PeACF may be ignorable. Finally, the seasons for N = 75 follow a white noise process, but for seasons one, two and four the 1st lag of the PeACF may be ignorable

Table 2. Average sample PeACF r_h(s) for lags h = 1, ..., 10 with \sigma_a^2 = 1, by season s = 1, ..., 4 and sample size N = 25, 50, 75, together with the relative frequencies of r_h(s) falling outside the 95% bands. [The numeric entries of this table are not legible in the transcription.]

Table 3. Average sample PeACF r_h(s) for lags h = 1, ..., 10 with \sigma_a = 0.5, by season s = 1, ..., 4. The variance seems to have no significant effect on the overall result, as theoretically expected. [The numeric entries of this table are not legible in the transcription.]

For the fourth season, Equation 5.9 applies instead, with the relative frequencies of r_h(4) going outside the corresponding 95% band reported for the three N values.

Example 6.2

In this example we look at the effect of the variance on the results. We take the same model as in the previous example, PARMA_4(0, 1; 2, 2; 3, 0; 0, 4), with white noise terms that are independently and normally distributed with mean zero and variance equal to 0.25 (any value of the standard deviation could be taken). We will see that the variance seems to have no significant effect on the overall result, as theoretically expected. The simulation is repeated for one sample size, with \sigma_a(s) = 0.5 for s = 1, 2, 3, 4. For the first season we expect a cut-off behavior of the sample PeACF for h > 1, and for the fourth season for h > 4. This can be observed from the values in Table 3.

7. CONCLUSION

In this study we have surveyed one of the most important topics in the identification of PARMA models. An explicit expression for the asymptotic variance of the sample PeACF was derived and used to establish its bands for the PMA process over the cut-off region. We studied the theoretical side and applied it to some simulated examples, where the simulation results agree well with the theoretical results. In future research, we will derive an explicit expression for the asymptotic variance of the sample PePACF, to be used in establishing its bands for the PAR process over the cut-off region, with corresponding applications. We will also address the estimation of PARMA models using several estimation methods.

8. ACKNOWLEDGMENT

The researchers would like to thank the referee for carefully reading the paper and for helpful suggestions which greatly improved the paper.

9. REFERENCES

Anderson, P.L. and A.V. Vecchia, 1993. Asymptotic results for periodic autoregressive moving-average processes. J. Time Series Anal., 14: 1-18.
Bartlett, M.S., 1946. On the theoretical specification and sampling properties of autocorrelated time-series. Suppl. J. Royal Stat. Society, 8: 27-41.
Box, G.E.P., G.M. Jenkins and G.C. Reinsel, 2013. Time Series Analysis: Forecasting and Control. 4th Edn., John Wiley and Sons, Hoboken, pp: 746.
Gladyshev, E.G., 1961. Periodically and almost-periodically correlated random processes with a continuous time parameter. Theory Probab. Applied Math., 8: 173-177.
Hipel, K.W. and A.I. McLeod, 1994. Time Series Modelling of Water Resources and Environmental Systems. Developments in Water Science. 1st Edn., Elsevier, Amsterdam.
Iqelan, B., 2011. Periodically Correlated Time Series: Models and Examples. 1st Edn., Lambert Academic Publishing, Saarbrücken.
Pagano, M., 1978. On periodic and multiple autoregressions. Annals Statist., 6: 1310-1317. DOI: 10.1214/aos/1176344376
Ula, T.A. and A.A. Smadi, 2003. Identification of periodic moving-average models. Commun. Stat. Theory Meth., 32: 2465-2475.
Vecchia, A.V., 1985. Periodic Autoregressive-Moving Average (PARMA) modeling with applications to water resources. J. Am. Water Resources Assoc., 21: 721-730.
William, W.S.W., 2006. Time Series Analysis: Univariate and Multivariate Methods. 1st Edn., Pearson Addison Wesley, Boston, pp: 614.