Applied Mathematical Sciences, Vol. 5, 2011, no. 4, 193-202

A New Iterative Procedure for Estimation of RCA Parameters Based on Estimating Functions

Norli Anida Abdullah, Ibrahim Mohamed¹, Shelton Peiris² and Nor Azlinna Azizan

Institute of Mathematical Sciences, University of Malaya, 50603 Kuala Lumpur, Malaysia

Abstract

This paper considers a novel approach to the estimation of random coefficient autoregressive models using an iterative (IT) procedure based on estimating functions (EF). We compare the performance of these new estimates with the traditional least squares (LS) and standard EF estimates via a large-scale simulation study. It is shown that the proposed IT procedure performs better than the other two methods.

Keywords: Estimating function; Least squares; Random coefficient autoregressive model; Iterative method; Estimation; Bias; Standard error

1 Introduction

Random coefficient autoregressive (RCA) models have attracted growing interest in recent years due to their wide range of applications in biology, economics, meteorology and, recently, in finance. See, for example, Nicholls and Quinn (1982) and references therein, Bera et al. (1997), Rahiala (1999), Wang and Ghosh (2002) and Lee et al. (2006) for details and various applications of RCA models. In estimating the parameters of RCA models, Nicholls and Quinn (1981) employed the traditional least squares (LS) and maximum likelihood (ML) methods, while Wang and Ghosh (2002) used a Bayesian approach. Another method that has been considered is the theory of estimating functions, originally proposed by Godambe (1985). Thavaneswaran and Abraham (1988)

¹ Corresponding author: imohamed@um.edu.my
² Visiting from the University of Sydney, Australia, e-mail: shelton.peiris@sydney.edu.au
applied this estimating function (EF) theory to nonlinear time series estimation problems, and Thavaneswaran and Peiris (1996) extended these results further with applications. For an RCA model, estimation using the EF technique is equivalent to the weighted least squares estimator, which has been discussed by Hwang and Basawa (1998) and Chandra and Taniguchi (2001).

This paper considers RCA models and proposes a new iterative (IT) procedure for estimating the AR parameter. Unlike the EF method, this algorithm can be used to estimate the variances of the random variables involved in addition to the AR parameter θ. We finally compare the performance of the IT method with the traditional LS and standard EF methods. With that view in mind, this paper is organized as follows: Section 2 describes parameter estimation for an RCA(p) model using the LS, EF and IT methods. Section 3 compares the performance of these three estimates through a simulation study. Finally, a conclusion is given in Section 4 to summarize the work of this paper.

2 Random Coefficient Autoregressive Model of Order p

The general class of random coefficient autoregressive models of order p, RCA(p), is given by

    y_t = Σ_{i=1}^{p} [θ_i + b_i(t)] y_{t-i} + ε_t,    (2.1)

where θ = (θ_1, θ_2, ..., θ_p)' is a p × 1 vector of constants, {ε_t} is a sequence of iid (independent and identically distributed) random variables with mean zero and variance σ²_ε, b(t) = (b_1(t), ..., b_p(t))' is a sequence of iid random vectors with mean zero and E[b(t)b'(t)] = Σ = diag(σ²_{b_1}, σ²_{b_2}, ..., σ²_{b_p}) for each t, {b(t)} is independent of {ε_t} for each t, and θ'θ + tr(Σ) < 1 to ensure stability and ergodicity. It can be seen that if the elements of Σ are small compared with those of θ, then {y_t} would be expected to resemble realizations of a constant coefficient autoregression. See, for example, Nicholls and Quinn (1982) and references therein for details.
It is easy to see that

    y_t = Σ_{i=1}^{p} θ_i y_{t-i} + Σ_{i=1}^{p} b_i(t) y_{t-i} + ε_t
        = Y'_{t-1} θ + u_t,    (2.2)

where Y_{t-1} = (y_{t-1}, y_{t-2}, ..., y_{t-p})' and u_t = Σ_{i=1}^{p} b_i(t) y_{t-i} + ε_t.
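As a concrete illustration, the p = 1 case of model (2.1) can be simulated directly from its recursion. The sketch below assumes Gaussian b_t and ε_t, as in the simulation design of Section 3; the function name and the burn-in default are ours.

```python
import numpy as np

def simulate_rca1(n, theta, sigma2_b, sigma2_eps, burn_in=200, seed=None):
    """Simulate y_t = (theta + b_t) y_{t-1} + eps_t, the RCA(1) case of (2.1).

    b_t ~ N(0, sigma2_b) and eps_t ~ N(0, sigma2_eps) are independent;
    theta**2 + sigma2_b < 1 is assumed for stability and ergodicity.
    """
    rng = np.random.default_rng(seed)
    m = n + burn_in
    b = rng.normal(0.0, np.sqrt(sigma2_b), m)      # random coefficient part
    eps = rng.normal(0.0, np.sqrt(sigma2_eps), m)  # innovation part
    y = np.zeros(m)
    for t in range(1, m):
        y[t] = (theta + b[t]) * y[t - 1] + eps[t]
    return y[burn_in:]  # drop the burn-in so the y_0 = 0 start is forgotten
```

When θ² + σ²_b is close to 1 the simulated series shows occasional bursts of large values, which is what makes the state-dependent weighting in the EF and IT estimators below matter.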
Given a sample of n observations y_1, y_2, ..., y_n, where n >> p, the least squares estimate θ̂_LS of θ is given by

    θ̂_LS = (Σ_t Y_{t-1} Y'_{t-1})^{-1} Σ_t Y_{t-1} y_t.    (2.3)

Having estimated θ, we may then estimate σ²_{b_i} for all i and σ²_ε by considering equation (2.2), such that

    û_{t,LS} = y_t − Y'_{t-1} θ̂_LS.

Let η_t = û²_{t,LS} − σ²_ε − Σ_{i=1}^{p} σ²_{b_i} y²_{t-i}. Nicholls and Quinn (1981) showed that the least squares estimates of σ²_{b_i} for all i and σ²_ε can be obtained by regressing û²_{t,LS} on 1 and y²_{t-i}, which reduces to a standard regression problem

    y = Xβ + η,    (2.4)

where y = (û²_{1,LS}, ..., û²_{n,LS})', the rows of X are (1, y²_{t-1}, ..., y²_{t-p}), β = (σ²_ε, σ²_{b_1}, ..., σ²_{b_p})' and η = (η_1, ..., η_n)'. Hence, the least squares solution is

    β̂ = (X'X)^{-1} X'y.    (2.5)

On the other hand, Thavaneswaran and Abraham (1988) obtained the corresponding EF estimate θ̂_EF of θ (by solving the optimal estimating equations), given by

    θ̂_EF = (Σ_t Y_{t-1} Y'_{t-1}/W_t)^{-1} Σ_t Y_{t-1} y_t/W_t,    (2.6)

where W_t = σ²_ε + Σ_{i=1}^{p} y²_{t-i} σ²_{b_i}. In practice, σ²_{b_i} and σ²_ε in (2.6) are replaced by the corresponding LS estimates obtained from (2.5). Now we look at an alternative estimation procedure.
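To make the two estimators concrete, here is a NumPy sketch of (2.3)-(2.6) for a general RCA(p) series. The helper names are ours, and negative variance estimates from the regression (2.5), which can occur in small samples, are not corrected here.

```python
import numpy as np

def lagged_design(y, p):
    """Rows are Y'_{t-1} = (y_{t-1}, ..., y_{t-p}) for t = p+1, ..., n."""
    n = len(y)
    Y = np.column_stack([y[p - 1 - j: n - 1 - j] for j in range(p)])
    return Y, y[p:]

def ls_estimates(y, p=1):
    """theta-hat from (2.3) and (sigma2_eps, sigma2_b_i) from (2.4)-(2.5)."""
    Y, yt = lagged_design(y, p)
    theta = np.linalg.lstsq(Y, yt, rcond=None)[0]
    u2 = (yt - Y @ theta) ** 2                      # squared LS residuals
    X = np.column_stack([np.ones_like(u2), Y ** 2]) # regress u^2 on 1, y^2 lags
    beta = np.linalg.lstsq(X, u2, rcond=None)[0]    # (sigma2_eps, sigma2_b_1..p)
    return theta, beta[0], beta[1:]

def ef_estimate(y, sigma2_eps, sigma2_b, p=1):
    """Weighted LS / optimal EF estimate (2.6), weights W_t from the variances."""
    Y, yt = lagged_design(y, p)
    W = sigma2_eps + (Y ** 2) @ np.atleast_1d(sigma2_b)
    Yw = Y / W[:, None]
    return np.linalg.solve(Y.T @ Yw, Yw.T @ yt)
```

The EF estimate is exactly a weighted least squares fit: observations with large lagged values, where the conditional variance W_t is inflated by the random coefficient, are down-weighted.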
2.1 An Iterative Estimation (IT) Method

It is clear that the evaluation of (2.6) depends on (2.5), and therefore we suggest the following efficient iterative algorithm to estimate θ in (2.1). Since the LS estimates are used to evaluate the estimate in (2.6), we now show that the quality of the estimate θ̂_EF can be improved as follows. Let θ^(k)_IT be an estimate of θ based on n observations at step k, k = 1, 2, ....

1. Take θ^(0)_IT = θ̂_LS from (2.3), while σ²_ε,IT^(0) = σ̂²_ε,LS and σ²_{b_i,IT}^(0) = σ̂²_{b_i,LS}, i = 1, 2, ..., p, from (2.5). Let u^(0)_{t,IT} = û_{t,LS}.

2. Now obtain the following values for k = 1, 2, ...:

    W_t^(k-1) = σ²_ε,IT^(k-1) + Σ_{i=1}^{p} y²_{t-i} σ²_{b_i,IT}^(k-1),    (2.7)

    θ^(k)_IT = (Σ_t Y_{t-1} Y'_{t-1}/W_t^(k-1))^{-1} Σ_t Y_{t-1} y_t/W_t^(k-1),    (2.8)

    u^(k)_{t,IT} = y_t − θ'^(k)_IT Y_{t-1},    (2.9)

    σ²_ε,IT^(k) = β̂^(k)_{σ²_ε},    (2.10)

    σ²_{b_i,IT}^(k) = β̂^(k)_{σ²_{b_i}}, i = 1, 2, ..., p,    (2.11)

where β̂^(k)_{σ²_ε} refers to the first entry of β̂ at step k in equation (2.4); similarly for β̂^(k)_{σ²_{b_i}}, i = 1, 2, ..., p.

3. The process in step 2 continues until all of θ^(k)_IT, σ²_ε,IT^(k) and σ²_{b_i,IT}^(k), i = 1, 2, ..., p, converge within a pre-specified tolerance.

Section 3 considers a large-scale simulation study to investigate the performance of this suggested IT procedure in the simplest case of the RCA(1) model.
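The three steps above can be sketched as follows. The function name, the iteration cap, and the small positive floor on the weights W_t (a guard against negative variance estimates in finite samples) are our additions.

```python
import numpy as np

def iterative_fit(y, p=1, tol=1e-6, max_iter=200):
    """IT procedure of Section 2.1: alternate the weighted-LS update (2.8)
    for theta with the variance regression (2.10)-(2.11), from LS starts."""
    n = len(y)
    Y = np.column_stack([y[p - 1 - j: n - 1 - j] for j in range(p)])
    yt = y[p:]
    X = np.column_stack([np.ones(len(yt)), Y ** 2])

    # Step 1: LS initial values, equations (2.3)-(2.5).
    theta = np.linalg.lstsq(Y, yt, rcond=None)[0]
    beta = np.linalg.lstsq(X, (yt - Y @ theta) ** 2, rcond=None)[0]

    # Step 2: iterate (2.7)-(2.11) until Step 3's convergence check passes.
    for _ in range(max_iter):
        W = beta[0] + (Y ** 2) @ beta[1:]                    # (2.7)
        W = np.maximum(W, 1e-8)  # our guard: variance estimates can go negative
        Yw = Y / W[:, None]
        theta_new = np.linalg.solve(Y.T @ Yw, Yw.T @ yt)     # (2.8)
        beta_new = np.linalg.lstsq(X, (yt - Y @ theta_new) ** 2,
                                   rcond=None)[0]            # (2.9)-(2.11)
        done = (np.abs(theta_new - theta).max() < tol
                and np.abs(beta_new - beta).max() < tol)
        theta, beta = theta_new, beta_new
        if done:
            break
    return theta, beta[0], beta[1:]
```

Unlike a single EF pass, each iteration refreshes both the weights and the variance estimates, which is where the improvement reported in Section 3 comes from.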
3 Simulation

This section reports a simulation study investigating the performance of the IT method compared to the EF and LS methods. In the case p = 1 in (2.1), the corresponding LS estimates are

    θ̂_LS = Σ_{t=2}^{n} y_t y_{t-1} / Σ_{t=2}^{n} y²_{t-1},    (3.1)

    σ̂²_{b,LS} = Σ_{t=2}^{n} û²_{t,LS}(y²_{t-1} − z̄) / Σ_{t=2}^{n} (y²_{t-1} − z̄)²,    (3.2)

    σ̂²_{ε,LS} = Σ_{t=2}^{n} û²_{t,LS}/(n−1) − σ̂²_{b,LS} z̄,    (3.3)

where z̄ = Σ_{t=2}^{n} y²_{t-1}/(n−1) and û_{t,LS} = y_t − θ̂_LS y_{t-1}. The EF estimate is given by

    θ̂_EF = Σ_{t=2}^{n} [y_t y_{t-1}/(σ̂²_ε + σ̂²_b y²_{t-1})] / Σ_{t=2}^{n} [y²_{t-1}/(σ̂²_ε + σ̂²_b y²_{t-1})].    (3.4)

The IT estimates are

    θ^(k)_IT = Σ_{t=2}^{n} [y_t y_{t-1}/(σ²_ε,IT^(k-1) + σ²_b,IT^(k-1) y²_{t-1})] / Σ_{t=2}^{n} [y²_{t-1}/(σ²_ε,IT^(k-1) + σ²_b,IT^(k-1) y²_{t-1})],    (3.5)

    u^(k)_{t,IT} = y_t − θ^(k)_IT y_{t-1},    (3.6)

    σ²_b,IT^(k) = Σ_{t=2}^{n} û^(k)²_{t,IT}(y²_{t-1} − z̄) / Σ_{t=2}^{n} (y²_{t-1} − z̄)²,    (3.7)

    σ²_ε,IT^(k) = Σ_{t=2}^{n} û^(k)²_{t,IT}/(n−1) − σ²_b,IT^(k) z̄.    (3.8)

Below are the steps taken in our study.
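For p = 1 the matrix expressions collapse to scalars, and (3.1)-(3.4) can be coded directly; the function names are ours.

```python
import numpy as np

def rca1_ls(y):
    """Closed-form LS estimates (3.1)-(3.3) for an RCA(1) series y_1, ..., y_n."""
    y0, y1 = y[:-1], y[1:]                 # y_{t-1} and y_t for t = 2, ..., n
    theta = np.sum(y1 * y0) / np.sum(y0 ** 2)                            # (3.1)
    u2 = (y1 - theta * y0) ** 2            # squared residuals u-hat_{t,LS}^2
    zbar = np.mean(y0 ** 2)                # z-bar, mean of squared lags
    s2b = np.sum(u2 * (y0 ** 2 - zbar)) / np.sum((y0 ** 2 - zbar) ** 2)  # (3.2)
    s2e = np.mean(u2) - s2b * zbar                                       # (3.3)
    return theta, s2b, s2e

def rca1_ef(y, s2e, s2b):
    """EF estimate (3.4) with plugged-in variance estimates."""
    y0, y1 = y[:-1], y[1:]
    w = s2e + s2b * y0 ** 2                # weights W_t
    return np.sum(y1 * y0 / w) / np.sum(y0 ** 2 / w)
```

Equation (3.2) is just the slope of the simple regression of û²_{t,LS} on y²_{t-1}, and (3.3) is the corresponding intercept, matching the general regression (2.4)-(2.5).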
1. Generate RCA(1) series of lengths N = 250, 700, 1200 with known parameter values of θ, σ²_b and σ²_ε, assuming ε_t and b_t follow normal distributions with mean 0 and variances 1 and σ²_b (known) respectively. The initial value y_0 is chosen to be 0, and the first 200 values of each generated series are discarded to remove the initial value effect, giving effective lengths n = 50, 500, 1000 for the series used in estimation.

2. Estimate θ, σ²_b and σ²_ε using the LS, EF and IT methods.

3. Repeat steps (1) and (2) s times, so that we have 3s estimates for each method of estimation.

4. Let ξ = (θ, σ²_b, σ²_ε) = (ξ_1, ξ_2, ξ_3) and let ξ̂^(j)_i be an estimate of ξ_i at replication j = 1, ..., s. The following quantities are computed for each i = 1, 2, 3:

    Mean: ξ̄_i = (1/s) Σ_{j=1}^{s} ξ̂^(j)_i,

    Bias: ξ̄_i − ξ_i,

    Root mean square error (RMSE): sqrt{ (1/s) Σ_{j=1}^{s} [ξ̂^(j)_i − ξ_i]² },    (3.9)

    Standard error (SE): sqrt{ (1/(s−1)) Σ_{j=1}^{s} [ξ̂^(j)_i − ξ̄_i]² }.    (3.10)

5. Throughout our simulation, we fixed the tolerance of the IT method at 10^{-6} and the number of replications at s = 1,000.

Section 3.1 below reviews the criteria for comparison of the LS, EF and IT methods.

3.1 Model Comparison

Several criteria are available for comparing the fits given by the LS, EF and IT estimates on time series data, such as Akaike's information criterion, the Bayesian information criterion and the Schwarz criterion. We use Akaike's information criterion (AIC), given by

    AIC = −2 ln L(θ̂) + 2p,    (3.11)
where ln L(θ̂) is the maximized log-likelihood and p is the number of parameters to be estimated. For the RCA model, the log-likelihood function is given by Nicholls and Quinn (1981).

3.2 The Results

Table 1 gives the simulation results for the true parameter values θ = 0.1 and σ²_b = 0.16. For Table 2, we choose the true parameter values θ = −0.3 and σ²_b = 0.5. In Table 3, the true parameter values θ = 0.8 and σ²_b = 0.3 are chosen, representing the extreme case satisfying the stationarity condition θ² + σ²_b < 1. Lastly, Table 4 considers the true value σ²_b = 0.5, which is greater than the true value θ = 0.3. The first three rows of each table give the bias for the parameters θ, σ²_b and σ²_ε using the LS, EF and IT approaches. The next six rows report the RMSE and SE results. Based on our simulation results, we note that:

- the EF and IT methods perform better than the LS method in all cases considered: the bias, RMSE and SE of the three parameters are smaller for the EF and IT methods;
- the LS, EF and IT methods give almost identical bias for θ;
- the IT method gives smaller bias for σ²_ε and σ²_b;
- the RMSE and SE get smaller as n increases (as expected); these values for EF are generally smaller than those for LS and IT.

Table 5 gives, for each estimation method, the proportion of simulated data sets for which that method attains the smallest AIC. Using 1000 samples from an RCA(1) process with different sets of parameter values, it can be seen that the EF or IT methods always perform better than the LS method. Further, it is interesting to note that the IT method has a higher proportion of models with smaller AIC values than the EF method. This suggests that the IT method is a better alternative estimation method for an RCA(1) model.

4 Conclusion

This work shows that the proposed IT procedure works well in the estimation of RCA(1) parameters. The simulation results further support the performance of the IT method.
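As a closing illustration, steps 1-4 of the simulation design and the AIC of (3.11) can be sketched as below. The replication counts used here are far smaller than the paper's s = 1,000, the function names are ours, and the AIC helper assumes the conditional Gaussian likelihood y_t | y_{t-1} ~ N(θ y_{t-1}, σ²_ε + σ²_b y²_{t-1}), which may differ in detail from the exact likelihood of Nicholls and Quinn (1981).

```python
import numpy as np

def monte_carlo_ls(theta, s2b, s2e=1.0, n=500, s=200, seed=0):
    """Steps 1-4 for the LS estimate of theta: simulate s RCA(1) series of
    effective length n (after discarding 200 burn-in values), estimate,
    and report bias, RMSE (3.9) and SE (3.10)."""
    rng = np.random.default_rng(seed)
    est = np.empty(s)
    for j in range(s):
        m = n + 200
        y = np.zeros(m)
        for t in range(1, m):
            y[t] = ((theta + rng.normal(0.0, np.sqrt(s2b))) * y[t - 1]
                    + rng.normal(0.0, np.sqrt(s2e)))
        y = y[200:]                                   # drop burn-in values
        est[j] = np.sum(y[1:] * y[:-1]) / np.sum(y[:-1] ** 2)   # (3.1)
    bias = est.mean() - theta
    rmse = np.sqrt(np.mean((est - theta) ** 2))                 # (3.9)
    se = np.sqrt(np.sum((est - est.mean()) ** 2) / (s - 1))     # (3.10)
    return bias, rmse, se

def rca1_aic(y, theta, s2b, s2e):
    """AIC (3.11) with p = 3 parameters, under the assumed conditional
    Gaussian likelihood y_t | y_{t-1} ~ N(theta*y_{t-1}, s2e + s2b*y_{t-1}^2)."""
    y0, y1 = y[:-1], y[1:]
    w = s2e + s2b * y0 ** 2
    loglik = -0.5 * np.sum(np.log(2 * np.pi * w) + (y1 - theta * y0) ** 2 / w)
    return -2.0 * loglik + 2 * 3
```

Since all three methods fit the same three-parameter model, the 2p penalty in (3.11) is common to them, and the AIC comparison in Table 5 effectively ranks the fitted likelihoods.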
Table 1: Simulation results for θ = 0.1 and σ²_b = 0.16

                        n = 50                      n = 500                     n = 1000
Statistics      LS       EF       IT        LS       EF       IT        LS       EF       IT
Bias θ       -0.0095  -0.0074  -0.0069   -0.0019  -0.0017  -0.0017   -0.0009  -0.0008  -0.0008
Bias σ²_b    -0.0866  -0.0866  -0.0758   -0.0148  -0.0147  -0.018    -0.0066  -0.0066  -0.005
Bias σ²_ε     0.0543   0.0543   0.0438    0.0090   0.0090   0.0069    0.0043   0.0043   0.008
RMSE θ        0.1537   0.1533   0.155     0.0486   0.0477   0.0467    0.0355   0.0343   0.0340
RMSE σ²_b     0.171    0.1717   0.1687    0.0667   0.066    0.0657    0.057    0.053    0.0518
RMSE σ²_ε     0.880    0.880    0.84      0.0891   0.0890   0.0889    0.0668   0.0665   0.0656
SE θ          0.1535   0.153    0.1530    0.0486   0.0476   0.047     0.0355   0.0343   0.0335
SE σ²_b       0.1488   0.1488   0.1663    0.0651   0.0650   0.0646    0.053    0.051    0.0516
SE σ²_ε       0.89     0.85     0.811     0.0887   0.088    0.0877    0.0667   0.0660   0.0656

Table 2: Simulation results for θ = −0.3 and σ²_b = 0.5

                        n = 50                      n = 500                     n = 1000
Statistics      LS       EF       IT        LS       EF       IT        LS       EF       IT
Bias θ        0.016    0.0088   0.0076    0.0013  -0.001   -0.0010    0.0010   0.0003   0.0001
Bias σ²_b    -0.1344  -0.1341  -0.0518   -0.033   -0.0319  -0.07     -0.030   -0.09    -0.003
Bias σ²_ε     0.1363   0.1358   0.1091    0.0353   0.0350   0.084     0.066    0.056    0.09
RMSE θ        0.6815   0.791    0.408     0.0397   0.0375   0.0369    0.0403   0.0363   0.0358
RMSE σ²_b     4.500    4.500    1.638     1.0199   1.0169   0.8608    0.0693   0.0654   0.0595
RMSE σ²_ε     4.3098   4.3058   3.4497    1.1163   1.1133   0.8975    0.097    0.0948   0.0931
SE θ          0.1653   0.1648   0.1614    0.0563   0.0504   0.0501    0.0403   0.0363   0.0303
SE σ²_b       0.17     0.171    0.1668    0.089    0.0879   0.089     0.0654   0.0638   0.065
SE σ²_ε       0.3145   0.3135   0.309     0.170    0.154    0.104     0.0936   0.094    0.0910

Table 3: Simulation results for θ = 0.8 and σ²_b = 0.3

                        n = 50                      n = 500                     n = 1000
Statistics      LS       EF       IT        LS       EF       IT        LS       EF       IT
Bias θ       -0.1061  -0.0498  -0.0460   -0.040   -0.0081  -0.0041   -0.084   -0.0019  -0.0014
Bias σ²_b    -0.1486  -0.1458  -0.1044   -0.096   -0.090   -0.076    -0.0879  -0.086   -0.0758
Bias σ²_ε     0.6498   0.6489   0.3476    0.7055   0.703    0.544     0.7967   0.7957   0.7151
RMSE θ        0.1698   0.166    0.1505    0.0665   0.0507   0.0466    0.0536   0.0346   0.030
RMSE σ²_b     0.195    0.1918   0.1303    0.13     0.11     0.108     0.104    0.1194   0.1114
RMSE σ²_ε     2.0958   2.0953   2.0004    1.6908   1.6900   1.6516    1.6918   1.6908   1.6516
SE θ          0.135    0.1965   0.1456    0.0530   0.0404   0.0341    0.0455   0.0346   0.039
SE σ²_b       0.134    0.14     0.105     0.0818   0.0813   0.0806    0.0867   0.0853   0.0848
SE σ²_ε       1.9939   1.9936   1.5784    1.731    1.730    1.6619    1.498    1.491    1.4895

Table 4: Simulation results for θ = 0.3 and σ²_b = 0.5

                        n = 50                      n = 500                     n = 1000
Statistics      LS       EF       IT        LS       EF       IT        LS       EF       IT
Bias θ       -0.0364  -0.0097  -0.006    -0.0088  -0.0003  -0.0003   -0.0053   0.0010   0.0011
Bias σ²_b    -0.753   -0.753   -0.345    -0.170   -0.170   -0.1103   -0.1078  -0.1078  -0.0953
Bias σ²_ε     0.4477   0.4477   0.3645    0.691    0.691    0.35      0.414    0.414    0.139
RMSE θ        0.199    0.1816   0.1833    0.0798   0.0560   0.05560   0.067    0.0400   0.0400
RMSE σ²_b     0.387    0.387    0.377     0.1815   0.1815   0.180     0.1605   0.1605   0.1609
RMSE σ²_ε     0.9163   0.9163   0.895     0.4111   0.4111   0.3953    0.3685   0.3685   0.3619
SE θ          0.1895   0.1814   0.1919    0.0794   0.0560   0.0799    0.065    0.0400   0.069
SE σ²_b       0.7999   0.7999   0.8151    0.3109   0.3109   0.3199    0.786    0.786    0.91
SE σ²_ε       0.7999   0.7999   0.8151    0.3109   0.3109   0.3199    0.786    0.786    0.91
Table 5: AIC frequency of model preference for each estimation method

Model                                 Frequency AIC (LS)   Frequency AIC (EF)   Frequency AIC (IT)
n = 100, θ = 0.1,  σ²_b = 0.16               0                   0.434                0.566
n = 100, θ = 0.8,  σ²_b = 0.3                0                   0.335                0.665
n = 100, θ = 0.3,  σ²_b = 0.5                0                   0.305                0.695
n = 100, θ = −0.3, σ²_b = 0.5                0                   0.333                0.667
n = 100, θ = 0.5,  σ²_b = 0.5                0                   0.337                0.663
n = 100, θ = 0.7,  σ²_b = 0.16               0                   0.337                0.663

References

[1] A.K. Bera, P. Garcia and J.S. Roh, Estimation of Time-varying Hedge Ratios for Corn and Soybeans: BGARCH and Random Coefficient Approaches, The Indian Journal of Statistics, 59(3) (1997), 346-368.

[2] A. Thavaneswaran and B. Abraham, Estimation for Nonlinear Time Series Models using Estimating Functions, Journal of Time Series Analysis, 9 (1988), 99-108.

[3] A. Thavaneswaran and S. Peiris, Nonparametric Estimation for Some Nonlinear Models, Statistics and Probability Letters, 28 (1996), 227-233.

[4] D.F. Nicholls and B.G. Quinn, The Estimation of Random Coefficient Autoregressive Models. II, Journal of Time Series Analysis, 2 (1981), 185-203.

[5] D.F. Nicholls and B.G. Quinn, Random Coefficient Autoregressive Models: An Introduction, Lecture Notes in Statistics, 11, Springer-Verlag, New York, 1982.

[6] D. Wang and S.K. Ghosh, Bayesian Analysis of Random Coefficient Autoregressive Models, Model Assisted Statistics and Applications, 3(2) (2002), 81-95.

[7] H.T. Lee, K.J. Yoder, R.C. Mittelhammer and J.J. McCluskey, A Random Coefficient Autoregressive Markov Regime Switching Model for Dynamic Futures Hedging, The Journal of Futures Markets, 26(2) (2006), 103-129.

[8] M. Rahiala, Random Coefficient Autoregressive Models for Longitudinal Data, Biometrika, 86(3) (1999), 718-722.

[9] S.A. Chandra and M. Taniguchi, Estimating Functions for Nonlinear Time Series Models, Annals of the Institute of Statistical Mathematics, 53(1) (2001), 125-136.

[10] S.Y. Hwang and I.V.
Basawa, Parameter Estimation for Generalized Random Coefficient Autoregressive Processes, Journal of Statistical Planning and Inference, 68 (1998), 323-337.
[11] V.P. Godambe, The Foundations of Finite Sample Estimation in Stochastic Processes, Biometrika, 72 (1985), 419-428.

Received: August, 2010