Rev. Téc. Ing. Univ. Zulia. Vol. 39, Nº 1, 339-345, 2016

Chaotic Time Series Prediction Based on RBF Neural Network

Yeping Peng
School of Software and Service Outsourcing, Jishou University, Zhangjiajie 427000, Hunan, China

Abstract

This paper studies a forecasting method for chaotic time series based on the RBF (Radial Basis Function) neural network. Firstly, it analyses the influence of the parameters of a chaotic dynamic system on the system itself. Secondly, it describes the modelling of chaotic time series with an RBF neural network and justifies the forecasting method by experiment. Implementing chaotic time series forecasting on an RBF neural network increases the prediction performance, so the algorithm plays a positive role and can be promoted in practice. The optimized design of the forecasting method for chaotic time series based on the RBF neural network helps to improve the prediction performance of chaotic time series and exerts a positive influence.

Key words: Neural Network, Radial Basis Function, Chaotic Time Series

1. INTRODUCTION

In chaotic time series prediction with a globally approximating network, the input weights have to be adjusted for every input, which leads to a low learning speed. RBF learning, by contrast, has a high convergence speed and therefore plays an active role in improving the prediction of chaotic time series. Domestic research shows that an RBF neural network can both approximate any nonlinear function and deal with regularities that are hard to parse within the system; it converges quickly during learning and generalizes well, and has thus been applied successfully to time series analysis. In foreign studies, according to Cover's theorem, an RBF neural network maps the data onto a higher-dimensional space and then uses a linear model for regression or classification in that space. By means of an RBF neural network, data can be mapped onto a higher-dimensional space.
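Cover's idea above (lift the data into a higher-dimensional space, then fit a linear model there) can be sketched in a few lines of Python; the function name, centers and spread below are illustrative values, not taken from the paper:

```python
import math

def rbf_features(x, centers, spread=1.0):
    """Lift an n-dimensional input onto h Gaussian features (an n -> h map)."""
    return [math.exp(-sum((xi - ci) ** 2 for xi, ci in zip(x, c)) / spread ** 2)
            for c in centers]

# A 2-D input lifted into a 3-D hidden space; a linear model (regression or
# classification) would then be fitted on these features.
centers = [(0.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
phi = rbf_features((1.0, 0.0), centers)
```

In the lifted space, problems that are not linearly separable in the input space often become linearly separable, which is the point of Cover's theorem.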
This application has obtained good experimental and practical results and can exert a positive impact on chaotic time series prediction in China. Based on the RBF neural network, this paper applies the technique to the prediction of chaotic time series so as to improve the design performance. RBF neural network training includes the Clustering Algorithm, the Orthogonal Least Squares method, the Gradient Training Algorithm, etc. In this study, the Clustering Algorithm and the Gradient Training Algorithm are used to achieve chaotic time series prediction, and the introduction of the RBF neural network model improves the accuracy of the forecasting results and the prediction performance.

2. RBF NEURAL NETWORK AND ITS ADVANTAGES

In practice, an RBF neural network is a 3-layer feedforward network (Zhang and Xu, 2014) that can be used in function approximation and classification algorithms. Commonly used RBF networks have an n-h-m structure, i.e., n input nodes, h hidden nodes and m output nodes. The specific structure is shown in the figure below:

Figure 1. Structure of RBF neural network

The hidden units of an RBF network constitute a basis of the hidden-layer space, onto which the input vector is mapped directly, without trainable input weights (Chen, Liu and Ma, 2012). Moreover, in the neural network the mapping relationship is fixed once the centers of the RBF units are defined. As the output map of the hidden-space
layer is a linear model, this improves the nonlinear self-adaptation of the neural network and gives it good information-processing competence. This application can be combined with traditional methods and will help to promote the development of artificial intelligence.

3. RBF NEURAL NETWORK ALGORITHM

In general, the application flow of the RBF neural network algorithm in time series prediction is as follows:

Firstly, initialize the algorithm. Randomly select h different clustering centers c_i(k) from the input samples of the RBF neural network and set k = 1 (Weng and Pi, 2014), with i = 1, 2, ..., h and j = 1, 2, ..., N.

Secondly, for every sample input X_j, calculate ||X_j - c_i(k)||, its distance from each cluster center, and classify X_j according to the minimum-distance principle, i.e., i(X_j) = min_i ||X_j - c_i(k)||, whereby X_j is assigned to class i and the N inputs are grouped into h classes. At the same time, all clustering centers are recalculated.

Thirdly, define the Gauss factor of each hidden node in the RBF neural network according to the distances between the cluster centers, which fixes each hidden node and its expansion constant, and then train the output weight vector w by a supervised learning method (LMS): for the inputs X_j, j = 1, 2, ..., N, the output of the i-th hidden node is h_ij = U(||X_j - c_i||), and the hidden-layer output matrix is H = [h_ij]. The output vector of the RBF neural network is then y = Hw. Applying the least-squares method to the weights gives W = H+ y, where H+ is the pseudo-inverse of H: H+ = (H^T H)^-1 H^T.

In actual time series prediction, RBF networks can also solve the XOR problem (Dong and Li, 2012). In the RBF neural network the radial basis function, i.e., the Gauss function, is used as the activating function in time series prediction. Because the space of input neurons is small, more RBF neurons are needed. The output of the RBF network is the linear weighted sum of the hidden-unit outputs, thus enabling a higher learning speed.
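The three steps above (random centers, minimum-distance clustering, then supervised training of the output weights so that y = Hw) can be sketched for a 1-D series in Python. This is an illustrative sketch, not the paper's implementation: the spread, learning rate and epoch counts are assumed values, and the output weights are fitted by an iterative LMS loop rather than the explicit pseudo-inverse.

```python
import math
import random

def gaussian(dist, spread):
    """Gauss basis value U(||X - c||) of a hidden node."""
    return math.exp(-(dist / spread) ** 2)

def train_rbf_1d(xs, ys, h=5, spread=0.5, lr=0.1, epochs=200, seed=0):
    rng = random.Random(seed)
    centers = rng.sample(xs, h)              # step 1: random initial centers
    for _ in range(10):                      # step 2: minimum-distance clustering
        groups = [[] for _ in range(h)]
        for x in xs:
            i = min(range(h), key=lambda k: abs(x - centers[k]))
            groups[i].append(x)
        centers = [sum(g) / len(g) if g else c for g, c in zip(groups, centers)]
    w = [0.0] * h                            # step 3: LMS training of y = Hw
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            hvec = [gaussian(abs(x - c), spread) for c in centers]
            err = y - sum(wi * hi for wi, hi in zip(w, hvec))
            w = [wi + lr * err * hi for wi, hi in zip(w, hvec)]
    return centers, w

def rbf_predict(x, centers, w, spread=0.5):
    return sum(wi * gaussian(abs(x - c), spread) for wi, c in zip(w, centers))
```

Fitted on samples of sin(x) over [0, 3], the network reproduces the function to within a small residual; replacing the LMS loop with the batch solution W = (H^T H)^-1 H^T y would give the least-squares weights described above.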
The steps to solve the XOR problem are as follows. The network maps the two inputs x1, x2 through two RBF neurons R1(x), R2(x) to one output y (Input -> RBF neurons -> Output). The XOR truth table is:

x1  x2  y
0   0   0
0   1   1
1   0   1
1   1   0

The hidden-unit values for the four inputs are:

x1  x2  R1(x)   R2(x)
0   1   0.3679  0.3679
1   0   0.3679  0.3679
0   0   0.1353  1.0000
1   1   1.0000  0.1353
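The values in this table can be checked with a few lines of Python, taking the hidden units to be Gauss functions centered at c1 = [1,1]^T and c2 = [0,0]^T:

```python
import math

def r(x, c):
    """Gauss basis value exp(-||x - c||^2) for a 2-D input x and center c."""
    return math.exp(-((x[0] - c[0]) ** 2 + (x[1] - c[1]) ** 2))

c1, c2 = (1, 1), (0, 0)
table = {x: (round(r(x, c1), 4), round(r(x, c2), 4))
         for x in [(0, 0), (0, 1), (1, 0), (1, 1)]}
```

In the (R1, R2) plane the XOR classes become linearly separable: (0,1) and (1,0) both map to (0.3679, 0.3679), while (0,0) and (1,1) map close to the axes, so a single linear output unit can realize XOR.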
The two basis functions are Gauss functions centered at c1 = [1,1]^T and c2 = [0,0]^T:

R1(x) = exp(-||x - c1||^2)
R2(x) = exp(-||x - c2||^2)

For example, for the input x = [0,1]^T, R1(x) = R2(x) = e^-1 = 0.3679.

4. PRACTICAL APPLICATION OF CHAOTIC TIME SERIES PREDICTION BASED ON RBF NEURAL NETWORK

4.1. RBF Network Tools

In the practical application of RBF neural networks, the Matlab toolbox covers almost all the commonly used models, including perceptrons, RBF networks, etc., for data series prediction. Moreover, it integrates different neural network learning algorithms on the basis of the different network models, thus providing convenience for users conducting chaotic time series prediction (Gan and Peng, 2010; Wu and Wang, 2013; Guo et al., 2012; Li, Zhang and Wang, 2015). This research employs Matlab toolbox 7.0, which contains many functions for RBF network analysis and design. The frequently used ones are listed in Table 1.

Table 1. Matlab functions for RBF neural networks and their features

Function name   Feature
newrb()         Creates a radial basis neural network
newrbe()        Creates a strict (exact) radial basis neural network
newgrnn()       Creates a generalized regression radial basis neural network
newpnn()        Creates a radial basis probabilistic neural network

4.2. Construction of Chaotic Neural Network and Chaotic Neuron Model based on RBF

The transient chaotic neural network model is as follows:

x_i(t) = f(y_i(t))                                                        (1)
y_i(t+1) = k y_i(t) + α [ Σ_j w_ij x_j(t) + I_i ] - z_i(t) (x_i(t) - I_0)  (2)
z_i(t+1) = (1 - β) z_i(t)                                                 (3)

where formula (1) is the activation function of the neuron; x_i indicates the output of the i-th neuron; y_i indicates the internal state of the i-th neuron; w_ij indicates the weight connecting the j-th neuron to the i-th neuron; I_i is the bias of the i-th neuron; I_0 is a positive constant; z_i(t) is the self-feedback connection strength of the neuron, i.e., the coupling factor of the neurons in the RBF neural network; k is the damping factor of the nerve membrane; α is a positive scaling parameter for the neuron inputs; and β is the damping factor of z_i(t).

The activation function in formula (1) can be either the Sigmoid function or another function compatible with the Sigmoid function. This paper adopts the Sigmoid function; the model was proposed by Chen & Aihara (Chen, Liu and Ma, 2012).
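Formulas (1)-(3) can be iterated directly for a single neuron. The sketch below is illustrative, not the paper's code: the parameter values (k, α, β, I_0, the Sigmoid gain and the initial states) are assumed for demonstration, and `ext` stands for the weighted-input-plus-bias term, taken as zero here so the code also realizes the reduced single-neuron model.

```python
import math

def sigmoid(u, eps=0.05):
    """Sigmoid activation f(u) = 1 / (1 + exp(-u / eps))."""
    return 1.0 / (1.0 + math.exp(-u / eps))

def chaotic_neuron(steps, k=0.7, alpha=0.0, beta=0.001,
                   i0=0.65, y0=0.5, z0=0.8, ext=0.0, eps=0.05):
    """Single-neuron iteration of formulas (1)-(3); all values are assumed."""
    y, z, xs = y0, z0, []
    for _ in range(steps):
        x = sigmoid(y, eps)                        # (1) x(t) = f(y(t))
        y = k * y + alpha * ext - z * (x - i0)     # (2) internal-state update
        z = (1.0 - beta) * z                       # (3) self-feedback decays
        xs.append(x)
    return xs
```

Because z(t) decays geometrically, the self-feedback that drives the irregular wandering dies out over time and the neuron gradually settles; this transient behaviour is the characteristic feature of the model.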
The Sigmoid function is expressed as the following formula:

f(u) = 1 / (1 + exp(-u/ε))                          (4)

wherein ε is the gain parameter. When α = 0, the three above-mentioned formulae evolve into a chaotic neuron model:

x(t) = f(y(t))                                      (5)
y(t+1) = k y(t) - z(t) (x(t) - I_0)                 (6)
z(t+1) = (1 - β) z(t)                               (7)

4.3. Algorithm Implementation Process

Firstly, define the class RBFNet for the RBF neural network. Then call the constructor of the RBF neural network, which takes features such as inputnum, outputnum, datanum, etc., respectively. Thirdly, enter the input and output dimensions and the data log (Xu, 2012). The procedure section of the learning function in the Matlab design of the RBF neural network is as follows:

x=-4:0.01:4;
y=sin((1/2)*pi*x)+sin(pi*x);
% trainlm is chosen as the training function
net=newff(minmax(x),[1,5,1],{'tansig','tansig','purelin'},'trainlm');
net.trainParam.epochs=1000;
net.trainParam.goal=0.00001;
net=train(net,x,y);
y1=sim(net,x);
err=y1-y;
res=norm(err);
% pause, press any key to continue
pause
% drawing: original curve (smooth blue line) and simulation result (red + dotted line)
plot(x,y);
hold on
plot(x,y1,'r+');

The next step calls means() on the input data and then inputs the algorithm data of the current calculation; calling means() inputs the data X in terms of the samples. According to the minimum-distance principle, calculate the relevant classification information, find the new clustering centers and obtain the extended Gauss factors. Simulation and linear regression analysis are as follows:

% train
switch method   % select the network type
case 1
    spread = 0.1;
    net = newrbe(a,tn,spread);
case 2
    goal = 0;
    spread = 0.1;
    MN = size(a,2);
    DF = 5;
    net = newrb(a,tn,goal,spread,MN,DF);
case 3
    spread = 0.1;
    net = newgrnn(a,tn,spread);
end
% simulation test
YN = sim(net,a);  % actual output of the trained samples

Here, it is necessary to call RBFNet::saveW(double *newW) in the means to calculate the sequence weights, which are stored in a file for the ease of the next call in series prediction. The corresponding call for the Gauss factors is RBFNet::saveGaos(double *newG). After calculating the weights, save the Gauss factors and possibly the trained network as well.

4.4. Time Series Prediction
Time series prediction in the RBF neural network can be achieved through the following steps. Firstly, initialize the network and randomly select training samples, which are treated as the clustering centers of this prediction. Secondly, input the training samples, collect and group them according to the nearest-neighbour rule, and at the same time reallocate the clustering centers. Thirdly, calculate the clustering for each center. Here, a decimal-fraction mapping and a binary-integer algorithm can be used to map the chaotic time series of the RBF neural network into binary sequences. The specific algorithm is implemented as follows:

X_i = 2^L · x_i                                     (8)
Θ(X) = 1 for X ≥ 1/2, Θ(X) = 0 for X < 1/2          (9)

According to the characteristics of the Sigmoid activation function in the RBF neural network, the elements of the chaotic time series effectively stay within the value range [0,1]. In fact, because many of the values in a chaotic time series tend toward 0, a very large length L would both increase the number of invalid leading binary 0s and destroy the balance of 0s and 1s in the sequence. Therefore, in this study the RBF neural network algorithm performs the calculation with L = 24, and after the sequence transformation the chaotic time series is taken according to the first 16 binary digits.

Three random trials of the Golomb postulates can be realized with the RBF neural network algorithm above. In these trials, according to the Golomb postulates, the proportion of 0s and 1s in a pseudo-random binary sequence should approach 1:1. Table 2 provides the numbers of 0s and 1s and their ratio, obtained after repeated trial predictions.

Table 2. Representation numbers and ratio

Number of 1s   Ratio of 0s to 1s   Number of 0s   Iterations
16116          0.9856              15884          2000
40498          0.9754              39502          5000
64953          0.9707              63047          8000

For the run property prediction, the number of runs of length L accounts for a proportion of 1/2^L of the total number of runs. In Table 3, the forecast data is obtained from the above parameters after 2000 iterations.

Table 3.
Run properties

Run length   N0/N1   N1     N0     Actual ratio   Theoretical ratio
1            1.0     43     483    0.58           0.500000
2            1.074   07     68     0.55           0.250000
3            0.957   1059   1014   0.1246         0.125000
4            1.082   477    516    0.0596         0.062500
5            1.030   229    236    0.0279         0.031250

From Table 3 it can be seen that the obtained prediction results approximate the theoretical values, given the limited statistical length of the time series in the prediction process. Hence, the prediction of the chaotic time series conforms to the actual run properties.

Figure 2. Diagram of autocorrelation
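The balance and run statistics of Tables 2 and 3 can be reproduced in outline with a short Python sketch. Here the logistic map stands in for the RBF-generated chaotic series (an assumption for illustration), each value is quantized to its first 16 binary digits as described above, and the Golomb proportions are then counted:

```python
from itertools import groupby

def logistic_series(x0=0.37, n=2000, mu=4.0):
    """Stand-in chaotic series on (0,1); the paper's series comes from the RBF network."""
    xs, x = [], x0
    for _ in range(n):
        x = mu * x * (1.0 - x)
        xs.append(x)
    return xs

def to_bits(x, keep=16):
    """First `keep` binary digits of x in (0,1): the quantization of (8)-(9)."""
    bits = []
    for _ in range(keep):
        x *= 2.0
        b = int(x)            # 1 if the doubled value reached 1, else 0
        bits.append(b)
        x -= b
    return bits

bits = [b for x in logistic_series() for b in to_bits(x)]
n0, n1 = bits.count(0), bits.count(1)
balance = n0 / n1                                  # postulate 1: near 1:1

runs = [len(list(g)) for _, g in groupby(bits)]    # maximal runs of equal bits
p1 = sum(1 for r in runs if r == 1) / len(runs)    # near 1/2
p2 = sum(1 for r in runs if r == 2) / len(runs)    # near 1/4
```

With 2000 iterations of 16 bits each, the sketch produces 32000 bits, matching the scale of the first row of Table 2, and the run proportions approach the 1/2^L rule of Table 3.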
In the prediction of sequential autocorrelation and cross correlation, assume:

x̄ = lim_{N→∞} (1/N) Σ_{i=0}^{N-1} x_i                                 (10)

Then the autocorrelation function is:

ac(m) = lim_{N→∞} (1/N) Σ_{i=0}^{N-1} (x_i - x̄)(x_{i+m} - x̄)          (11)

And the cross correlation function is:

cc(m) = lim_{N→∞} (1/N) Σ_{i=0}^{N-1} (x1_i - x̄1)(x2_{i+m} - x̄2)      (12)

In terms of two sequences with different initial values, chaotic time series can be generated after the iteration of the RBF neural network; take x1 and x2 as the binary sequences the series correspond to, with m indicating the actual interval (lag). For the quantized chaotic binary sequences, take a sequential segment of length 1000, detect its relevant characteristics and set the interval as 500~1500. Then the characteristics of aperiodic autocorrelation and of cross correlation can be obtained as Figures 2 and 3 show. The prediction results show that both the autocorrelation side values and the cross correlation values are on the low side.

Figure 3. Diagram of cross correlation

In this research, the results show that the chaotic time series obtained from the RBF neural network is a pseudo-random sequence, and its prediction results are uncertain to such an extent that it can improve the performance of an encryption algorithm.

5. AN ANALYSIS OF APPLICATION BENEFIT

This research mainly employs Matlab software to program the chaotic time series prediction based on the RBF neural network, which enables the application of the neural network algorithm to function approximation and sample content prediction, as well as the analysis and comparison of the relevant parameters obtained. Concerning the adoption of chaotic sequences to encrypt data by XOR, the plain text of this experiment is as follows:

"Cryptology is the science of overt writing (cryptography), of its authorized decryption (cryptanalysis), and of the rules which are in turn intended to make that unauthorized decryption difficult (encryption security)."

Through the experimental analysis, we can get the following data related to probability statistics, as shown in Figures 4 and 5.
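For finite samples, formulas (10)-(12) can be computed directly. The sketch below is illustrative, with a ±1-valued pseudo-random demo sequence in place of the quantized chaotic one; it shows the expected behaviour of a peak at lag 0 and low values elsewhere:

```python
import random

def mean(xs):
    return sum(xs) / len(xs)                       # finite-sample form of (10)

def autocorr(xs, m):
    """Finite-sample form of the autocorrelation (11) at lag m."""
    n, xbar = len(xs) - m, mean(xs)
    return sum((xs[i] - xbar) * (xs[i + m] - xbar) for i in range(n)) / n

def crosscorr(xs, ys, m):
    """Finite-sample form of the cross correlation (12) at lag m."""
    n, xbar, ybar = len(xs) - m, mean(xs), mean(ys)
    return sum((xs[i] - xbar) * (ys[i + m] - ybar) for i in range(n)) / n

random.seed(2)
seq = [random.choice([-1.0, 1.0]) for _ in range(4000)]  # demo +/-1 sequence
```

For such a balanced ±1 sequence, autocorr(seq, 0) is close to 1 while autocorr at nonzero lags stays near 0, which is the qualitative shape of Figures 2 and 3.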
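The XOR encryption experiment can be sketched as follows. This is an illustrative reconstruction, not the paper's program: the keystream is derived from a logistic map rather than from the RBF-generated series, and the statistic computed is the maximal character frequency of the kind reported for Figures 4 and 5.

```python
from collections import Counter

def keystream(n, x0=0.41, mu=4.0):
    """Byte keystream from a logistic map (stand-in for the chaotic RBF series)."""
    ks, x = [], x0
    for _ in range(n):
        x = mu * x * (1.0 - x)
        ks.append(int(x * 256.0) % 256)
    return ks

def xor_cipher(data, ks):
    """XOR each plain-text byte with the corresponding keystream byte."""
    return bytes(b ^ k for b, k in zip(data, ks))

plain = (b"Cryptology is the science of overt writing (cryptography), of its "
         b"authorized decryption (cryptanalysis), and of the rules which are in "
         b"turn intended to make that unauthorized decryption difficult.")
cipher = xor_cipher(plain, keystream(len(plain)))
restored = xor_cipher(cipher, keystream(len(plain)))   # XOR twice restores

def max_freq(data):
    """Maximal relative frequency of any single byte value."""
    return max(Counter(data).values()) / len(data)
```

Encryption flattens the byte histogram: the most frequent plain-text byte (the space character) dominates the plain text, while no cipher-text byte value stands out, which is the effect illustrated by Figures 4 and 5.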
Figure 4. Statistical probability of plain text characters (maximum 0.1048)

Figure 5. Statistical probability of cipher text characters (maximum 0.0190)

From Figures 4 and 5 above, it can be seen that the maximal occurrence frequency of the plain text characters is 0.1048. After encryption, the probability of the cipher text characters has become nearly uniform, with a maximal frequency of only 0.0190. The chaotic time series prediction method based on the RBF neural network can thus increase the performance, exerting a positive impact.

6. CONCLUSIONS

In summary, in chaotic time series prediction the accuracy of prediction can be improved on the basis of the RBF neural network. This forecasting method plays an active role in practical application. The tests show that the method effectively increases the difficulty of decryption and improves the system performance. The chaotic dynamic system has such good pseudo-random properties that it can be well applied in stream cipher encryption. Therefore, this technology is worth promoting in practice.

REFERENCES

Chen D.Y., Liu Y., Ma X.Y. (2012) Parameter Joint Estimation of Phase Space Reconstruction in Chaotic Time Series based on Radial Basis Function Neural Networks, Acta Physica Sinica, 61(10), pp. 1-3.

Dong J.X., Li Q. (2012) Based on Genetic Algorithm Optimization RBF Neural Network for Predicting Chaotic Time Series, Bulletin of Science and Technology, 28(8), pp. 66-68, 72.

Gan M., Peng H. (2010) Predicting Chaotic Time Series Using RBF-AR Model with Regression Weight, Systems Engineering and Electronics, 32(4), pp. 80-84.

Guo L.P., Yu J.N., Zhang X.D., Qi Y.J., Zhang J.G. (2012) Chaotic Time Series Forecasting Model based on the Improved RBFNN, Journal of Yunnan University of Nationalities (Natural Sciences Edition), 20(1), pp. 63-70.
Li R.G., Zhang H.L., Wang Y. (2015) New Orthogonal Basis Neural Network based on Quantum Particle Swarm Optimization Algorithm for Fractional Order Chaotic Time Series Single-Step Prediction, Journal of Computer Applications, 35(8), pp. 2227-2232.

Weng H., Pi D.C. (2014) Chaotic RBF Neural Network Anomaly Detection Algorithm, Computer Technology and Development, 2014(7), pp. 129-133.

Wu K.J., Wang T.J. (2013) Prediction of Chaotic Time Series based on RBF Neural Network Optimization, Computer Engineering, 39(10), pp. 208-212, 216.

Xu G.L. (2012) Prediction for Traffic Flow of RBF Neural Network Based on Cloud Genetic Algorithm, Computer Engineering and Applications, 2014(16), pp. 116-120.

Zhang C., Xu G.L. (2014) Prediction for Traffic Flow of RBF Neural Network based on Cloud Genetic Algorithm, Computer Engineering and Applications, 50(16), pp. 116-120.