
Rev. Téc. Ing. Univ. Zulia, Vol. 39, 339-345, 2016

Chaotic Time Series Prediction Based on RBF Neural Network

Yeping Peng
School of Software and Service Outsourcing, Jishou University, Zhangjiajie 427000, Hunan, China

Abstract

This paper studies a forecasting method for chaotic time series based on the RBF (Radial Basis Function) neural network. First, it analyzes the influence of the parameters of a chaotic dynamic system on the system's behavior. Second, it describes the RBF-network representation of chaotic time series and validates the forecasting method experimentally. Implementing chaotic time series forecasting with an RBF neural network improves prediction performance; the algorithm plays a positive role and can be promoted in practice. The optimized design of the forecasting method based on the RBF neural network helps improve the prediction performance of chaotic time series.

Key words: Neural Network, Radial Basis Function, Chaotic Time Series

1. INTRODUCTION

In chaotic time series prediction with globally approximating networks, the input weights must be adjusted at every presentation, which makes learning slow. RBF-network learning, by contrast, converges quickly, which makes it attractive for improving chaotic time series prediction. Domestic research has shown that an RBF neural network can approximate any nonlinear function and capture regularities within a system that are hard to parse analytically; it converges quickly during learning and generalizes well, and has therefore been applied successfully to time series analysis. In foreign studies, following Cover's theorem, an RBF neural network maps the data into a higher-dimensional space and then applies a linear model for regression or classification in that space.
This approach has produced good experimental and practical results and can have a positive impact on chaotic time series prediction in China. This paper applies the RBF neural network to the prediction of chaotic time series in order to improve design performance. RBF network training methods include the Clustering Algorithm, the Orthogonal Least Squares Method, the Gradient Training Algorithm, etc. In this study, the Clustering Algorithm and the Gradient Training Algorithm are used for chaotic time series prediction; introducing the RBF neural network model improves the accuracy of the forecasting results and the prediction performance.

2. RBF NEURAL NETWORK AND ITS ADVANTAGES

In practice, the RBF neural network is a 3-layer feedforward network (Zhang and Xu, 2014) that can be used in function approximation and classification algorithms. Commonly used RBF networks have an n-h-m structure, i.e., n input nodes, h hidden nodes and m output nodes. The structure is shown in Figure 1.

Figure 1. Structure of an RBF neural network

The hidden units of an RBF network span a basis-function space: the input vector is mapped directly into this hidden space without trainable input weights (Chen, Liu and Ma, 2012). Moreover, the mapping relationship is fixed once the centers of the RBF units are defined. Since the output map of the hidden-space

layer is a linear model, the network gains nonlinear self-adaptation and good information-processing ability. The approach can be combined with traditional methods and helps promote the development of artificial intelligence.

3. RBF NEURAL NETWORK ALGORITHM

In general, the RBF neural network algorithm is applied to time series prediction as follows.

First, initialize the algorithm: randomly select h distinct cluster centers c_j(k) from the input samples of the RBF network and set k = 1 (Weng and Pi, 2014), with j = 1, ..., h and i = 1, ..., N.

Second, for every sample X_i compute ||X_i - c_j(k)||, its distance to each cluster center, and assign X_i by the minimum-distance principle, i.e., j(X_i) = min_j ||X_i - c_j(k)||; in this way the N inputs are grouped into h classes, and at the same time the cluster centers of all classes are recomputed.

Third, set the Gaussian factor of each hidden node of the RBF network according to its distance from the cluster centers; that is, define each hidden node's center and expansion constant, and train the output weight vector w by supervised learning (LMS): for each input X_i, i = 1, ..., N, the output of the j-th hidden node is h_ij = U(||X_i - c_j||), and the hidden-layer output matrix is H = [h_ij]. The RBF network output vector is y = Hw. Applying the least squares method to the weights gives

W = H^+ y,   H^+ = (H^T H)^{-1} H^T,

where H^+ is the pseudoinverse of H.

In actual time series prediction, an RBF network can also solve the XOR problem (Dong and Li, 2012). In the RBF neural network, the radial basis function, i.e., the Gaussian function, is used as the activation function for time series prediction. Because the input neuron space is small, more RBF neurons are needed. The output of the RBF network is a linear weighted sum of the hidden-unit outputs, which enables a high learning speed.
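The three steps above (random centers, minimum-distance clustering, least-squares output weights W = H^+ y) can be sketched in Python. This is an illustrative stand-in for the paper's Matlab implementation: the 1-D inputs, the fixed width sigma, and all helper names are assumptions, not the author's code.

```python
import math
import random

def gauss_solve(A, b):
    """Solve the small dense system A w = b by Gaussian elimination with pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    w = [0.0] * n
    for r in range(n - 1, -1, -1):
        w[r] = (M[r][n] - sum(M[r][c] * w[c] for c in range(r + 1, n))) / M[r][r]
    return w

def train_rbf(X, y, h, sigma=0.5, iters=30):
    """Two-stage RBF training: clustering for centers, least squares for weights."""
    random.seed(1)
    centers = random.sample(X, h)              # step 1: random initial centers, k = 1
    for _ in range(iters):                     # step 2: minimum-distance grouping
        groups = [[] for _ in range(h)]
        for x in X:
            groups[min(range(h), key=lambda j: abs(x - centers[j]))].append(x)
        centers = [sum(g) / len(g) if g else centers[j]
                   for j, g in enumerate(groups)]
    # step 3: hidden-layer matrix H and normal-equation solve of W = H^+ y
    H = [[math.exp(-(x - c) ** 2 / (2 * sigma ** 2)) for c in centers] for x in X]
    HtH = [[sum(H[i][a] * H[i][b] for i in range(len(X))) for b in range(h)]
           for a in range(h)]
    Hty = [sum(H[i][a] * y[i] for i in range(len(X))) for a in range(h)]
    return centers, gauss_solve(HtH, Hty)

def rbf_predict(x, centers, w, sigma=0.5):
    """Linear weighted sum of the Gaussian hidden-unit outputs."""
    return sum(wi * math.exp(-(x - c) ** 2 / (2 * sigma ** 2))
               for wi, c in zip(w, centers))
```

Fitting a smooth test signal (e.g. sin over one period with about ten centers) reproduces it closely at the training points, which is the behavior the least-squares step is meant to guarantee.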
The steps to solve the XOR problem are as follows: the two inputs x1, x2 feed two RBF neurons R1(x), R2(x), whose weighted outputs produce y.

The XOR truth table is:

x1  x2 | y
0   0  | 0
0   1  | 1
1   0  | 1
1   1  | 0

With Gaussian centers c1 = (0,0) and c2 = (1,1), the hidden-layer outputs are:

x1  x2 | R1(x)   R2(x)
0   0  | 1       0.1353
0   1  | 0.3679  0.3679
1   0  | 0.3679  0.3679
1   1  | 0.1353  1
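The table above can be verified numerically. The snippet below assumes the standard choice of centers c1 = (0,0), c2 = (1,1) with unit-width Gaussians, and shows that in the hidden space the two XOR classes become linearly separable (the 0.9 threshold is an illustrative choice):

```python
import math

# Gaussian units R_i(x) = exp(-||x - c_i||^2) with assumed centers c1, c2.
c1, c2 = (0.0, 0.0), (1.0, 1.0)

def R(x, c):
    return math.exp(-((x[0] - c[0]) ** 2 + (x[1] - c[1]) ** 2))

def xor_rbf(x):
    # In the hidden space the classes separate linearly:
    # R1 + R2 = 1.1353 for (0,0) and (1,1), but only 0.7358 for (0,1) and (1,0).
    return 1 if R(x, c1) + R(x, c2) < 0.9 else 0

# Hidden-layer table: e^0 = 1, e^-1 = 0.3679, e^-2 = 0.1353.
table = {x: (round(R(x, c1), 4), round(R(x, c2), 4))
         for x in [(0, 0), (0, 1), (1, 0), (1, 1)]}
```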

The hidden units are Gaussian:

R_1(x) = exp(-||x - c_1||^2),   R_2(x) = exp(-||x - c_2||^2),

so that, for example, R_1((0,1)) = e^{-1} = 0.3679.

4. PRACTICAL APPLICATION OF CHAOTIC TIME SERIES PREDICTION BASED ON RBF NEURAL NETWORK

4.1. RBF Network Tools

In practical applications of RBF neural networks, the Matlab toolbox covers almost all commonly used models, including perceptrons, RBF networks, etc., for data series prediction. It also integrates the learning algorithms of the different network models, which makes it convenient for users to perform chaotic time series prediction (Gan and Peng, 2010; Wu and Wang, 2013; Guo et al., 2012; Li, Zhang and Wang, 2015). This research employs Matlab toolbox 7.0, which contains many functions for RBF network analysis and design; the most frequently used are listed in Table 1.

Table 1. Matlab functions for RBF neural networks and their features

Function name | Feature
newrb()       | Create a radial basis neural network
newrbe()      | Create an exact (strict) radial basis neural network
newgrnn()     | Create a generalized regression radial basis neural network
newpnn()      | Create a probabilistic radial basis neural network

4.2. Construction of the Chaotic Neural Network and Chaotic Neuron Model Based on RBF

The transient chaotic neural network model is as follows:

x_i(t) = f(y_i(t))                                                    (1)
y_i(t+1) = k y_i(t) + a * sum_j W_ij x_j(t) + I_i - z_i(t)(x_i(t) - I_0)   (2)
z_i(t+1) = (1 - b) z_i(t)                                             (3)

Here formula (1) is the activation function of the neuron; x_i denotes the output of the i-th neuron and y_i its internal state; W_ij is the connection weight from the j-th neuron to the i-th neuron; I_i is the bias of the i-th neuron; I_0 is a positive constant; a is the joining strength of the neurons, i.e., the coupling factor of the neurons in the RBF neural network; k is the damping factor of the nerve membrane; z_i(t) is the self-feedback connection weight and b its damping factor. The activation function in formula (1) can be the Sigmoid function or any function compatible with it. This paper adopts the Sigmoid function, in the model proposed by Chen & Aihara (Chen and Liu, 2012).
The Sigmoid function is expressed as

f(u) = 1 / (1 + exp(-u/e))                                            (4)

where e is the gain parameter. When the coupling a = 0, the three formulae above reduce to a single chaotic neuron model:

x(t) = f(y(t))                                                        (5)
y(t+1) = k y(t) - z(t)(x(t) - I_0)                                    (6)
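The single transient chaotic neuron of eqs. (5)-(7) can be simulated directly. The sketch below uses assumed parameter values (k, b, I_0, the gain e, and the initial self-feedback z0 are not taken from the paper): because z(t) decays geometrically, the output wanders chaotically at first and then settles onto a fixed point.

```python
import math

def f(u, eps=0.05):
    """Sigmoid activation, eq (4), with assumed gain eps."""
    return 1.0 / (1.0 + math.exp(-u / eps))

def simulate(steps=300, k=0.9, beta=0.01, I0=0.65, z0=0.8, y0=0.2):
    """Iterate the single chaotic neuron, eqs (5)-(7), with assumed parameters."""
    y, z, xs = y0, z0, []
    for _ in range(steps):
        x = f(y)                      # eq (5): neuron output
        y = k * y - z * (x - I0)      # eq (6): internal-state update
        z = (1.0 - beta) * z          # eq (7): self-feedback decays -> transient chaos
        xs.append(x)
    return xs

xs = simulate()
```

Early outputs swing across almost the full [0, 1] range, while the last iterations are nearly constant, illustrating the "transient" character of the chaos.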

z(t+1) = (1 - b) z(t)                                                 (7)

4.3. Algorithm Implementation Process

First, define the RBFNet class of the RBF neural network. Then call its constructor, which takes features such as inputNum, outputNum, dataNum, etc. Third, enter the input and output dimensions and the data log (Xu, 2012). The learning-function section of the Matlab design for the RBF network is as follows:

x=-4:0.01:4;
y=sin((1/2)*pi*x)+sin(pi*x);
% trainlm is chosen as the training function
net=newff(minmax(x),[1,5,1],{'tansig','tansig','purelin'},'trainlm');
net.trainParam.epochs=1000;
net.trainParam.goal=0.00001;
net=train(net,x,y);
y1=sim(net,x);
err=y1-y;
res=norm(err);
% pause, press any key to continue
pause
% drawing: original curve (smooth blue line) and simulation result (red + dotted line)
plot(x,y);
hold on
plot(x,y1,'r+');

The next step calls the means of the input data and enters the algorithm data for this calculation; likewise, calling means() inputs the sample data X. According to the minimum-distance principle, compute the relevant classification information, find the new cluster centers and obtain the expanded Gauss factors. Simulation and linear regression analysis proceed as follows:

% train
switch i
case 1
    spread = 0.1;
    net = newrbe(a,tn,spread);
case 2
    goal = 0;
    spread = 0.1;
    MN = size(a,2);
    DF = 5;
    net = newrb(a,tn,goal,spread,MN,DF);
case 3
    spread = 0.1;
    net = newgrnn(a,tn,spread);
end
% simulation test
YN = sim(net,a);   % actual output of trained samples

Here it is necessary to call RBFNet::saveW(double *newW) in the means to compute the sequence weights, which are stored in file form for ease of the next call in series prediction; the analogous call is RBFNet::saveGaos(double *newG). After computing the weights, save the Gauss factors and possibly the trained network as well.

4.4. Time Series Prediction

Time series prediction in the RBF neural network can be achieved through the following steps. First, initialize the network and randomly select training samples, which serve as the cluster centers for this prediction. Second, input the training samples, collect and group them according to the Nearest Neighbor Rule, and at the same time reallocate the cluster centers. Third, compute the collection clustering for each center. Here a decimal fraction is mapped to a binary integer so that the chaotic time series of the RBF network is represented as binary sequences. The mapping is implemented as:

X_i = floor(2^L * x_i)                                                (8)
s_i = X_i mod 2, s_i in {0, 1}                                        (9)

According to the characteristics of the Sigmoid activation in the RBF network, the chaotic time series elements remain in the range [0, 1]. In fact, because many values in a chaotic time series tend toward 0, a very large L would both add invalid leading binary 0s and destroy the 0/1 balance of the sequence; in this study the RBF algorithm therefore computes with L = 4. After the sequence transformation, the chaotic time series is taken from the first 16 binary digits.

Three random trials under the Golomb postulates were carried out with the RBF algorithm above. In these trials, the proportion of 0s and 1s in the pseudo-random binary sequence should approach 1:1 according to the Golomb postulates. Table 2 gives the numbers of 0s and 1s and their ratio over many trial predictions.

Table 2. Counts of 0s and 1s and their ratio

Number of 1s | Ratio of 0s to 1s | Number of 0s | Iterations
16116        | 0.9856            | 15884        | 2000
40498        | 0.9754            | 39502        | 5000
64953        | 0.9707            | 63047        | 8000

For the run-property prediction, the number of runs of length L should account for a proportion of 1/2^L of the total number of runs. Table 3 gives the forecast data obtained with the above parameters after 2000 iterations.
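The binarization of eqs. (8)-(9) and the Golomb balance check can be sketched as follows. The logistic map is an assumed stand-in for the paper's RBF-generated chaotic series, and L = 4 matches the choice in the text:

```python
# Binarize a chaotic series per eqs (8)-(9) and count 0s and 1s.
# The logistic map is an assumed chaotic source for illustration.
def logistic(x0=0.3123, n=2000, r=4.0):
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1.0 - x)
        xs.append(x)
    return xs

def to_bits(xs, L=4):
    """Map each value in (0, 1) to an L-bit integer and keep all L bits."""
    bits = []
    for x in xs:
        X = int((2 ** L) * x) % (2 ** L)     # eq (8), guarded against x == 1.0
        bits.extend((X >> k) & 1 for k in range(L))
    return bits

bits = to_bits(logistic())
n0, n1 = bits.count(0), bits.count(1)        # Golomb balance: ratio near 1:1
```

With 2000 iterations and L = 4 this yields 8000 bits whose 0/1 ratio is close to 1, mirroring the behavior reported in Table 2.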
Table 3. Run properties

Run length | N0/N1 | N1  | N0  | Actual ratio | Theoretical ratio
1          | 1.0   | 43  | 483 | 0.58         | 0.5000000
2          | 1.074 | 07  | 68  | 0.55         | 0.2500000
3          | 0.957 | 059 | 04  | 0.46         | 0.1250000
4          | 1.08  | 477 | 56  | 0.0596       | 0.0625000
5          | 1.030 | 9   | 36  | 0.079        | 0.0312500

From Table 3 it can be seen that the obtained prediction approximates the theoretical values, within the limits of the finite statistical length of the series in the prediction process. Hence, the predicted chaotic time series conforms to the expected run properties.

Figure 2. Diagram of autocorrelation

For the prediction of sequential autocorrelation and cross-correlation, assume the mean

x_bar = lim_{N->inf} (1/N) * sum_{i=0}^{N-1} x_i                      (10)

Then the autocorrelation function is

ac(m) = lim_{N->inf} (1/N) * sum_{i=0}^{N-1} (x_i - x_bar)(x_{i+m} - x_bar)          (11)

and the cross-correlation function is

cc(m) = lim_{N->inf} (1/N) * sum_{i=0}^{N-1} (x1_i - x1_bar)(x2_{i+m} - x2_bar)      (12)

For two sequences with different initial values, chaotic time series are generated by iterating the RBF neural network; x1 and x2 denote the binary sequences the series correspond to, and m is the lag. For the quantized chaotic binary sequences, take sequential segments of length 1000, detect the relevant characteristics and set the lag interval to -500~500. The aperiodic autocorrelation and cross-correlation characteristics are then as shown in Figures 2 and 3. The prediction results show that both the autocorrelation sidelobes and the cross-correlation values are low.

Figure 3. Diagram of cross-correlation

In this research, the results show that the chaotic time series obtained from the RBF neural network is a pseudo-random sequence whose prediction is uncertain enough to improve the performance of an encryption algorithm.

5. AN ANALYSIS OF APPLICATION BENEFIT

This research mainly employs Matlab to program the chaotic time series prediction based on the RBF neural network, applying the neural network algorithm to function approximation and sample prediction, and analyzing and comparing the relevant parameters obtained. Concerning the adoption of chaotic sequences to encrypt data by XOR, the plain text of the experiment is:

"Cryptologist: the science of overt writing (cryptography), of its authorized decryption (cryptanalysis), and of the rules which are in turn intended to make that unauthorized decryption difficult (encryption security)."

Through the experimental analysis, we obtain the probability statistics shown in Figures 4 and 5.
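The finite-N estimates of eqs. (10)-(12) can be computed directly. As above, two logistic-map runs with nearby seeds stand in for the paper's RBF-generated sequences (an assumption), quantized to ±1 so that a low sidelobe means a value near 0:

```python
# Finite-N auto/cross-correlation estimates, eqs (10)-(12), on quantized
# chaotic sequences. The logistic map is an assumed stand-in source.
def series(x0, n=1000):
    xs, x = [], x0
    for _ in range(n):
        x = 4.0 * x * (1.0 - x)
        xs.append(1.0 if x > 0.5 else -1.0)   # quantize to +/-1
    return xs

def corr(a, b, m):
    """Finite-N estimate of cc(m); with a and b the same sequence it is ac(m)."""
    n = min(len(a), len(b) - m)
    ma = sum(a) / len(a)
    mb = sum(b) / len(b)
    return sum((a[i] - ma) * (b[i + m] - mb) for i in range(n)) / n

s1, s2 = series(0.3123), series(0.3124)       # two different initial values
ac0 = corr(s1, s1, 0)                         # autocorrelation peak at zero lag
ac7 = corr(s1, s1, 7)                         # sidelobe: should be near zero
cc7 = corr(s1, s2, 7)                         # cross-correlation: should be low
```

The zero-lag autocorrelation is near 1 while the sidelobes and the cross-correlation stay small, which is the qualitative behavior shown in Figures 2 and 3.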

Figure 4. Statistical probability of plain-text characters (maximum 0.048)

Figure 5. Statistical probability of cipher-text characters (maximum 0.019)

From Figures 4 and 5 it can be seen that the maximal occurrence frequency of plain-text characters is 0.048. After encryption, the character probabilities of the cipher text become nearly uniform, with a maximal frequency of only 0.019. The chaotic time series prediction method based on the RBF neural network thus improves performance and exerts a positive impact.

6. CONCLUSIONS

In summary, the accuracy of chaotic time series prediction can be improved on the basis of the RBF neural network, and the forecasting method plays an active role in practical applications. The tests show that the method effectively increases the difficulty of decryption and improves system performance. The chaotic dynamic system has such good pseudo-random properties that it can be well applied in stream cipher encryption. This technology is therefore worth promoting in practice.

REFERENCES

Chen D.Y., Liu Y., Ma X.Y. (2012) Parameter Joint Estimation of Phase Space Reconstruction in Chaotic Time Series based on Radial Basis Function Neural Networks, Acta Physica Sinica, 61(10).

Dong J.X., Li Q. (2012) Based on Genetic Algorithm Optimization RBF Neural Network for Predicting Chaotic Time Series, Bulletin of Science and Technology, 28(8), pp. 66-68.

Gan M., Peng H. (2010) Predicting Chaotic Time Series Using RBF-AR Model with Regression Weight, Systems Engineering and Electronics, 32(4), pp. 80-84.

Guo L.P., Yu J.N., Zhang X.D., Qi Y.J., Zhang J.G. (2012) Chaotic Time Series Forecasting Model based on the Improved RBFNN, Journal of Yunnan University of Nationalities (Natural Sciences Edition), pp. 63-70.

Li R.G., Zhang H.L., Wang Y. (2015) New Orthogonal Basis Neural Network based on Quantum Particle Swarm Optimization Algorithm for Fractional Order Chaotic Time Series Single-Step Prediction, Journal of Computer Applications, 35(8).

Weng H., Pi D.C. (2014) Chaotic RBF Neural Network Anomaly Detection Algorithm, Computer Technology and Development, 2014(7), pp. 129-133.

Wu K.J., Wang T.J. (2013) Prediction of Chaotic Time Series based on RBF Neural Network Optimization, Computer Engineering, 39(10).

Xu G.L. (2012) Prediction for Traffic Flow of RBF Neural Network Based On Cloud Genetic Algorithm, Computer Engineering and Applications.

Zhang C., Xu G.L. (2014) Prediction for Traffic Flow of RBF Neural Network based on Cloud Genetic Algorithm, Computer Engineering and Applications, 50(6).