Chaotic Time Series Prediction Based on RBF Neural Network
Rev. Téc. Ing. Univ. Zulia. Vol. 39, 2016

Yeping Peng
School of Software and Service Outsourcing, Jishou University, Zhangjiajie, Hunan, China

Abstract

This paper studies a forecasting method for chaotic time series based on an RBF (Radial Basis Function) neural network. It first analyzes how the parameters of a chaotic dynamic system influence the system's behavior, then describes the RBF-based chaotic time series model and validates the forecasting method experimentally. Implementing chaotic time series forecasting on an RBF neural network improves prediction performance, and the algorithm plays a positive role that can be promoted in practice. The optimized design of the forecasting method helps improve the prediction performance for chaotic time series.

Key words: Neural Network, Radial Basis Function, Chaotic Time Series

1. INTRODUCTION

In chaotic time series prediction, a network that approximates globally has to adjust its input weights every time a sample is presented, which makes learning slow. RBF learning, by contrast, converges quickly and therefore helps improve the prediction of chaotic time series. Domestic research shows that an RBF neural network can approximate any nonlinear function and handle regularities within a system that are hard to parse analytically; it converges quickly in learning and generalizes well, and has thus been applied successfully to time series analysis. In foreign studies, following Cover's theorem, an RBF neural network maps the data onto a higher-dimensional space and then uses a linear model for regression or classification in that space.
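The mapping-then-linear-model idea can be illustrated with a short Python sketch (the basis centers, width and target function here are illustrative assumptions, not values from the paper): lift the inputs with Gaussian basis functions, then fit an ordinary least-squares model in the lifted space.

```python
import numpy as np

def rbf_features(X, centers, sigma=1.0):
    """Map each input row of X to Gaussian activations around the centers."""
    # squared distances, shape: (n_samples, n_centers)
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

# A 1-D target that is nonlinear in x but linear in the lifted RBF space.
X = np.linspace(-3, 3, 200)[:, None]
y = np.sin(np.pi * X[:, 0])

centers = np.linspace(-3, 3, 15)[:, None]      # fixed, evenly spaced centers
H = rbf_features(X, centers, sigma=0.5)        # lift to a 15-D feature space
w, *_ = np.linalg.lstsq(H, y, rcond=None)      # linear regression in that space
y_hat = H @ w

print(float(np.max(np.abs(y - y_hat))))        # small approximation error
```

Although the target is nonlinear in x, it is fitted accurately by a model that is linear in the 15 lifted features, which is the core of the RBF approach described above.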
This application has produced good experimental and practical results and can positively influence chaotic time series prediction in China. Based on the RBF neural network, this paper applies the technique to chaotic time series prediction so as to improve design performance. RBF neural networks can be trained with clustering algorithms, the orthogonal least squares method, gradient training algorithms, and so on. In this study, a clustering algorithm and gradient training are used to implement chaotic time series prediction; introducing the RBF neural network model improves forecasting accuracy and prediction performance.

2. RBF NEURAL NETWORK AND ITS ADVANTAGES

In practice, an RBF neural network is a three-layer feedforward network (Zhang and Xu, 2014) that can be used for function approximation and classification. Commonly used RBF networks have an n-h-m structure, i.e., n input nodes, h hidden nodes and m output nodes. The structure is shown in the figure below:

Figure 1. Structure of RBF neural network

The hidden layer can be treated as a space spanned by the basis functions of its hidden units: the input vector is mapped directly onto this hidden space, and the connections do not require exhaustive tuning of weights (Chen, Liu and Ma, 2012). Moreover, the input-to-hidden mapping is fixed once the RBF centers are defined. Since the map from the hidden layer to the output is a linear model, the network adapts well to nonlinearity and has good information-processing capability. This approach can be combined with traditional methods and helps promote the development of artificial intelligence.

3. RBF NEURAL NETWORK ALGORITHM

In general, the RBF neural network algorithm is applied to time series prediction as follows:

First, initialize the algorithm: randomly select h different cluster centers c_j (j = 1, ..., h) from the RBF network's input samples X_i (i = 1, ..., N), and set the iteration counter k = 1 (Weng and Pi, 2014).

Second, compute the distance ||X_i - c_j(k)|| from every sample X_i to each cluster center, and classify X_i by the minimum-distance principle, i.e., assign X_i to the class j with minimal ||X_i - c_j(k)||, so that the N inputs are grouped into h classes. The cluster centers of all classes are then recalculated.

Third, define the Gaussian factor (expansion constant) of each hidden node according to the distances between cluster centers, and train the output weight vector w by supervised learning (LMS). For the samples X_i, i = 1, ..., N, the output of the j-th hidden node is h_ij = U(||X_i - c_j||), giving the hidden-layer output matrix H = [h_ij]. The RBF network output vector is y = Hw, and applying the least squares method yields the weights:

W = H+ y, where H+ is the pseudoinverse of H: H+ = (H^T H)^(-1) H^T.

In actual time series prediction, RBF networks can also solve the XOR problem (Dong and Li, 2012). The radial basis function, i.e., the Gaussian function, serves as the activation function in time series prediction. Because the input space of the neurons is small, more RBF neurons are needed. The output of an RBF network is a linear weighted sum of the hidden-unit outputs, which enables a higher learning speed.
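The training steps above (cluster the inputs for the centers, then solve the output weights with the pseudoinverse W = H+ y) can be sketched in Python and checked on the XOR problem; the simple k-means routine, the Gaussian width and the bias column are illustrative assumptions, not the paper's exact implementation:

```python
import numpy as np

def kmeans(X, h, iters=20, seed=0):
    """Pick h cluster centers by the minimum-distance rule (simple k-means)."""
    rng = np.random.default_rng(seed)
    c = X[rng.choice(len(X), h, replace=False)]
    for _ in range(iters):
        # assign each sample to its nearest center
        labels = np.argmin(((X[:, None] - c[None]) ** 2).sum(-1), axis=1)
        for j in range(h):
            if np.any(labels == j):
                c[j] = X[labels == j].mean(axis=0)  # recalculate the centers
    return c

def fit_rbf(X, y, h, sigma):
    c = kmeans(X, h)
    H = np.exp(-((X[:, None] - c[None]) ** 2).sum(-1) / (2 * sigma ** 2))
    H = np.column_stack([H, np.ones(len(X))])   # bias column (an assumption)
    W = np.linalg.pinv(H) @ y                   # W = H+ y via the pseudoinverse
    return c, W

def predict(X, c, W, sigma):
    H = np.exp(-((X[:, None] - c[None]) ** 2).sum(-1) / (2 * sigma ** 2))
    H = np.column_stack([H, np.ones(len(X))])
    return H @ W

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([0, 1, 1, 0], float)               # XOR targets
c, W = fit_rbf(X, y, h=4, sigma=0.5)
print(np.allclose(predict(X, c, W, sigma=0.5), y, atol=1e-6))  # prints: True
```

With one center per pattern the hidden-layer matrix is full rank, so the pseudoinverse reproduces the XOR targets exactly, mirroring the claim above that RBF networks solve XOR.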
The steps to solve the XOR problem are as follows. The inputs x1 and x2 feed two RBF neurons R1(x) and R2(x), whose outputs are combined into the output y (see the network diagram and the truth table over X1, X2 and Y).
With centers c1 = [1, 1]^T and c2 = [0, 0]^T, the two hidden units compute

R1(x) = exp(-||x - c1||^2)
R2(x) = exp(-||x - c2||^2)

so that the four XOR input patterns become linearly separable in the (R1, R2) plane.

4. PRACTICAL APPLICATION OF CHAOTIC TIME SERIES PREDICTION BASED ON RBF NEURAL NETWORK

4.1. RBF Network Tools

In practical applications of RBF neural networks, the Matlab toolbox covers almost all commonly used models, including perceptrons, RBF networks, etc., for data series prediction. Moreover, it integrates different learning algorithms on the basis of different network models, providing convenience for chaotic time series prediction (Gan and Peng, 2010; Wu and Wang, 2013; Guo et al., 2012; Li, Zhang and Wang, 2015). This research employs Matlab toolbox 7.0, which contains many functions for RBF network analysis and design; the most frequently used ones are listed in Table 1.

Table 1. Matlab functions in RBF neural network design and their features

Function name | Feature
newrb()       | Create a radial basis neural network
newrbe()      | Create a strict (exact) radial basis neural network
newgrnn()     | Create a generalized regression radial basis neural network
newpnn()      | Create a probabilistic radial basis neural network

4.2. Construction of a Chaotic Neural Network and Chaotic Neuron Model Based on RBF

The transient chaotic neural network model is as follows:

x_i(t) = f(y_i(t))                                                   (1)
y_i(t+1) = k*y_i(t) + SUM_j w_ij*x_j(t) + I_i - z_i(t)*(x_i(t) - I_0)  (2)
z_i(t+1) = (1 - beta)*z_i(t)                                         (3)

Here, formula (1) is the activation function of the neuron; x_i(t) denotes the output of the i-th neuron; y_i(t) denotes the internal state of the i-th neuron; w_ij denotes the connection weight from the j-th neuron to the i-th neuron; I_i is the bias of the i-th neuron; I_0 is a positive constant; z_i(t) is the self-feedback connection strength, i.e., the coupling factor of the neurons in the RBF neural network; k is the damping factor of the nerve membrane; and beta is the decay rate of z_i(t). Formula (1) leaves the activation function unspecified: it can be either a Sigmoid function or another function compatible with the Sigmoid. This paper adopts the Sigmoid function, following the model proposed by Chen & Aihara (Chen and Liu, 2012).
The Sigmoid function is expressed as:

f(u) = 1 / (1 + exp(-u/eps))                                         (4)

where eps is the gain parameter. For a single neuron without coupling or external bias (w_ij = 0, I_i = 0), the three formulae above reduce to a chaotic neuron model:

x(t) = f(y(t))                                                       (5)
y(t+1) = k*y(t) - z(t)*(x(t) - I_0)                                  (6)
z(t+1) = (1 - beta)*z(t)                                             (7)

4.3. Algorithm Implementation Process

First, define the class RBFNet for the RBF neural network. Then call its constructor, which takes features such as inputnum, outputnum and datanum. Third, enter the input and output dimensions and the data log (Xu, 2012). The Matlab procedure for the learning function in the RBF neural network design is as follows:

x = -4:0.01:4;
y1 = sin((1/2)*pi*x) + sin(pi*x);
% trainlm chosen as the training function
net = newff(minmax(x), [1,5,1], {'tansig','tansig','purelin'}, 'trainlm');
net.trainParam.epochs = 1000;
net.trainParam.goal = 0.00001;
net = train(net, x, y1);
y2 = sim(net, x);
err = y2 - y1;
res = norm(err);
% pause, press any key to continue
pause
% drawing: original curve (smooth blue line) and simulation result (red + dotted line)
plot(x, y1);
hold on
plot(x, y2, 'r+');

The next step calls means() on the input data to feed this round's algorithm data: means() takes the sample data X, classifies it according to the minimum-distance principle, finds the new cluster centers and obtains the extended Gaussian factor. Simulation and linear regression analysis are as follows:

% train (method selects the network type)
switch method
case 1
    spread = 0.1;
    net = newrbe(a, tn, spread);
case 2
    goal = 0;
    spread = 0.1;
    MN = size(a, 2);
    DF = 25;
    net = newrb(a, tn, goal, spread, MN, DF);
case 3
    spread = 0.1;
    net = newgrnn(a, tn, spread);
end
% simulation test
YN = sim(net, a);  % actual output of the trained samples

Here it is necessary to call RBFNet::saveW(double *newW) to compute the sequence weights and store them in a file for the ease of the next call in series prediction. The corresponding call for the Gaussian factors is RBFNet::saveGaos(double *newG). After calculating the weights, save the Gaussian factor, and possibly the trained network as well.

4.4. Time Series Prediction
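A chaotic series to predict can be generated by iterating the single chaotic neuron of equations (5)-(7); the following Python sketch does so, with illustrative parameter values (k, beta, I0 and eps are assumptions, not values given in the text):

```python
import math

def simulate(steps=100, k=0.9, beta=0.001, I0=0.65, eps=0.004,
             y0=0.283, z0=0.08):
    """Iterate the single transient chaotic neuron of eqs. (5)-(7)."""
    xs, y, z = [], y0, z0
    for _ in range(steps):
        x = 1.0 / (1.0 + math.exp(-y / eps))   # eq. (4)/(5): sigmoid output
        xs.append(x)
        y = k * y - z * (x - I0)               # eq. (6): internal state update
        z = (1.0 - beta) * z                   # eq. (7): self-feedback decay
    return xs

xs = simulate()
print(len(xs), min(xs) >= 0.0 and max(xs) <= 1.0)   # prints: 100 True
```

Because the output passes through the sigmoid, every element of the generated series stays in [0, 1], which is the property the quantization step in Section 4.4 relies on.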
Time series prediction with the RBF neural network can be achieved through the following steps. First, initialize the network and randomly select training samples to serve as the initial cluster centers of this prediction. Second, input the training samples, group them according to the nearest-neighbor rule, and reallocate the cluster centers at the same time. Third, compute the cluster membership for each center. Here, a decimal fraction can be mapped by a binary integer algorithm so that the chaotic time series in the RBF neural network becomes a binary sequence. The specific quantization can be written as: each series value x in (0, 1) is expanded to its first L binary digits,

b_k = floor(2^k * x) mod 2,  k = 1, ..., L                            (8)

and a value is thresholded to a binary symbol by

Theta(X) = 1 if X >= 1/2, Theta(X) = 0 otherwise.                     (9)

According to the characteristics of the Sigmoid activation function in the RBF neural network, the chaotic time series elements effectively stay in the value range [0, 1]. In fact, because many values in a chaotic time series tend toward 0, a very large length L would both add invalid leading binary 0s and destroy the balance of 0s and 1s in the sequence. Therefore, in this study the RBF neural network algorithm performs the calculation with L = 4; after sequence transformation, the chaotic time series is formed from the first 16 binary digits. Randomness trials against Golomb's postulates can then be carried out on the sequence produced by the RBF neural network algorithm above: in these trials, the proportion of 0s and 1s in the pseudo-random binary sequence is realized as 1:1, in accordance with Golomb's postulates. Table 2 lists the numbers of 0s and 1s and their ratio over many trial predictions.

Table 2. Numbers of 0s and 1s and their ratio (columns: iterations, number of 0s, number of 1s, ratio of 0s to 1s)

For the run property prediction, runs of length L account for a proportion of 1/2^L of the total number of runs. In Table 3, the forecast data is obtained from the above parameters after 1000 iterations.

Table 3.
Run properties (columns: run length, run number N, N0, N0/N, actual ratio, theoretical ratio)

From Table 3 it can be seen that the obtained prediction result approximates the theoretical value, subject to the limited statistical length of the time series in the prediction process. Hence, the prediction of the chaotic time series conforms to the expected run properties.

Figure 2. Diagram of autocorrelation
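The autocorrelation pictured in Figure 2 can be estimated from a finite segment; the sketch below uses a logistic-map series as an illustrative stand-in for the network-generated sequence (the map, seed and segment length are assumptions):

```python
import numpy as np

def autocorr(x, m):
    """Finite-N estimate of (1/N) * sum_i (x_i - mean)(x_{i+m} - mean)."""
    x = np.asarray(x, float)
    xm = x.mean()
    return np.mean((x[:len(x) - m] - xm) * (x[m:] - xm))

# generate a chaotic series with the logistic map in its chaotic regime
x, xs = 0.3, []
for _ in range(5000):
    x = 4.0 * x * (1.0 - x)
    xs.append(x)

# the zero-lag value (the variance) dominates the off-peak values
print(autocorr(xs, 0) > abs(autocorr(xs, 10)))
```

The sharp peak at lag 0 and small side lobes reproduce qualitatively the behavior reported for the quantized chaotic sequence.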
In the prediction of sequential autocorrelation and cross correlation, define the mean

xbar = lim_{N->inf} (1/N) * SUM_{i=0}^{N-1} x_i                          (10)

Then the autocorrelation function is:

ac(m) = lim_{N->inf} (1/N) * SUM_{i=0}^{N-1} (x_i - xbar)(x_{i+m} - xbar)   (11)

And the cross correlation function is:

cc(m) = lim_{N->inf} (1/N) * SUM_{i=0}^{N-1} (x1_i - xbar1)(x2_{i+m} - xbar2)  (12)

For two sequences with different initial values, chaotic time series can be generated by iterating the RBF neural network; let x1 and x2 be the binary sequences they correspond to, with m indicating the lag interval. For the quantized chaotic binary sequences, take sequential segments of length 1000, detect their correlation characteristics, and set the lag interval to 1500~2500. The aperiodic autocorrelation and cross correlation characteristics obtained are shown in Figures 2 and 3. The prediction result shows that both the autocorrelation side lobes and the cross correlation values are low.

Figure 3. Diagram of cross correlation

In this research, the results show that the chaotic time series obtained from the RBF neural network is a pseudo-random sequence whose prediction is uncertain enough to improve the performance of an encryption algorithm.

5. AN ANALYSIS OF APPLICATION BENEFIT

This research mainly uses Matlab to program the chaotic time series prediction based on the RBF neural network, applying the neural network algorithm to function approximation and sample prediction, and analyzing and comparing the obtained parameters. Concerning the adoption of chaotic sequences to encrypt data by XOR, the plain text of the experiment is as follows:

"Cryptology is the science of covert writing (cryptography), of its authorized decryption (cryptanalysis), and of the rules which are in turn intended to make that unauthorized decryption difficult (encryption security)."

Through the experimental analysis, the following probability statistics are obtained, as shown in Figures 4 and 5.
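The XOR encryption of the plain text can be sketched with a chaotic keystream; the logistic map below stands in for the paper's RBF-generated sequence, and the quantization to bytes is an illustrative assumption:

```python
from collections import Counter

plain = (b"Cryptology is the science of covert writing (cryptography), "
         b"of its authorized decryption (cryptanalysis), and of the rules "
         b"which are intended to make unauthorized decryption difficult.")

def keystream(n, x=0.3):
    """Quantize a logistic-map chaotic orbit into n keystream bytes."""
    out = []
    for _ in range(n):
        x = 4.0 * x * (1.0 - x)          # chaotic iteration
        out.append(int(x * 256) & 0xFF)  # quantize to one byte
    return bytes(out)

ks = keystream(len(plain))
cipher = bytes(p ^ k for p, k in zip(plain, ks))
decrypted = bytes(c ^ k for c, k in zip(cipher, ks))   # XOR is its own inverse

def max_freq(data):
    """Peak character frequency, as in the Figure 4/5 statistics."""
    return max(Counter(data).values()) / len(data)

print(decrypted == plain, max_freq(cipher) < max_freq(plain))
```

As in the experiment, the cipher text's peak character frequency is far lower than the plain text's, i.e., the character distribution flattens out after encryption.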
Figure 4. Statistical probability of plain text characters (maximum 0.048)

Figure 5. Statistical probability of cipher text characters

From Figures 4 and 5 it can be seen that after encryption the occurrence probabilities of cipher text characters have become much more uniform, with a far lower peak frequency than the plain text. The chaotic time series prediction method based on the RBF neural network thus increases performance and exerts a positive impact.

6. CONCLUSIONS

In summary, in chaotic time series prediction the accuracy of prediction can be improved on the basis of the RBF neural network, and this forecasting method plays an active role in practical applications. The tests show that the method effectively increases the difficulty of decryption and improves system performance. The chaotic dynamic system has such good pseudo-random properties that it can be well applied to stream cipher encryption. This technology is therefore worth promoting in practice.

REFERENCES

Chen D.Y., Liu Y., Ma X.Y. (2012) Parameter Joint Estimation of Phase Space Reconstruction in Chaotic Time Series based on Radial Basis Function Neural Networks, Acta Physica Sinica, 61(10).

Dong J.X., Li Q. (2012) Based on Genetic Algorithm Optimization RBF Neural Network for Predicting Chaotic Time Series, Bulletin of Science and Technology, 28(8), pp. 66-68.

Gan M., Peng H. (2010) Predicting Chaotic Time Series Using RBF-AR Model with Regression Weight, Systems Engineering and Electronics, 32(4).

Guo L.P., Yu J.N., Zhang X.D., Qi Y.J., Zhang J.G. (2012) Chaotic Time Series Forecasting Model based on the Improved RBFNN, Journal of Yunnan University of Nationalities (Natural Sciences Edition), 21(1).
Li R.G., Zhang H.L., Wang Y. (2015) New Orthogonal Basis Neural Network based on Quantum Particle Swarm Optimization Algorithm for Fractional Order Chaotic Time Series Single-Step Prediction, Journal of Computer Applications, 35(8).

Weng H., Pi D.C. (2014) Chaotic RBF Neural Network Anomaly Detection Algorithm, Computer Technology and Development, 24(7).

Wu K.J., Wang T.J. (2013) Prediction of Chaotic Time Series based on RBF Neural Network Optimization, Computer Engineering, 39(10).

Xu G.L. (2012) Prediction for Traffic Flow of RBF Neural Network Based On Cloud Genetic Algorithm, Computer Engineering and Applications, pp. 6-10.

Zhang C., Xu G.L. (2014) Prediction for Traffic Flow of RBF Neural Network based on Cloud Genetic Algorithm, Computer Engineering and Applications, 50(6), pp. 6-10.
Advanced Scence and Technology Letters, pp.21-30 http://dx.do.org/10.14257/astl.2014.78.05 Research on Route gudance of logstc schedulng problem under fuzzy tme wndow Yuqang Chen 1, Janlan Guo 2 * Department
More informationCHAPTER IV RESEARCH FINDING AND DISCUSSIONS
CHAPTER IV RESEARCH FINDING AND DISCUSSIONS A. Descrpton of Research Fndng. The Implementaton of Learnng Havng ganed the whole needed data, the researcher then dd analyss whch refers to the statstcal data
More informationCHALMERS, GÖTEBORGS UNIVERSITET. SOLUTIONS to RE-EXAM for ARTIFICIAL NEURAL NETWORKS. COURSE CODES: FFR 135, FIM 720 GU, PhD
CHALMERS, GÖTEBORGS UNIVERSITET SOLUTIONS to RE-EXAM for ARTIFICIAL NEURAL NETWORKS COURSE CODES: FFR 35, FIM 72 GU, PhD Tme: Place: Teachers: Allowed materal: Not allowed: January 2, 28, at 8 3 2 3 SB
More informationNatural Language Processing and Information Retrieval
Natural Language Processng and Informaton Retreval Support Vector Machnes Alessandro Moschtt Department of nformaton and communcaton technology Unversty of Trento Emal: moschtt@ds.untn.t Summary Support
More informationEfficient Weather Forecasting using Artificial Neural Network as Function Approximator
Effcent Weather Forecastng usng Artfcal Neural Network as Functon Approxmator I. El-Fegh, Z. Zuba and S. Abozgaya Abstract Forecastng s the referred to as the process of estmaton n unknown stuatons. Weather
More informationMDL-Based Unsupervised Attribute Ranking
MDL-Based Unsupervsed Attrbute Rankng Zdravko Markov Computer Scence Department Central Connectcut State Unversty New Brtan, CT 06050, USA http://www.cs.ccsu.edu/~markov/ markovz@ccsu.edu MDL-Based Unsupervsed
More informationOnline Classification: Perceptron and Winnow
E0 370 Statstcal Learnng Theory Lecture 18 Nov 8, 011 Onlne Classfcaton: Perceptron and Wnnow Lecturer: Shvan Agarwal Scrbe: Shvan Agarwal 1 Introducton In ths lecture we wll start to study the onlne learnng
More informationStudy on Non-Linear Dynamic Characteristic of Vehicle. Suspension Rubber Component
Study on Non-Lnear Dynamc Characterstc of Vehcle Suspenson Rubber Component Zhan Wenzhang Ln Y Sh GuobaoJln Unversty of TechnologyChangchun, Chna Wang Lgong (MDI, Chna [Abstract] The dynamc characterstc
More informationCollege of Computer & Information Science Fall 2009 Northeastern University 20 October 2009
College of Computer & Informaton Scence Fall 2009 Northeastern Unversty 20 October 2009 CS7880: Algorthmc Power Tools Scrbe: Jan Wen and Laura Poplawsk Lecture Outlne: Prmal-dual schema Network Desgn:
More informationParameter Estimation for Dynamic System using Unscented Kalman filter
Parameter Estmaton for Dynamc System usng Unscented Kalman flter Jhoon Seung 1,a, Amr Atya F. 2,b, Alexander G.Parlos 3,c, and Klto Chong 1,4,d* 1 Dvson of Electroncs Engneerng, Chonbuk Natonal Unversty,
More informationSEASONAL TIME SERIES PREDICTION WITH ARTIFICIAL NEURAL NETWORKS AND LOCAL MEASURES. R. Pinto, S. Cavalieri
SEASONAL TIME SERIES PREDICTION WITH ARTIFICIAL NEURAL NETWORKS AND LOCAL MEASURES R. Pnto, S. Cavaler Unversty of Bergamo, Department of Industral Engneerng, Italy vale Marcon, 5. I-24044 Dalmne BG Ph.:
More informationIntroduction to the Introduction to Artificial Neural Network
Introducton to the Introducton to Artfcal Neural Netork Vuong Le th Hao Tang s sldes Part of the content of the sldes are from the Internet (possbly th modfcatons). The lecturer does not clam any onershp
More informationImprovement of Histogram Equalization for Minimum Mean Brightness Error
Proceedngs of the 7 WSEAS Int. Conference on Crcuts, Systems, Sgnal and elecommuncatons, Gold Coast, Australa, January 7-9, 7 3 Improvement of Hstogram Equalzaton for Mnmum Mean Brghtness Error AAPOG PHAHUA*,
More informationEvaluation of classifiers MLPs
Lecture Evaluaton of classfers MLPs Mlos Hausrecht mlos@cs.ptt.edu 539 Sennott Square Evaluaton For any data set e use to test the model e can buld a confuson matrx: Counts of examples th: class label
More informationFault Diagnosis of Autonomous Underwater Vehicles
Research Journal of Appled Scences, Engneerng and Technology 5(6): 407-4076, 03 SSN: 040-7459; e-ssn: 040-7467 Maxwell Scentfc Organzaton, 03 Submtted: March 3, 0 Accepted: January, 03 Publshed: Aprl 30,
More informationStatistics for Economics & Business
Statstcs for Economcs & Busness Smple Lnear Regresson Learnng Objectves In ths chapter, you learn: How to use regresson analyss to predct the value of a dependent varable based on an ndependent varable
More informationMLE and Bayesian Estimation. Jie Tang Department of Computer Science & Technology Tsinghua University 2012
MLE and Bayesan Estmaton Je Tang Department of Computer Scence & Technology Tsnghua Unversty 01 1 Lnear Regresson? As the frst step, we need to decde how we re gong to represent the functon f. One example:
More informationAN IMPROVED PARTICLE FILTER ALGORITHM BASED ON NEURAL NETWORK FOR TARGET TRACKING
AN IMPROVED PARTICLE FILTER ALGORITHM BASED ON NEURAL NETWORK FOR TARGET TRACKING Qn Wen, Peng Qcong 40 Lab, Insttuton of Communcaton and Informaton Engneerng,Unversty of Electronc Scence and Technology
More informationReview of Taylor Series. Read Section 1.2
Revew of Taylor Seres Read Secton 1.2 1 Power Seres A power seres about c s an nfnte seres of the form k = 0 k a ( x c) = a + a ( x c) + a ( x c) + a ( x c) k 2 3 0 1 2 3 + In many cases, c = 0, and the
More informationKristin P. Bennett. Rensselaer Polytechnic Institute
Support Vector Machnes and Other Kernel Methods Krstn P. Bennett Mathematcal Scences Department Rensselaer Polytechnc Insttute Support Vector Machnes (SVM) A methodology for nference based on Statstcal
More informationPop-Click Noise Detection Using Inter-Frame Correlation for Improved Portable Auditory Sensing
Advanced Scence and Technology Letters, pp.164-168 http://dx.do.org/10.14257/astl.2013 Pop-Clc Nose Detecton Usng Inter-Frame Correlaton for Improved Portable Audtory Sensng Dong Yun Lee, Kwang Myung Jeon,
More informationRadial-Basis Function Networks
Radal-Bass uncton Networs v.0 March 00 Mchel Verleysen Radal-Bass uncton Networs - Radal-Bass uncton Networs p Orgn: Cover s theorem p Interpolaton problem p Regularzaton theory p Generalzed RBN p Unversal
More informationMotion Perception Under Uncertainty. Hongjing Lu Department of Psychology University of Hong Kong
Moton Percepton Under Uncertanty Hongjng Lu Department of Psychology Unversty of Hong Kong Outlne Uncertanty n moton stmulus Correspondence problem Qualtatve fttng usng deal observer models Based on sgnal
More informationIV. Performance Optimization
IV. Performance Optmzaton A. Steepest descent algorthm defnton how to set up bounds on learnng rate mnmzaton n a lne (varyng learnng rate) momentum learnng examples B. Newton s method defnton Gauss-Newton
More informationPCA-PSO-BP Neural Network Application in IDS Lan SHI a, YanLong YANG b JanHui LV c
Internatonal Power, Electroncs and Materals Engneerng Conference (IPEMEC 2015) PCA-PSO-BP Neural Network Applcaton n IDS Lan SHI a, YanLong YANG b JanHu LV c College of Informaton Scence and Engneerng,
More informationThe Study of Teaching-learning-based Optimization Algorithm
Advanced Scence and Technology Letters Vol. (AST 06), pp.05- http://dx.do.org/0.57/astl.06. The Study of Teachng-learnng-based Optmzaton Algorthm u Sun, Yan fu, Lele Kong, Haolang Q,, Helongang Insttute
More informationPulse Coded Modulation
Pulse Coded Modulaton PCM (Pulse Coded Modulaton) s a voce codng technque defned by the ITU-T G.711 standard and t s used n dgtal telephony to encode the voce sgnal. The frst step n the analog to dgtal
More informationPower law and dimension of the maximum value for belief distribution with the max Deng entropy
Power law and dmenson of the maxmum value for belef dstrbuton wth the max Deng entropy Bngy Kang a, a College of Informaton Engneerng, Northwest A&F Unversty, Yanglng, Shaanx, 712100, Chna. Abstract Deng
More informationHomework Assignment 3 Due in class, Thursday October 15
Homework Assgnment 3 Due n class, Thursday October 15 SDS 383C Statstcal Modelng I 1 Rdge regresson and Lasso 1. Get the Prostrate cancer data from http://statweb.stanford.edu/~tbs/elemstatlearn/ datasets/prostate.data.
More informationLecture 23: Artificial neural networks
Lecture 23: Artfcal neural networks Broad feld that has developed over the past 20 to 30 years Confluence of statstcal mechancs, appled math, bology and computers Orgnal motvaton: mathematcal modelng of
More informationAn Evolutionary Method of Neural Network in System Identification
Internatonal Journal of Intellgent Informaton Systems 206; 5(5): 75-8 http://www.scencepublshnggroup.com//s do: 0.648/.s.2060505.4 ISSN: 2328-7675 (Prnt); ISSN: 2328-7683 (Onlne) An Evolutonary Method
More informationLinear Feature Engineering 11
Lnear Feature Engneerng 11 2 Least-Squares 2.1 Smple least-squares Consder the followng dataset. We have a bunch of nputs x and correspondng outputs y. The partcular values n ths dataset are x y 0.23 0.19
More informationEcon107 Applied Econometrics Topic 3: Classical Model (Studenmund, Chapter 4)
I. Classcal Assumptons Econ7 Appled Econometrcs Topc 3: Classcal Model (Studenmund, Chapter 4) We have defned OLS and studed some algebrac propertes of OLS. In ths topc we wll study statstcal propertes
More informationBearing Remaining Useful Life Prediction Based on an Improved Back Propagation Neural Network
nternatonal Journal of Performablty Engneerng, Vol. 10, No. 6, September 2014, pp.653-657. RAMS Consultants Prnted n nda Bearng Remanng Useful Lfe Predcton Based on an mproved Back Propagaton Neural Network
More informationTransfer Functions. Convenient representation of a linear, dynamic model. A transfer function (TF) relates one input and one output: ( ) system
Transfer Functons Convenent representaton of a lnear, dynamc model. A transfer functon (TF) relates one nput and one output: x t X s y t system Y s The followng termnology s used: x y nput output forcng
More informationQueueing Networks II Network Performance
Queueng Networks II Network Performance Davd Tpper Assocate Professor Graduate Telecommuncatons and Networkng Program Unversty of Pttsburgh Sldes 6 Networks of Queues Many communcaton systems must be modeled
More informationPredictive Analytics : QM901.1x Prof U Dinesh Kumar, IIMB. All Rights Reserved, Indian Institute of Management Bangalore
Sesson Outlne Introducton to classfcaton problems and dscrete choce models. Introducton to Logstcs Regresson. Logstc functon and Logt functon. Maxmum Lkelhood Estmator (MLE) for estmaton of LR parameters.
More informationAssessment of Site Amplification Effect from Input Energy Spectra of Strong Ground Motion
Assessment of Ste Amplfcaton Effect from Input Energy Spectra of Strong Ground Moton M.S. Gong & L.L Xe Key Laboratory of Earthquake Engneerng and Engneerng Vbraton,Insttute of Engneerng Mechancs, CEA,
More information