IDENTIFICATION OF NONLINEAR SYSTEM VIA SVR OPTIMIZED BY PARTICLE SWARM ALGORITHM


Journal of Theoretical and Applied Information Technology, 20th February 2013. Vol. 48 No. 2. © 2005 - 2013 JATIT & LLS. All rights reserved. ISSN: 1992-8645  www.jatit.org  E-ISSN: 1817-3195

IDENTIFICATION OF NONLINEAR SYSTEM VIA SVR OPTIMIZED BY PARTICLE SWARM ALGORITHM

XIANFANG WANG, JIALE DONG, YUANYUAN ZHANG, 3 ZHIYONG DU

1 College of Computer & Information Engineering, Henan Normal University, Xinxiang 453007, China
2 Henan Province Colleges and Universities Engineering Technology Research Center for Computing Intelligence & Data Mining, Xinxiang 453007, Henan, China
3 Henan Mechanical and Electrical Engineering College, Xinxiang, Henan, China

E-mail: xfwang@yahoo.com.cn

ABSTRACT

Given the influence that the selection of the regression parameters has on the accuracy of an SVR model and on its learning and generalization ability, this article adopts the particle swarm optimization algorithm to build the SVR model and applies it to the modeling of nonlinear system identification. Simulation experiments show that this model identifies the system more accurately and has stronger learning and generalization ability than a GA-optimized one. In addition, they demonstrate that nonlinear system identification based on the PSO-SVR algorithm can be considerably effective.

Keywords: Particle Swarm Optimization (PSO), Support Vector Regression (SVR), Nonlinear System

1. INTRODUCTION

With the rapid development of computer technology and control theory, system identification has become a subject of tremendous importance, widely used in daily life and industrial production. It can be classified into linear system identification and nonlinear system identification. The theory of linear system identification has gradually matured. However, the field of nonlinear system identification still has much room for improvement, because the diversity and complexity of nonlinear systems make it difficult to establish an accurate model. As most practical systems are nonlinear, nonlinear system identification will remain an important direction for further research in the area of system identification.
Theories based mainly on neural networks are effective tools for solving nonlinear system identification problems, but they are not flawless, suffering from overfitting, local extrema, slow convergence, and a strong dependence on the quantity and quality of data. The SVM (Support Vector Machine) [1], a machine learning algorithm based on statistical learning theory, obtains the global optimum solution without local extrema by using the structural risk minimization principle, which gives it distinct advantages in practical problems involving nonlinearity, small samples, and high dimension [2]. SVR (Support Vector Regression) is a regression algorithm established on the basis of SVM and applied to function regression. As the selection of the regression parameters (ε, C, γ) has an enormous influence on the accuracy of the SVR model and on its learning and generalization ability in nonlinear support vector regression estimation, it is necessary to optimize these parameters. The GA (Genetic Algorithm) can be applied to this parameter optimization, but, due to its computational complexity, it is not efficient enough in searching for the optimal solution. PSO (Particle Swarm Optimization), which has stronger global search ability and faster convergence than GA, can optimize multiple parameters at the same time, allowing the model to achieve a better regression effect. Therefore, this article uses PSO to obtain the optimal parameters and model the SVR, which is then applied to nonlinear system identification in a MATLAB simulation experiment.

2. ALGORITHM THEORY

2.1 Particle Swarm Optimization Algorithm

PSO is a swarm intelligence optimization algorithm first proposed by Kennedy and Eberhart in 1995 [3]. The PSO algorithm establishes a simple velocity and displacement model to carry out the optimization in the solution space without adjusting

parameters. The algorithm is therefore easy to implement, converges quickly, and holds advantages over other optimization algorithms. The basic idea of the PSO algorithm is this: a group of particles is initialized over the entire solution space, and each particle, characterized by its velocity, position, and fitness value, is a candidate optimal solution to the problem. The particles then dynamically adjust and update their positions according to their own movement experience and that of the particles around them. At each step, the individual extremum (pbest) and the group extremum (gbest) are renewed by comparing the fitness value of the new particle with the fitness values of the current individual and group extrema, and each particle's velocity and position are updated accordingly. The update formulas are as follows:

v_i^(k+1) = w·v_i^(k) + c_1·rand_1·(pbest_i^(k) − x_i^(k)) + c_2·rand_2·(gbest^(k) − x_i^(k))    (1)

x_i^(k+1) = x_i^(k) + v_i^(k+1)    (2)

where k denotes the current iteration number; v_i^(k) and v_i^(k+1) are the particle's current velocity and its velocity in the next generation; x_i^(k) and x_i^(k+1) are the particle's current position and its position in the next generation; w, the inertia weight, determines the impact of the historical velocity on the current one; the non-negative constants c_1 and c_2 are learning factors; rand_1 and rand_2 are random numbers between 0 and 1; and pbest_i^(k) and gbest^(k) are the individual extremum and the global extremum at the current iteration [4].

2.2 Support Vector Regression

SVM, a machine learning algorithm based on statistical learning theory, was initially proposed for classification problems by Vapnik et al. SVR, built on the basis of SVM by introducing a loss function, is applied to regression learning [5]. First, linear regression is discussed. A linear function f(x) = w·x + b is used to fit the training sample set {(x_i, y_i)}, i = 1, 2, …, l, where w is the weight vector, b is the bias, x_i denotes the input vector, and y_i is the output value for x_i.
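The quantities just introduced can be illustrated with a minimal sketch of the linear model and the ε-insensitive criterion that SVR trains it under. The function names `f` and `eps_insensitive` and all numeric values here are illustrative choices, not anything from the paper:

```python
def f(x, w, b):
    # Linear SVR model: f(x) = w·x + b
    return sum(wi * xi for wi, xi in zip(w, x)) + b

def eps_insensitive(y, y_hat, eps):
    # Deviations inside the ε-tube cost nothing; outside it, only the
    # excess beyond ε is charged: max(0, |y − f(x)| − ε).
    return max(0.0, abs(y - y_hat) - eps)

# Example with arbitrary values: a point inside the tube incurs zero loss.
w, b, eps = [0.5, -1.0], 0.2, 0.1
x, y = [1.0, 0.5], 0.25
y_hat = f(x, w, b)                     # 0.5*1.0 - 1.0*0.5 + 0.2 = 0.2
loss = eps_insensitive(y, y_hat, eps)  # |0.25 - 0.2| = 0.05 ≤ ε, so loss = 0.0
```

The tube width ε is exactly the quantity that the slack variables of the following optimization problem relax.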
Slack variables ξ_i and ξ_i* are introduced to tolerate errors in the fitting, and the modeling problem is thus transformed into the optimization problem:

min_{w,b,ξ,ξ*}  (1/2)·‖w‖² + C·Σ_{i=1..l} (ξ_i + ξ_i*)    (3)

s.t.  y_i − (w·x_i + b) ≤ ε + ξ_i
      (w·x_i + b) − y_i ≤ ε + ξ_i*
      ξ_i, ξ_i* ≥ 0 (i = 1, 2, …, l),  C > 0    (4)

where C denotes the penalty coefficient and ε is the insensitive loss parameter: when the difference between f(x_i) and y_i is less than ε, the loss is taken to be zero; otherwise, the loss is |f(x_i) − y_i| − ε. To solve this mathematical optimization problem, which is a convex quadratic program, more easily, the Lagrange function and the duality principle are used, yielding the dual form:

max  L(α, α*) = −ε·Σ_{i=1..l} (α_i + α_i*) + Σ_{i=1..l} y_i·(α_i − α_i*)
               − (1/2)·Σ_{i,j=1..l} (α_i − α_i*)·(α_j − α_j*)·(x_i · x_j)    (5)

s.t.  Σ_{i=1..l} (α_i − α_i*) = 0,  0 ≤ α_i, α_i* ≤ C,  i = 1, 2, …, l    (6)

By solving for the Lagrange multipliers α_i and α_i*, the function to be estimated is obtained:

f(x) = Σ_{i=1..l} (α_i − α_i*)·(x_i · x) + b    (7)

According to Equation (7), the linear regression function is:

f(x) = w·x + b = Σ_{i=1..l} (α_i − α_i*)·(x_i · x) + b    (8)

Next, nonlinear regression is discussed. A nonlinear transformation is adopted to map the data into a high-dimensional space, turning the task into a linear regression problem there. A kernel function K(x_i, x_j) is introduced to calculate the inner product ψ(x_i)·ψ(x_j) in the high-dimensional feature space, and the nonlinear regression function is as follows:

f(x) = Σ_{i=1..l} (α_i − α_i*)·K(x_i, x) + b    (9)

There are many common kernel functions, such as the radial basis function (RBF), the polynomial kernel, and the linear kernel. According to previous research and experiments, the RBF gives desirable results in most cases, which is why the Gaussian RBF

K(x, x_i) = exp(−‖x − x_i‖² / γ²)

is chosen as the kernel function in this paper.

3. SVR BASED ON PSO ALGORITHM

In nonlinear support vector regression estimation we are mainly concerned with the optimization of the insensitive loss parameter ε, the penalty coefficient C, and the parameter γ of the kernel function K(x, x_i), which are decisive for the generalization ability and learning accuracy of the SVR model [6]. Among the three parameters, ε affects the model accuracy: the smaller ε is, the more support vectors there are and the more accurate the model is likely to be. C has a great influence on the generalization ability of the model: as C rises, the degree of fit to the data tends to increase, but the generalization ability decreases. γ likewise affects the learning accuracy of the model. In order to find the optimal parameter combination for the SVR model, the PSO algorithm is used to optimize the three-dimensional parameter vector (ε, C, γ) [7]. The velocity and position of each particle are determined by this three-dimensional parameter, and the mean square error (MSE), which reflects the performance of the SVR regression, is chosen as the fitness function Ft [8]:

Ft = MSE = (1/l)·Σ_{i=1..l} (y_i − ŷ_i)²    (10)

where l denotes the total number of samples, y_i is the actual value of the i-th sample, and ŷ_i is the corresponding output of the SVR model for the i-th sample.

The detailed steps of PSO-based parameter selection for SVR are as follows [9]:

(1) Determine the input vector and output vector of the SVR.

(2) Adopt the PSO algorithm to find the parameters (ε, C, γ) of the SVR model. Firstly, initialize the velocity and position of each particle, set the algorithm's iteration number, and determine the population size.
Secondly, calculate the fitness value of each particle, then determine the individual extrema and the group extremum from the fitness values of the initial particles.

Thirdly, update the velocity and position of each particle according to (1) and (2), and renew the individual extrema and the global extremum based on the fitness values of the particles in the new population.

Finally, if the termination condition is satisfied, i.e. the predetermined fitness value or the maximum iteration number has been reached, the optimization ends; otherwise, return to the calculation of the particles' fitness values.

(3) Based on the optimal parameter combination obtained from the above steps, establish the SVR model.

4. SIMULATION EXPERIMENT

4.1 Simulation Object

In order to verify the effectiveness of applying SVR optimized by the particle swarm algorithm to a nonlinear system, the SISO nonlinear system from reference [10] is used in this paper:

y(k+1) = 0.5·y(k)·y(k−1) / [1 + y²(k) + y²(k−1)] + 0.35·sin[y(k) + y(k−1)] + 1.2·u(k)    (11)

4.2 The Selection of Parameters

First of all, PSO is adopted to optimize the parameters after the operating parameters of the algorithm, namely the particle number, the iteration number, and the learning factors c_1 and c_2, are set. The actual values y of the system and the outputs ŷ of the SVR model are plugged into formula (10) to compute the MSE at each iteration. The minimum MSE of .349 is then obtained when the iteration count reaches 39, and the corresponding optimal parameter combination (ε, C, γ) is thereby determined. The fitness curve of the PSO parameter search is shown in Figure 1. To verify the result of PSO, the GA algorithm is also used to optimize the parameters, giving the fitness curve of GA shown in Figure 2.
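The parameter-search procedure of Section 3 can be sketched as a generic PSO minimization loop. The quadratic `demo_fit` at the bottom is only a stand-in for the SVR mean-square-error fitness of Equation (10) (in the paper, scoring a particle would mean training an SVR with that particle's (ε, C, γ) and measuring its MSE), and the inertia weight, learning factors, and search bounds are commonly used defaults, not the paper's settings:

```python
import random

def pso_search(fitness, lo, hi, n_particles=20, n_iters=100,
               w=0.729, c1=1.49445, c2=1.49445):
    """Minimize `fitness` over the box [lo_d, hi_d] per dimension,
    following the update rules (1)-(2)."""
    dim = len(lo)
    x = [[random.uniform(lo[d], hi[d]) for d in range(dim)]
         for _ in range(n_particles)]
    v = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in x]                      # individual extrema (pbest)
    pfit = [fitness(p) for p in pbest]
    g = min(range(n_particles), key=pfit.__getitem__)
    gbest, gfit = pbest[g][:], pfit[g]             # group extremum (gbest)

    for _ in range(n_iters):
        for i in range(n_particles):
            for d in range(dim):
                # Equation (1): inertia + cognitive pull + social pull
                v[i][d] = (w * v[i][d]
                           + c1 * random.random() * (pbest[i][d] - x[i][d])
                           + c2 * random.random() * (gbest[d] - x[i][d]))
                # Equation (2), clamped to the search box
                x[i][d] = min(hi[d], max(lo[d], x[i][d] + v[i][d]))
            fi = fitness(x[i])
            if fi < pfit[i]:                       # renew individual extremum
                pbest[i], pfit[i] = x[i][:], fi
                if fi < gfit:                      # renew group extremum
                    gbest, gfit = x[i][:], fi
    return gbest, gfit

# Stand-in for the MSE fitness of Equation (10); a real run would train an
# SVR per particle (ε, C, γ) and return its validation MSE.
demo_fit = lambda p: sum(t * t for t in p)
```

For the three-dimensional particle (ε, C, γ), `lo` and `hi` would hold lower and upper bounds for each of the three parameters.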

[Figure 1: The fitness curve of PSO]

[Figure 2: The fitness curve of GA]

Table 1: Comparison between PSO and GA

  Iteration    MSE (PSO)    MSE (GA)
      -          .84          .36798
      -           -           .5336
      -          .94          .939499
     30          .3858        .939499
     39          .349         .89956
     60          .349         .8866
      -          .349         .735759

Through comparison, it can be seen that both GA and PSO optimize the parameters iteratively, but the MSE of PSO is always lower than that of GA at the same iteration. Moreover, the optimal solution with PSO has already appeared at iteration 39, whereas the optimal solution with GA does not surface until the maximum iteration number is reached, which indicates that PSO has better convergence and takes less time than GA. That is why the PSO algorithm is introduced to optimize the parameters.

4.3 Nonlinear System Identification Based on PSO-SVR

We plug the above optimal parameters into the given model and set the amplitude of the white noise input signal. Figure 3 and Figure 4 show the model output and the model error under the white noise signal.

[Figure 3: The Model Output In White Noise Signal]

[Figure 4: The Model Error In White Noise Signal]

Figure 4 shows that the error of the model is kept within the order of magnitude of 10^-3, which indicates that the model has high accuracy. A random signal (amplitude 0.8), a sinusoidal signal (0.4·sin(2πt) + 0.4), and a square wave signal (0.4·sgn[sin(2πt)] + 0.4) are used to verify the generalization ability of the model, and the results are shown in Figure 5 - Figure 7.
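The checking experiment can be sketched by generating the three excitation signals of Section 4.3 and rolling them through the benchmark system (11) to produce reference outputs. The coefficients in `plant()` and the exact signal amplitudes are reconstructed from a garbled scan and should be read as assumptions:

```python
import math, random

def plant(y1, y2, u):
    # One step of the benchmark SISO system (11); the coefficients
    # 0.5, 0.35, 1.2 are reconstructions, i.e. assumptions.
    return (0.5 * y1 * y2 / (1.0 + y1 ** 2 + y2 ** 2)
            + 0.35 * math.sin(y1 + y2) + 1.2 * u)

def rollout(u_seq, y1=0.0, y2=0.0):
    # Iterate y(k+1) = plant(y(k), y(k-1), u(k)) over an input sequence.
    out = []
    for u in u_seq:
        y1, y2 = plant(y1, y2, u), y1
        out.append(y1)
    return out

# Checking signals (amplitudes/forms as reconstructed from the text):
t = [k * 0.01 for k in range(200)]
random_u = [random.uniform(-0.8, 0.8) for _ in t]                # random, amplitude 0.8
sine_u = [0.4 * math.sin(2 * math.pi * tk) + 0.4 for tk in t]    # sinusoidal
square_u = [0.4 * math.copysign(1.0, math.sin(2 * math.pi * tk)) + 0.4
            for tk in t]                                         # square wave
```

The generalization check then compares `rollout(...)` of the true system against the trained PSO-SVR model's predictions on the same inputs.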

[Figure 5: The Error Checking In Random Signal]

[Figure 6: The Error Checking In Sinusoidal Signal]

[Figure 7: The Error Checking In Square Wave Signal]

As seen in Figures 5-7, the error remains below the order of magnitude of 10^-2 under the different checking signals, which shows that the model also has good generalization ability.

5. CONCLUSION

As the optimized selection of the parameters (ε, C, γ) exerts a great influence on the regression accuracy of the SVR model and on its learning and generalization ability in nonlinear support vector regression estimation, it is necessary to optimize these parameters. Therefore, PSO is introduced in this paper to obtain the optimal parameters and model the SVR, which is then applied to the modeling of nonlinear system identification. The simulation results show that PSO converges better and optimizes more efficiently than GA, which gives the model higher identification accuracy and stronger learning and generalization ability. Although the model takes less time than GA when the parameters of the SVR are optimized, the total runtime of the program is still rather long, which means the model has yet to be improved.

ACKNOWLEDGEMENTS

This work is supported by the National Natural Science Foundation of China (No. 6737), the Science and Technology Research Project of Henan Province (No. 79), the Foundation and Frontier Technology Research Programs of Henan Province (No. 44387), the Innovation Talent Support Program of Henan Province Universities (No. HASTIT), and the Doctoral Started Project of Henan Normal University (No. 39).

REFERENCES:

[1] Vapnik V., The Nature of Statistical Learning Theory, Wiley-Interscience, New York, 1998.

[2] Ding Shifei, Qi Bingjuan, Tan Hongyan, "An overview on theory and algorithm of support vector machines", Journal of University of Electronic Science and Technology of China, Vol. 40, No. 1, 2011.
[3] Kennedy J., Eberhart R. C., "A discrete binary version of the particle swarm algorithm", IEEE Service Center, NJ, 1995.

[4] Li Aiguo, Qin Zheng, Bao Fumin, He Shengping, "Particle swarm optimization algorithms", Computer Engineering and Applications.

[5] Desai K., Badhe Y., Tambe S. S., et al., "Soft-sensor development for fed-batch bioreactors using support vector regression", Biochemical Engineering Journal, Vol. 27, No. 3, 2006, pp. 225-239.

[6] Ustun B., Melssen W. J., "Determination of optimal support vector regression parameters by genetic algorithms and simplex optimization", Analytica Chimica Acta, Vol. 544, 2005, pp. 292-305.

[7] Hou Yongqiang, Wang Zengbao, "Active disturbance rejection controller parameters optimization based on particle swarm optimization", Computer & Digital Engineering, Vol. 40, 2012.

[8] Mao Zhang, Liu Chunbo, Pan Feng, "Parameter selection and application of SVM with mixture kernels based on IPSO", Journal of Jiangnan University (Natural Science Edition), Vol. 8, No. 6, 2009.

[9] Chen Shu, Xu Baoguo, Wang Haixia, Wu Xiaopeng, "Study of fermentation process based on PSO-SVR", Computer Engineering and Applications, Vol. 43, No. 9, 2007.

[10] Zhang Haoran, Han Zhengzhi, Li Changgang, "Support vector machine based nonlinear systems identification", Journal of System Simulation, Vol. 15, 2003.