Comparative Study between FPA, BA, MCS, ABC, and PSO Algorithms in Training and Optimizing of LS-SVM for Stock Market Prediction


Osman Hegazy 1*, Omar S. Soliman 1 and Mustafa Abdul Salam 2
Faculty of Computers and Information, Cairo University, Egypt 1
Higher Technological Institute (H.T.I), Tenth of Ramadan City, Egypt 2

Received: 16-February-2015; Revised: 22-March-2015; Accepted: 28-March-2015
©2015 ACCENTS

Abstract

In this paper, five recent nature-inspired algorithms are proposed to optimize and train the Least Squares Support Vector Machine (LS-SVM). These algorithms are the Flower Pollination Algorithm (FPA), Bat Algorithm (BA), Modified Cuckoo Search (MCS), Artificial Bee Colony (ABC), and Particle Swarm Optimization (PSO). They are used to automatically select the best combination of free parameters for LS-SVM. Six financial technical indicators derived from stock historical data are used as inputs to the proposed models. Standard LS-SVM and ANN are used as benchmarks for comparison with the proposed models, which are tested on six datasets representing different sectors of the S&P 500 stock market and applied to predict daily, weekly, and monthly stock prices. The results presented in this paper show that the proposed models converge quickly in the early iterations, achieve better accuracy than the compared methods in price and trend prediction, and overcome the over-fitting and local-minima problems found in ANN and standard LS-SVM.

Keywords: Least Squares Support Vector Machine, Flower Pollination Algorithm, Bat Algorithm, Modified Cuckoo Search, Artificial Bee Colony, Particle Swarm Optimization, stock market prediction.

1. Introduction

Stock market prediction is the process of attempting to specify the future value of a company's stock based on its historical data. It has long been a focus of research, since a good prediction can maximize an investor's gains. Predicting the stock market is not an easy task, because stock market data are variable, nonlinear, volatile, and close to a random walk.
Choosing a suitable training and prediction method also remains a critical problem [1]. Technical indicators were among the first methods used to predict stock market trend and price [2]. They are mathematical functions that use stock historical data to determine the future price, and they fall into two classes: oscillators, or leading indicators, and lagging indicators [2]. Leading indicators are designed to lead price movements. Lagging indicators follow the price action and are referred to as trend-following indicators. Artificial Neural Networks (ANNs) are among the machine learning techniques most commonly used in stock market prediction. In most cases ANNs suffer from over-fitting due to the large number of parameters to fix and the little prior user knowledge about the relevance of the inputs to the analysed problem [3]. Support vector machines (SVMs) have been developed as an alternative that avoids these limitations. SVMs compute globally optimal solutions, unlike ANNs, which tend to fall into local minima [4]. The least squares support vector machine (LS-SVM) presented in [5] is a reformulation of the traditional SVM algorithm. Although LS-SVM simplifies the SVM procedure, the regularization parameter and the kernel parameters play an important role in the regression system. It is therefore necessary to establish a methodology for properly selecting the LS-SVM free parameters.

*Author for correspondence
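The parameter-selection problem can be made concrete with a small sketch. The code below (an illustration, not the paper's implementation) trains an RBF-kernel LS-SVM regressor by solving its linear system, and tunes (C, σ) with a plain random search standing in for the metaheuristics compared later; the function names, search ranges, and 70/30 split are assumptions for the example.

```python
import numpy as np

def lssvm_fit(X, y, C, sigma):
    """Train an LS-SVM regressor with an RBF kernel by solving the
    linear KKT system [[0, 1^T], [1, K + I/C]] [b; a] = [0; y]."""
    n = len(y)
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / sigma**2)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / C
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[1:], sol[0]          # support values a, bias b

def lssvm_predict(X_train, a, b, sigma, X_new):
    """f(x) = sum_i a_i K(x, x_i) + b."""
    d2 = ((X_new[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / sigma**2) @ a + b

def random_search(X, y, n_trials=200, seed=0):
    """Hypothetical stand-in for FPA/BA/MCS/ABC/PSO: random search
    over (C, sigma), scored by RMSE on a held-out 30% split."""
    rng = np.random.default_rng(seed)
    split = int(0.7 * len(y))
    best = (np.inf, None)
    for _ in range(n_trials):
        C = 10 ** rng.uniform(-1, 3)
        sigma = 10 ** rng.uniform(-1, 1)
        a, b = lssvm_fit(X[:split], y[:split], C, sigma)
        pred = lssvm_predict(X[:split], a, b, sigma, X[split:])
        rmse = np.sqrt(np.mean((pred - y[split:]) ** 2))
        if rmse < best[0]:
            best = (rmse, (C, sigma))
    return best
```

Any of the population-based algorithms described in the following sections can replace `random_search` while reusing the same fit/score loop as the fitness function.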

The perceived advantages of evolutionary strategies as optimization methods motivated the authors to consider such stochastic methods in the context of optimizing SVMs. A survey and overview of evolutionary algorithms (EAs) is found in [6]. The EAs, or nature-inspired algorithms, used in this work are the Flower Pollination Algorithm (FPA), Bat Algorithm (BA), Cuckoo Search (CS), Artificial Bee Colony (ABC), and Particle Swarm Optimization (PSO).

The Flower Pollination Algorithm (FPA) was proposed by Yang in 2012 [7]. It is inspired by the pollination process of flowers [7]. The main purpose of a flower is ultimately reproduction via pollination. Flower pollination is typically associated with the transfer of pollen, and such transfer is often linked with pollinators such as insects, birds, bats and other animals [8]. In [9] FPA was applied successfully to different economic load dispatch problems. In [10] FPA was applied to nonlinear algebraic systems with multiple solutions. In [11] a binary FPA was applied to feature selection. In [12] the FPA algorithm was used for multiobjective optimization. In [13] a study of the FPA algorithm for continuous optimization is presented. In [14] an FPA algorithm with dimension-by-dimension improvement is introduced.

The Bat Algorithm (BA) was proposed by Yang in 2010 [15]. It is considered a new meta-heuristic algorithm for continuous optimization. BA is based on the fascinating capability of microbats (echolocation) to find their prey and discriminate between different types of insects even in complete darkness. BA has been demonstrated to outperform some well-known nature-inspired optimization techniques such as the GA and PSO algorithms [15]. BA has been applied to continuous optimization in the context of engineering design; it can deal with highly nonlinear problems efficiently and can find optimal solutions accurately [16]. Case studies include pressure vessel design, car side design, spring and beam design, truss systems, tower and tall building design, and others. BA can also handle multiobjective problems effectively [17]. In [18], a detailed study of combined economic load and emission dispatch problems using the bat algorithm is presented; the authors concluded that the bat algorithm is easy to implement and much superior to the ABC and GA algorithms in terms of accuracy and efficiency. In [19] a comparison study of the bat algorithm with PSO, GA, and other algorithms in the context of e-learning is presented; it concluded that the bat algorithm has clear advantages over the other algorithms.

International Journal of Advanced Computer Research

In [20] a New Bat Based Back-Propagation (BA-BP) algorithm is presented; the authors suggested that the computational efficiency of the BPNN training process is highly enhanced when combined with the BA algorithm.

The Cuckoo Search (CS) algorithm was proposed by Yang and Deb in 2009 [21]. It is considered a nature-inspired meta-heuristic algorithm for continuous optimization [21]. CS is based on the brood parasitism of some cuckoo species and is enhanced by Lévy flights [22], rather than by simple isotropic random walks. The CS algorithm has been applied to engineering design, where it shows superior performance over other algorithms for a range of continuous optimization problems such as spring design and welded beam design [23, 24, 25]. Zheng and Zhou [26] provided a variant of cuckoo search using a Gaussian process. The Modified Cuckoo Search (MCS) algorithm was proposed by Walton in 2011 [27]. MCS improves the standard CS algorithm, especially in terms of convergence to the global minimum in real-world applications.

Artificial Bee Colony (ABC) was proposed by D. Karaboga in 2005 for real-parameter optimization [28]. It is inspired by the intelligent behaviour of honey bees. Karaboga and Basturk investigated the performance of the ABC algorithm on unconstrained numerical optimization problems in [29], [30], [31], and its extended version for constrained optimization problems in [32]. A hybrid artificial bee colony-based approach for optimization of multi-pass turning operations is used in [33]. A study on Particle Swarm Optimization (PSO) and ABC algorithms for multilevel thresholding is introduced in [34]. ABC optimization was used for multi-area economic dispatch in [35].

The Particle Swarm Optimization (PSO) algorithm was proposed by James Kennedy and Russell Eberhart in 1995 [36]. PSO is one of the most widely used EAs. It is motivated by the social behaviour of organisms such as bird flocking and fish schooling [36]. The PSO algorithm, in making adjustments towards the "local" and "global" best particles, is similar to the crossover operation used by genetic algorithms [37]. In [38], [39] the authors showed that an SVM optimized by PSO gives better accuracy than a standard SVM model. Ju Y. et al. [40] concluded that hybrid CI approaches

AIA-BPNN and ASA-BPNN are better than the single CI technique BPNN-SCG, and recommended them for stock price forecasting.

This paper proposes five hybrid models: FPA-LS-SVM, BA-LS-SVM, MCS-LS-SVM, ABC-LS-SVM, and PSO-LS-SVM. These models hybridize the optimization algorithms (FPA, BA, MCS, ABC, and PSO respectively) with financial technical indicators and the LS-SVM model. The performance of LS-SVM depends on the selection of the hyperparameters C (cost penalty), ϵ (insensitive-loss function) and γ (kernel parameter). The optimization algorithms are used to find the best parameter combination for LS-SVM.

The rest of the paper is organized as follows: Section 2 presents the least squares support vector machine (LS-SVM) model; Section 3 presents the FPA algorithm; Section 4 the BA algorithm; Section 5 the MCS algorithm; Section 6 the ABC algorithm; Section 7 the PSO algorithm; Section 8 is devoted to the proposed models and their implementation for daily, weekly, and monthly stock price prediction; Section 9 discusses the results. The main conclusions of the work are presented in Section 10.

2. Least Squares Support Vector Machine (LS-SVM)

Least squares support vector machines (LS-SVM) are least squares versions of support vector machines (SVM), a set of related supervised learning methods that analyse data and recognize patterns, used for classification and regression analysis. In this version one finds the solution by solving a set of linear equations instead of the convex quadratic programming (QP) problem of classical SVMs. LS-SVM classifiers were proposed by Suykens and Vandewalle [41]. Let X be the input data matrix and y the output vector. Given the training data set {(x_i, y_i)}, i = 1, ..., n, the goal of LS-SVM is to construct the function f(x), which represents the dependence of the output y_i on the input x_i. This function is formulated as

f(x) = w^T φ(x) + b   (1)

where w is a column vector, b ∈ R, and φ: R^p → R^{n_h} maps the input into a (possibly high-dimensional) feature space. The LS-SVM algorithm computes the function (1) from a minimization problem similar to that of the SVM method [4]. The main difference, however, is that LS-SVM involves equality constraints instead of inequalities, and is based on a least squares cost function. Furthermore, the LS-SVM method solves a linear problem while conventional SVM solves a quadratic one. The optimization problem and the equality constraints of LS-SVM are defined as follows:

min_{w,e,b} J(w, e) = (1/2) w^T w + (C/2) Σ_{i=1}^{n} e_i^2   (2)

y_i = w^T φ(x_i) + b + e_i,  i = 1, ..., n   (3)

where e is the n×1 error vector, 1 is an n×1 vector with all entries 1, and C ∈ R^+ is the trade-off parameter between the solution size and the training errors. From Eq. (2) a Lagrangian is formed, and differentiating with respect to w, b, e, a (a is the vector of Lagrange multipliers) yields

[ I   0   0   −Z^T ] [ w ]   [ 0 ]
[ 0   0   0   −1^T ] [ b ] = [ 0 ]
[ 0   0   CI  −I   ] [ e ]   [ 0 ]
[ Z   1   I    0   ] [ a ]   [ y ]   (4)

where I is the identity matrix and Z = [φ(x_1)^T; φ(x_2)^T; ...; φ(x_n)^T]. From rows one and three in Eq. (4), w = Z^T a and Ce = a. Then, by defining the kernel matrix K = Z Z^T and eliminating w and e, the conditions for optimality lead to the following overall solution:

[ 0   1^T        ] [ b ]   [ 0 ]
[ 1   K + C^{-1}I ] [ a ] = [ y ]   (5)

The kernel function K can be of the following types:

Linear kernel: K(x, x_i) = x_i^T x   (6)

Polynomial kernel of degree d: K(x, x_i) = (1 + x_i^T x / c)^d   (7)

Radial basis function (RBF) kernel: K(x, x_i) = exp(−‖x − x_i‖² / σ²)   (8)

MLP kernel: K(x, x_i) = tanh(k x_i^T x + θ)   (9)

3. Flower Pollination Algorithm (FPA)

The Flower Pollination Algorithm (FPA) is a novel algorithm inspired by the pollination process of flowers [7]. Pollination can be achieved by self-pollination or cross-pollination. Cross-pollination, or allogamy, means pollination occurs with pollen from a flower of a different plant, while self-pollination is the fertilization of one flower, such as peach flowers, from pollen of the same flower or of different flowers of the same plant, which often occurs when no reliable pollinator is available. Biotic cross-pollination may occur over long distances, and pollinators such as bees, bats, birds and flies can fly far, so they can be considered global pollinators. In addition, bees and birds may exhibit Lévy flight behaviour [42], with jump or flight distance steps obeying a Lévy distribution. Furthermore, flower constancy can be used as an increment step based on the similarity or difference of two flowers. The characteristics of the pollination process, flower constancy and pollinator behaviour were idealized in the following rules [7]:
1. Biotic cross-pollination is considered a global pollination process, with pollen-carrying pollinators performing Lévy flights.
2. Abiotic pollination and self-pollination are considered local pollination.
3. Flower constancy can be considered as a reproduction probability proportional to the similarity of the two flowers involved.
4. Local pollination and global pollination are controlled by a switch probability p. Due to physical proximity and other factors such as wind, local pollination can account for a significant fraction p of the overall pollination activities.

In the global pollination step, flower pollens are carried by pollinators such as insects, and pollens can travel over long distances. This ensures pollination and reproduction of the fittest, so the fittest solution is represented as g*. The first rule can be formulated as the update in Eq. (10):

x_i^{t+1} = x_i^t + L (g* − x_i^t)   (10)

where x_i^t is the solution vector at iteration t and g* is the current best solution. The parameter L is the strength of the pollination, which is a step size. Since insects may move over long distances with various step lengths, a Lévy flight can be used to mimic this behaviour efficiently [17]. L > 0 is drawn from a Lévy distribution as shown in Eq. (11):

L ~ [λ Γ(λ) sin(πλ/2) / π] · 1 / s^{1+λ}   (11)

Here Γ(λ) is the standard gamma function, and this distribution is valid for large steps s > 0. The local pollination (Rule 2) and flower constancy can be represented as in Eq. (12):

x_i^{t+1} = x_i^t + ε (x_j^t − x_k^t)   (12)

where x_j^t and x_k^t are solution vectors drawn randomly from the solution set. The parameter ε is drawn from a uniform distribution in the range from 0 to 1.

4. Bat Algorithm (BA)

The Bat Algorithm (BA), a nature-inspired algorithm for continuous optimization, was proposed by Yang in 2010 [15]. Yang developed the bat algorithm with the following three idealized rules:
a. All bats use echolocation to sense distance, and they also know the difference between food/prey and background barriers in some magical way.
b. Bats fly randomly with velocity v_i at position x_i with a frequency f_i, varying wavelength and loudness to search for prey. They can automatically adjust the wavelength (or frequency) of their emitted pulses and adjust the rate of pulse emission, depending on the proximity of their target.
c. Although the loudness can vary in many ways, we assume that it varies from a large (positive) value to a minimum constant value.

First, the initial position, velocity and frequency are initialized for each bat. For each time step, the movement of the virtual bats is given by updating their frequency, velocity and position using Eq. (13), Eq. (14) and Eq. (15) respectively, as follows:

f_i = f_min + (f_max − f_min) β   (13)

v_i^t = v_i^{t−1} + (x_i^{t−1} − x*) f_i   (14)

x_i^t = x_i^{t−1} + v_i^t   (15)

where β denotes a randomly generated number within the interval [0, 1]. Recall that x_i^t denotes the value of the decision variables for bat i at time step t. The frequency f_i in Eq. (13) is used to control the pace and range of the movement of the bats. The variable x* represents the current global best location (solution), found by comparing the solutions among all the bats. In order to improve the variability of the possible solutions, Yang [15] employed random walks: primarily, one solution is selected among the current best solutions for local search, and then a random walk is applied to generate a new solution for each bat;

x_new = x_old + ε A^t   (16)

where ε is a random number and A^t stands for the average loudness of all the bats at time t. For each iteration of the algorithm, the loudness A_i and the emission pulse rate r_i are updated as follows:

A_i^{t+1} = α A_i^t   (17)

r_i^{t+1} = r_i^0 [1 − exp(−γ t)]   (18)

where α and γ are constants. At the first step of the algorithm, the emission rate r_i^0 and the loudness A_i^0 are often randomly chosen. Generally, 0 < α < 1 and γ > 0.

5. Modified Cuckoo Search (MCS)

The cuckoo search algorithm (CS) can easily find the optimum [16], but, since the search relies entirely on random walks, fast convergence cannot be guaranteed. Modified Cuckoo Search (MCS) makes two modifications to the original CS algorithm to increase the convergence rate. The first modification concerns the Lévy flight step size α. In CS, α is constant and the value α = 1 is employed [21]. In MCS, the value of α decreases as the number of generations increases, for the same reason that the inertia constant is reduced in PSO, i.e. to encourage more localized searching as the individuals, or eggs, get closer to the solution. An initial Lévy flight step size A = 1 is chosen and, at each generation, a new Lévy flight step size is calculated as α = A/√G, where G is the generation number. This exploratory search is only performed on the fraction of nests to be abandoned.
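The step-size decay of this first modification can be sketched directly; the helper names below are illustrative, and a Gaussian draw stands in for the full Lévy step:

```python
import numpy as np

def mcs_step_sizes(A=1.0, generations=10):
    """First MCS modification: the Levy-flight step size shrinks as
    alpha = A / sqrt(G), so later generations search more locally."""
    return [A / np.sqrt(G) for G in range(1, generations + 1)]

def exploratory_move(x, G, rng, A=1.0):
    """Perturb an abandoned nest with a step scaled by A / sqrt(G).
    A Gaussian draw is used here as a simple stand-in for the Levy draw."""
    alpha = A / np.sqrt(G)
    return x + alpha * rng.standard_normal(x.shape)
```

With A = 1 the schedule gives step sizes 1, 1/√2, 1/√3, 1/2, ... so by generation 4 the exploratory step has already halved.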
The second modification adds information exchange between the eggs to speed up convergence to a minimum; in CS there is no information exchange between individuals and the searches are performed essentially independently.

6. Artificial Bee Colony (ABC)

The Artificial Bee Colony (ABC) algorithm was proposed by Karaboga in 2005 for real-parameter optimization [28]. It is inspired by the intelligent behaviour of honey bees. The colony of artificial bees consists of three groups: employed, onlooker and scout bees. Half of the colony is composed of employed bees and the rest are onlooker bees. The number of food (nectar) sources is equal to the number of employed bees, i.e. one nectar source is assigned to one employed bee. The aim of the whole colony is to maximize the nectar amount. The duty of the employed bees is to search for food sources (solutions); the nectar amounts (solution qualities/fitness values) are then calculated, and the information obtained is shared with the onlooker bees waiting in the hive. The onlooker bees decide which nectar source to exploit depending on the information shared by the employed bees. The onlooker bees also determine which source is to be abandoned and reallocate its employed bee as a scout bee. The task of the scout bees is to find new valuable food sources; they search the space near the hive randomly [43].

In the ABC algorithm, suppose the solution space of the problem is D-dimensional, where D is the number of parameters to be optimized. The fitness value of a randomly chosen site is formulated as follows:

fit_i = 1 / (1 + f_i)   (19)

The numbers of employed bees and onlooker bees are both SN, which is equal to the number of food sources. There is only one employed bee for each food source, whose first position is randomly generated. In each iteration of the ABC algorithm, each employed bee determines a new neighbouring food source of its currently associated food source and computes the nectar amount of this new food source by

v_ij = x_ij + φ_ij (x_ij − x_kj)   (20)

where k ∈ {1, 2, ..., SN}, k ≠ i, and j ∈ {1, 2, ..., D} are randomly chosen indexes, and φ_ij is a random number in [−1, 1]. If the new food source is better than the previous one, the employed bee moves to the new food source; otherwise it keeps the old one. After all employed bees complete the search process, they share the information about their food sources with the onlooker bees. An onlooker bee evaluates the nectar information taken from all employed bees and chooses a food source with a probability related to its nectar amount:

p_i = fit_i / Σ_{n=1}^{SN} fit_n   (21)

where fit_i is the fitness value of solution i, proportional to the nectar amount of the food source at position i, and SN is the number of food sources, which is equal to the number of employed bees. The onlooker bee then searches for a new solution in the selected food source site, in the same way as the employed bees. After all the employed bees have exploited a new solution and the onlooker bees have been allocated a food source, if a source is found whose fitness has not improved for a predetermined number of cycles (the limit parameter), it is abandoned, and the employed bee associated with that source becomes a scout bee. In that position the scout randomly generates a new solution by:

x_ij = x_j^{min} + rand(0, 1) (x_j^{max} − x_j^{min})   (22)

where rand(0, 1) is a random number in the range [0, 1] and x_j^{min}, x_j^{max} are the lower and upper bounds of the jth dimension of the problem space.

7. Particle Swarm Optimization Algorithm (PSO)

PSO is a heuristic search method derived from the behaviour of social groups such as bird flocks or fish swarms [36]. PSO moves from one set of points to another in a single iteration, with likely improvement, using a combination of deterministic and probabilistic rules. PSO has remained popular because of its ease of implementation and its ability to effectively solve highly nonlinear, mixed-integer optimization problems that are typical of complex engineering systems.
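The social-learning scheme described here can be sketched as a compact implementation; the parameter values (w = 0.7, c1 = c2 = 1.5), search bounds, and the sphere test function are illustrative assumptions, not the paper's settings:

```python
import numpy as np

def pso(objective, dim, n_particles=30, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal PSO: each particle is pulled toward its personal best
    and the swarm best via the velocity/position update rules."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))    # positions
    v = np.zeros((n_particles, dim))              # velocities
    pbest = x.copy()
    pbest_val = np.array([objective(p) for p in x])
    g = pbest[pbest_val.argmin()].copy()          # swarm best position
    for _ in range(iters):
        q = rng.random((n_particles, dim))
        r = rng.random((n_particles, dim))
        v = w * v + c1 * q * (pbest - x) + c2 * r * (g - x)  # velocity update
        x = x + v                                            # position update
        vals = np.array([objective(p) for p in x])
        improved = vals < pbest_val
        pbest[improved] = x[improved]
        pbest_val[improved] = vals[improved]
        g = pbest[pbest_val.argmin()].copy()
    return g, float(pbest_val.min())

# Example: minimize the sphere function in 3 dimensions
best_x, best_f = pso(lambda p: float((p ** 2).sum()), dim=3)
```

In the hybrid models of Section 8, the objective would instead be the held-out prediction error of an LS-SVM trained with the candidate parameter vector.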
Optimization is achieved by giving each individual in the search space a memory of its previous successes and information about the successes of its social group, and by providing a way to incorporate this knowledge into the movement of the individual [36]. Each individual (called a particle) is therefore characterized by its position x_i, its velocity v_i, its personal best position x_i^{pb} and its neighbourhood best position x^{sb}. The elements of the velocity vector for particle i are updated as

v_ij = w v_ij + c_1 q_j (x_ij^{pb} − x_ij) + c_2 r_j (x_j^{sb} − x_ij),  j = 1, ..., n   (23)

where w is the inertia weight, x_i^{pb} is the best variable vector encountered so far by particle i, x^{sb} is the swarm best vector, i.e. the best variable vector found by any particle in the swarm so far, c_1 and c_2 are constants, and q and r are random numbers in the range (0, 1). Once the velocities have been updated, the variable vector of particle i is modified according to

x_ij = x_ij + v_ij   (24)

The cycle of evaluation followed by updates of velocities and positions (and possible updates of x_i^{pb} and x^{sb}) is then repeated until a satisfactory solution has been found.

8. The proposed models

The proposed models are based on the study and gathering of stock historical data (High, Low, Open, Close, and Volume) from the S&P 500 stock market. The gathered data are sampled into three categories (daily, weekly, and monthly). Financial technical indicators are then extracted and calculated from the collected historical data; these indicators are RSI, MFI, MACD, EMA, PMO, and SO. The selected indicators are drawn from all types of indicators to make the system more robust and more accurate.

The indicators are used as inputs to the proposed models. After extracting the indicator features, five models were constructed by optimizing and training LS-SVM with five different optimization algorithms (FPA, BA, MCS, ABC, and PSO). These models are used in the prediction of daily, weekly, and monthly stock prices. Standard LS-SVM and ANN are used as benchmarks for comparison with the proposed models. The proposed models are evaluated by four different criteria: RMSE, MAE, SMAPE, and PMRE. The architecture of the proposed models contains seven input vectors, representing the historical data and the six technical indicators derived from the raw datasets, and one output representing the next price. The proposed models are summarized in Fig. 1.

Fig. 1: The proposed models' steps

The financial technical indicators, which are calculated from the raw datasets, are defined as follows:

Price Momentum Oscillator (PMO)
PMO is an oscillator based on a Rate of Change (ROC) calculation that is exponentially smoothed twice. Because the PMO is normalized, it can also be used as a relative strength tool; stocks can be ranked by their PMO value as an expression of relative strength. With C = today's close price and DAC = the close price ten days ago, the following was used to calculate PMO:

PMO = (C − DAC) / DAC × 100   (25)

Relative Strength Index (RSI)
A technical momentum indicator that compares the magnitude of recent gains to recent losses in an attempt to determine overbought and oversold conditions of an asset. The formula for computing the Relative Strength Index is:

RSI = 100 − 100 / (1 + RS)   (26)

where RS = the average of x days' up closes divided by the average of x days' down closes.

Money Flow Index (MFI)
This indicator measures the strength of money flowing in and out of a security. The money flow is computed as:

MF = P × V   (27)

where P is the typical price and V is the money volume. The Money Ratio (MR) is calculated as:

MR = positive money flow / negative money flow   (28)

MFI = 100 − (100 / (1 + MR))   (29)

Exponential Moving Average (EMA)
This indicator returns the exponential moving average of a field over a given period of time. The EMA formula is:

EMA = k × C + (1 − k) × Y   (30)

where C is today's close, Y is yesterday's EMA, and k is the smoothing factor.

Stochastic Oscillator (SO)
The stochastic oscillator is defined as a measure of the difference between the current closing price of a security and its lowest low price, relative to its highest high price, for a given period of time. The formula for this computation is:

SO = (CP − LP) / (HP − LP) × 100   (31)

where CP is the close price, LP is the lowest price, and HP is the highest price.

Moving Average Convergence/Divergence (MACD)
This function calculates the difference between a short-term and a long-term moving average of a field. The formulas for calculating MACD and its signal line are:

MACD = E_short − E_long   (32)

Signal = EMA(MACD)   (33)

where E is the EMA of the close price (CP).

9. Results and Discussion

In this section the FPA, MCS, BA, ABC, and PSO algorithms are compared in optimizing LS-SVM. The numbers of flowers, nests, bats, bees, and particles are 50, trained for 100 epochs. The models were trained and tested with datasets for six companies covering different sectors of the S&P 500 stock market: Adobe, Bank of America (BAC), Exxon Mobil, General Electric (GE), Pepsi, and Pfizer. The datasets, available from [44], start from Feb. 2011 (daily), Feb. 2004 (weekly), and Feb. 2000 (monthly). Simulation results were produced using MATLAB 2012b. The datasets are divided into a training part (70%) and a testing part (30%).

Table 1 summarizes the RMSE of daily stock market prediction for all compared algorithms; the FPA algorithm achieved the lowest sum of RMSE, with a small advantage over the BA, ABC, MCS, and PSO algorithms. Table 2 presents the RMSE of weekly stock market prediction; the FPA algorithm again achieved the lowest sum of RMSE. Table 3 outlines the RMSE of monthly stock market prediction; here the BA algorithm achieved the lowest sum of RMSE, with a small advantage over the FPA, ABC, MCS, and PSO algorithms.

Table 1: RMSE of daily results
RMSE | FPA | BA | ABC | MCS | PSO | SVM | ANN
Adobe
BAC
Exxon
GE
Pepsi
Pfizer
Sum of RMSE

Table 2: RMSE of weekly results
RMSE | FPA | BA | ABC | MCS | PSO | SVM | ANN
Adobe
BAC
Exxon
GE
Pepsi
Pfizer
Sum of RMSE

Table 3: RMSE of monthly results
RMSE | FPA | BA | ABC | MCS | PSO | SVM | ANN
Adobe
BAC
Exxon
GE
Pepsi
Pfizer
Sum of RMSE
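The four evaluation criteria can be written compactly. RMSE and MAE are standard; SMAPE and PMRE appear in several variants in the literature, so the exact denominators below are assumptions rather than the paper's definitions:

```python
import numpy as np

def rmse(y, yhat):
    """Root mean squared error, the criterion reported in Tables 1-3."""
    return float(np.sqrt(np.mean((np.asarray(y, float) - np.asarray(yhat, float)) ** 2)))

def mae(y, yhat):
    """Mean absolute error."""
    return float(np.mean(np.abs(np.asarray(y, float) - np.asarray(yhat, float))))

def smape(y, yhat):
    """Symmetric mean absolute percentage error (one common variant)."""
    y, yhat = np.asarray(y, float), np.asarray(yhat, float)
    return float(np.mean(np.abs(y - yhat) / ((np.abs(y) + np.abs(yhat)) / 2)))

def pmre(y, yhat):
    """Percentage mean relative error (assumed form: mean of |error|/actual)."""
    y, yhat = np.asarray(y, float), np.asarray(yhat, float)
    return float(np.mean(np.abs(y - yhat) / np.abs(y)))
```

For example, actual prices [100, 200] against predictions [110, 190] give an RMSE and MAE of 10 and a PMRE of 0.075.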

10. Conclusions

In this paper five bio-inspired algorithms were proposed to train and optimize the LS-SVM model. The results show that the standard LS-SVM and ANN models cannot overcome the over-fitting problem, have slow convergence speed, and still suffer from falling into local minima. Using nature-inspired algorithms, or global search techniques, helps overcome these problems of traditional learning algorithms. The FPA, ABC, BA, PSO, and MCS algorithms in most cases converge to a global minimum where the traditional LS-SVM and ANN models fail. The proposed hybrid FPA-LS-SVM method achieved the lowest error values (RMSE) in daily, weekly, and monthly stock price prediction. All proposed models (FPA-LS-SVM, BA-LS-SVM, ABC-LS-SVM, MCS-LS-SVM, and PSO-LS-SVM) could easily handle the fluctuations of stock time series, while the LS-SVM and ANN models failed to cope with them. The convergence speed of the proposed models to the global minimum is very fast compared to the classical methods.

References
[1] C. Oliver, Blaise Pascal University: Neural network modeling for stock movement prediction, state of art, 2007.
[2] P. Fernández-Blanco, D. Bodas-Sagi, F. Soltero, J.I. Hidalgo, Technical market indicators optimization using evolutionary algorithms, Proceedings of the 2008 GECCO Conference Companion on Genetic and Evolutionary Computation, ACM, New York, NY, USA, 2008.
[3] X. Leng and H. Miller, Input dimension reduction for load forecasting based on support vector machines, IEEE International Conference on Electric Utility Deregulation, Restructuring and Power Technologies (DRPT 2004), 2004.
[4] V. Cherkassky and Y. Ma, Practical selection of SVM parameters and noise estimation for SVM regression, Neural Networks, 17(1), 2004.
[5] J. Suykens, V. Gestel, and J. Brabanter, Least Squares Support Vector Machines, World Scientific, 2002.
[6] A. Carlos, B. Gary, and A. David, Evolutionary Algorithms for Solving Multi-Objective Problems, Springer, 2007.
[7] X.S. Yang, Flower pollination algorithm for global optimization, in: Unconventional Computation and Natural Computation 2012, Lecture Notes in Computer Science, Vol. 7445, 2012.
[8] B.J. Glover, Understanding Flowers and Flowering: An Integrated Approach, Oxford University Press, 2007.
[9] R. Prathiba, M. Balasingh Moses, S. Sakthivel, Flower pollination algorithm applied for different economic load dispatch problems, International Journal of Engineering and Technology (IJET), Vol. 6, No. 2, Apr-May 2014.
[10] G.M. Platt, Application of the flower pollination algorithm in nonlinear algebraic systems with multiple solutions, Engineering Optimization IV.
[11] D. Rodrigues, X.S. Yang, A.N. de Souza, J.P. Papa, Binary flower pollination algorithm and its application to feature selection, in: Recent Advances in Swarm Intelligence and Evolutionary Computation, Studies in Computational Intelligence, Vol. 585, Springer, 2015.
[12] X.S. Yang, M. Karamanoglu, X. He, Flower pollination algorithm: a novel approach for multiobjective optimization, Engineering Optimization, Vol. 46, Issue 9, 2014.
[13] S. Łukasik, P. Kowalski, Study of flower pollination algorithm for continuous optimization, in: Intelligent Systems'2014, Advances in Intelligent Systems and Computing, Vol. 322, 2015.
[14] R. Wang and Y. Zhou, Flower pollination algorithm with dimension by dimension improvement, Mathematical Problems in Engineering, Vol. 2014, 9 pages, 2014.
[15] X.S. Yang, A new metaheuristic bat-inspired algorithm, in: Nature Inspired Cooperative Strategies for Optimization (NISCO 2010) (Eds. C. Cruz, J.R. Gonzalez, D.A. Pelta, G. Terrazas), Studies in Computational Intelligence, Vol. 284, Springer, Berlin, 2010.
[16] X.S. Yang and A.H. Gandomi, Bat algorithm: a novel approach for global engineering optimization, Engineering Computations, Vol. 29, No. 5, 2012.
[17] X.S. Yang, Bat algorithm for multi-objective optimisation, Int. J. Bio-Inspired Computation, Vol. 3, No. 5, 2011.
[18] B. Ramesh, V.C.J. Mohan, V.C.V. Reddy, Application of bat algorithm for combined economic load and emission dispatch, Int. J. of Electrical Engineering and Telecommunications, Vol. 2, No. 1, pp. 1-9, 2013.
[19] K. Khan, A. Saha, A comparison of BA, GA, PSO, BP and LM for training feed forward neural networks in e-learning context, Int. J. Intelligent Systems and Applications (IJISA), Vol. 4, No. 7, 2012.
[20] N.M. Nawi, M.Z. Rehman, A. Khan, A new bat based back-propagation (BA-BP) algorithm, in: Advances in Intelligent Systems and Computing, Vol. 240, Springer, 2014.
[21] X.S. Yang, S. Deb, Cuckoo search via Lévy flights, in: Proceedings of the World Congress on Nature & Biologically Inspired Computing (NaBIC 2009, India), IEEE Publications, USA, 2009.
[22] I. Pavlyukevich, Lévy flights, non-local search and simulated annealing, J. Comput. Phys., 226, 2007.
[23] A.H. Gandomi, X.S. Yang, A.H. Alavi, Cuckoo search algorithm: a metaheuristic approach to solve structural optimization problems, Eng. Comput., 29(1), 2013.
[24] A.H. Gandomi, X.S. Yang, S. Talatahari, S. Deb, Coupled eagle strategy and differential evolution for unconstrained and constrained global optimization, Comput. Math. Appl., 63(1), 2012.
[25] X.S. Yang, S. Deb, Engineering optimization by cuckoo search, Int. J. Math. Modell. Num. Opt., 1(4), 2010.
[26] H.Q. Zheng, Y. Zhou, A novel cuckoo search optimization algorithm based on Gauss distribution, J. Comput. Inf. Syst., 8, 2012.
[27] S. Walton, O. Hassan, K. Morgan and M.R. Brown, Modified cuckoo search: a new gradient free optimisation algorithm, Chaos, Solitons & Fractals, Vol. 44, Issue 9, Sept. 2011.
[28] D. Karaboga, An idea based on honey bee swarm for numerical optimization, Technical Report TR06, Erciyes University, Engineering Faculty, Computer Engineering Department, 2005.
[29] B. Basturk and D. Karaboga, An artificial bee colony (ABC) algorithm for numeric function optimization, IEEE Swarm Intelligence Symposium, Indianapolis, Indiana, USA, 2006.
[30] D. Karaboga and B. Basturk, A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm, Journal of Global Optimization, 39(3), 2007.
[31] D. Karaboga and B. Basturk, On the performance of artificial bee colony (ABC) algorithm, Applied Soft Computing Journal, 8(1), 2008.
[32] D. Karaboga and B. Basturk, Artificial bee colony (ABC) optimization algorithm for solving constrained optimization problems, in: Advances in Soft Computing: Foundations of Fuzzy Logic and Soft Computing, 4529, 2007.
[33] A.R. Yildiz, Optimization of cutting parameters in multi-pass turning using artificial bee colony-based approach, Information Sciences, 220, 2013.
[34] A. Gupta, Stock market prediction using Hidden Markov Models, 2012 Students Conference on Engineering and Systems (SCES), IEEE, pp. 1-4, 2012.
[35] M. Basu, Artificial bee colony optimization for multi-area economic dispatch, International Journal of Electrical Power & Energy Systems, 49, 2013.
[36] D.N. Wilke, Analysis of the particle swarm optimization algorithm, Master's Dissertation, University of Pretoria, 2005.
[37] A.S. Khalil, An investigation into optimization strategies of genetic algorithms and swarm intelligence, Artificial Life, 2001.
[38] Z. Guo, H. Wang, Q. Liu, Financial time series forecasting using LPP and SVM optimized by PSO, Soft Computing Methodologies and Applications, Springer, December 2012.
[39] G. Te, The optimization of share price prediction model based on support vector machine, International Conference on Control, Automation and Systems Engineering (CASE), pp. 1-4, 30-31 July 2011.
[40] Y. Ju, Computational intelligence approaches for stock price forecasting, IEEE International Symposium on Computer, Consumer and Control (IS3C), 2012.
[41] J. Suykens, J. Vandewalle, Least squares support vector machine classifiers, Neural Processing Letters, Vol. 9(3), pp. 293-300, 1999.
[42] I. Pavlyukevich, Lévy flights, non-local search and simulated annealing, J. Computational Physics, 226, 2007.
[43] P. Mansour, B. Asady, and N. Gupta, A novel iteration method for solve hard problems (nonlinear equations) with artificial bee colony algorithm, World Academy of Science, Engineering and Technology, 5(11), 2011.
[44] Last accessed 11 Jan. 2015.

Osman Hegazy received his Ph.D. in Computer Engineering in 1977 from Leicester University, UK. Currently, he is a Professor at the Department of Information Systems, Faculty of Computers and Information, Cairo University, Egypt. He has more than 100 publications in the areas of Data Mining, Data Engineering, Cloud Computing, and Computational Intelligence in international journals and international conferences. He has supervised over 50 M.Sc. and over 30 Ph.D. theses in computer science and information systems. Email: Osman.hegazy@gmail.com

Omar Soliman received his Ph.D. from the Operations Research department, Faculty of Computers and Information, Cairo University, Egypt. He has more than 50 publications in the areas of Computational Intelligence and Optimization in international journals and international conferences.

Mustafa Abdul Salam received his B.S. from the Faculty of Computers and Information, Zagazig University, Egypt in 2003, and obtained his M.Sc. degree in information systems from the Faculty of Computers and Information, Menufya University, Egypt in 2009. He is a Ph.D. student at the Faculty of Computers and Information, Cairo University, and a teaching assistant at the Information Systems department, Higher Technological Institute. He has contributed more than 10 technical papers in the areas of Data Mining, Computational Intelligence, and Nature-Inspired algorithms in international journals and international conferences.


More information

Natural Language Processing and Information Retrieval

Natural Language Processing and Information Retrieval Natural Language Processng and Informaton Retreval Support Vector Machnes Alessandro Moschtt Department of nformaton and communcaton technology Unversty of Trento Emal: moschtt@ds.untn.t Summary Support

More information

829. An adaptive method for inertia force identification in cantilever under moving mass

829. An adaptive method for inertia force identification in cantilever under moving mass 89. An adaptve method for nerta force dentfcaton n cantlever under movng mass Qang Chen 1, Mnzhuo Wang, Hao Yan 3, Haonan Ye 4, Guola Yang 5 1,, 3, 4 Department of Control and System Engneerng, Nanng Unversty,

More information

The Minimum Universal Cost Flow in an Infeasible Flow Network

The Minimum Universal Cost Flow in an Infeasible Flow Network Journal of Scences, Islamc Republc of Iran 17(2): 175-180 (2006) Unversty of Tehran, ISSN 1016-1104 http://jscencesutacr The Mnmum Unversal Cost Flow n an Infeasble Flow Network H Saleh Fathabad * M Bagheran

More information

Feature Selection: Part 1

Feature Selection: Part 1 CSE 546: Machne Learnng Lecture 5 Feature Selecton: Part 1 Instructor: Sham Kakade 1 Regresson n the hgh dmensonal settng How do we learn when the number of features d s greater than the sample sze n?

More information

The Expectation-Maximization Algorithm

The Expectation-Maximization Algorithm The Expectaton-Maxmaton Algorthm Charles Elan elan@cs.ucsd.edu November 16, 2007 Ths chapter explans the EM algorthm at multple levels of generalty. Secton 1 gves the standard hgh-level verson of the algorthm.

More information