Artificial neural network regression as a local search heuristic for ensemble strategies in differential evolution

Nonlinear Dyn (2016) 84
ORIGINAL PAPER

Iztok Fister, Ponnuthurai Nagaratnam Suganthan, Iztok Fister Jr., Salahuddin M. Kamal, Fahad M. Al-Marzouki, Matjaž Perc, Damjan Strnad

Received: 24 April 2015 / Accepted: 28 November 2015 / Published online: 18 December 2015
© Springer Science+Business Media Dordrecht 2015

I. Fister, I. Fister Jr., D. Strnad
Faculty of Electrical Engineering and Computer Science, University of Maribor, Smetanova 17, 2000 Maribor, Slovenia

P. N. Suganthan
School of Electrical and Electronic Engineering, Nanyang Technological University, Singapore

S. M. Kamal, F. M. Al-Marzouki, M. Perc
Department of Physics, Faculty of Sciences, King Abdulaziz University, Jeddah, Saudi Arabia

M. Perc (corresponding author)
Faculty of Natural Sciences and Mathematics, University of Maribor, Koroška cesta 160, 2000 Maribor, Slovenia
e-mail: matjaz.perc@uni-mb.si; matjaz.perc@um.si

Abstract  Nature frequently serves as an inspiration for developing new algorithms to solve challenging real-world problems. Mathematical modeling has led to the development of artificial neural networks (ANNs), which have proven especially useful for solving problems such as classification and regression. Moreover, evolutionary algorithms (EAs), inspired by Darwin's natural evolution, have been successfully applied to solve optimization, modeling, and simulation problems. Differential evolution (DE) is a particularly well-known EA that possesses a multitude of strategies for generating an offspring solution, where the best strategy is not known in advance. In this paper, ANN regression is applied as a local search heuristic within the DE algorithm that tries to predict the best strategy or to generate a better offspring from an ensemble of DE strategies. This local search heuristic is applied to the population of solutions according to a control parameter that regulates the trade-off between the time complexity of the calculation and the quality of the solution. The experiments on the CEC 2014 test suite consisting of 30 benchmark functions reveal the full potential of this idea.

Keywords  Nonlinear dynamics · Artificial neural network · Differential evolution · Regression · Local search · Ensemble strategies

1 Introduction

Scientists in various research areas who are confronted with solving tough real-world problems have always searched for inspiration in nature. Nature not only poses the questions, but also provides answers to them. In computer science, two nature-inspired mechanisms have been widely used: the human brain [46] and the Darwinian theory of the struggle for survival [7]. The former inspiration has led to the emergence of artificial neural networks (ANNs), while the latter has led to evolutionary algorithms (EAs). In this paper, an ANN is used as a regression method to enhance the performance of differential evolution (DE).

The operation of an ANN simulates the electrochemical activity of brain cells called neurons [46]. The first mathematical model of neurons was devised by McCulloch and Pitts [37]. According to their model, a neuron fires when a linear combination of weighted inputs

exceeds some threshold. The nonlinear response characteristic of a neuron is usually achieved through a sigmoid transfer function, which transforms the activation value to the neuron output. In the most widely used type of ANN, the neurons are organized in layers. The outputs of one layer become the weighted inputs to the next layer, with no interconnections of neurons within a layer. One of the primary practical uses of ANNs is to perform nonlinear regression on a set of input-output pairs. Network training is executed in a supervised fashion by introducing the inputs to the network, observing the output error with respect to the target values, and adjusting the connection weights to improve the performance in the next round.

On the other hand, DE has become one of the most prominent EAs for solving tough real-world optimization problems. This population-based method was introduced by Storn and Price in 1995 [48]. Individuals in the population, representing solutions of the problem to be solved, are real-valued vectors that are subjected to the operators of crossover and mutation. Thus, a population of trial vectors is generated that compete with their parents for survival. As a result, when the fitness of a trial vector is better than or equal to the fitness of its parent at the same index position in the population, the parent is replaced by the trial (offspring) solution.

In order to further improve the DE algorithm, its development has proceeded in several directions. For example, adaptive and self-adaptive DEs assume that the parameters as set at the start of the search process may not be appropriate in later phases. Therefore, these parameters are encoded into the solution vector and undergo the operations of crossover and mutation. Examples of successfully applied adaptive and self-adaptive DE algorithms are jDE [4] and SaDE [42]. Another kind of DE algorithm tries to improve the results of the original DE algorithm by using ensembles of parameters and mutation DE strategies [34,49,50]. A complete survey of DE methods can be found in [8,51].

The selection of a proper DE mutation strategy is problem specific. Furthermore, the best strategy may change with the search progress in the same way as the other DE parameters. The problem of adapting the DE mutation strategy has previously been addressed in [33]. In this paper, we propose the use of an ANN to build an adaptive regression model for the best DE mutation strategy from an ensemble of DE strategies. In [14], various DE strategies are applied for each individual, where the best value of an element obtained by all strategies in the DE ensemble is used to predict the best value of the corresponding trial vector. This contribution tries to overcome the time complexity of the ANN regression applied to each individual. Here, the ANN regression is used as a local search heuristic applied to a candidate solution according to a control parameter called the probability of regression. The higher the value of this parameter, the more candidate solutions undergo the ANN local search heuristic.

The structure of the remainder of the paper is as follows: In Sect. 2, we give a short overview of ANNs and DE. Section 3 proposes a new DE algorithm with ANN-based regression of DE strategies (nnDE). The experiments and the results are presented in Sect. 4. The paper concludes with a review of the paper's contributions and prospects for future work.

2 Background

2.1 Artificial neural networks

An ANN is a mathematical model of the human brain. It consists of a set of interconnected artificial neurons, which simulate the operation of natural neurons (i.e., brain cells) [45].
The electrochemical signals in the brain are amplified and propagated along neuronal chains, whereby each neuron receives a number of input signals through ramified sensors (dendrites) and forwards an output signal through its single extension (axon) [21]. The simplest artificial neuron model is shown in Fig. 1a. It receives the input vector x = (x_0, ..., x_n) and produces the output y using the following equation [37]:

y = φ( ∑_{i=0}^{n} w_i x_i )   (1)

The signal transfer function is programmable through a set w = (w_0, ..., w_n) of weights on the input lines, where line 0 serves as an intercept with the fixed input value x_0 = 1. A common choice for the transfer function φ is the Heaviside function or a sigmoid function like tanh. With respect to the variety of connectivity types that emerge in different functional areas of the brain, many different neural network architectures have been proposed in the past.
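As a concrete illustration of Eq. (1), the following minimal sketch (our own Python/NumPy rendering, not code from the paper) evaluates a single artificial neuron with a tanh transfer function:

```python
import numpy as np

def neuron(x, w):
    """Eq. (1): y = phi(sum_i w_i * x_i), with phi = tanh."""
    x = np.concatenate(([1.0], x))  # input line 0 is the intercept, x_0 = 1
    return np.tanh(np.dot(w, x))    # nonlinear response to the activation

# Example: two inputs plus the intercept weight w_0
print(neuron(np.array([0.5, -1.2]), np.array([0.1, 0.8, -0.3])))
```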

Fig. 1 An example of an artificial neuron (a) and an artificial neural network (b). a Artificial neuron, b Multilayer perceptron

The architecture that has received the widest practical application, and is also employed in the context of our paper, is a feedforward multilayer ANN that consists of neurons organized in several layers (Fig. 1b). Numerical input signals enter the network on the left and propagate through the layers toward the outputs on the right. The interlayer signaling works by using the outputs of layer i as the inputs of layer i + 1. An established term for this type of ANN is the multilayer perceptron (MLP). All layers except the last, output layer are referred to as hidden layers.

The spectrum of application domains for ANNs is wide and includes, among others, applications in industry and business [54], data mining [3], civil engineering [27], and fire safety [25]. In the field of machine learning, ANNs are used in classification and regression tasks [28,30,56]. In its role as a nonlinear regressor, the MLP must be trained using a set of training samples with known target values. This approach is known as supervised learning [23]. The training procedure is an iterative process in which the network weights are progressively adjusted such that the discrepancy between the network outputs and the target values is minimized. Common error measures for network prediction quality are the mean-squared error (MSE) and the cross-entropy error (CEE). The best-known supervised learning algorithm for the MLP is the back-propagation method, which uses the gradient of the error function to adapt the weight vectors. Weight changes are usually performed after the presentation of each individual training pattern and the determination of the output error, but can be delayed until after the completion of a cycle of presentations (called an epoch). The termination criterion for the training is determined by the allowed error tolerance, the maximum number of training epochs, or one of the more advanced methods like cross-validation of the prediction efficiency on a separate test set. In this paper, we propose the use of a two-layer MLP as an aggregator for an ensemble of DE strategies, where the best member of the current population is the regression target and the trial vectors derived by different DE strategies are used as training inputs.

2.2 Differential evolution

Differential evolution (DE) belongs to the class of evolutionary algorithms and is appropriate for solving continuous as well as discrete optimization problems. DE was introduced by Storn and Price in 1995 [48], and since then many DE variants have been proposed. The original DE algorithm represents solutions by real-valued vectors and is population-based. DE supports operators such as mutation, crossover, and selection. In the basic mutation, two solutions are randomly selected and their scaled difference is added to a third solution, as follows:

u_i^{(t)} = x_{r0}^{(t)} + F (x_{r1}^{(t)} − x_{r2}^{(t)}),   for i = 1 ... NP,   (2)

where F ∈ [0.1, 1.0] denotes the scaling factor that scales the rate of modification, r0, r1, r2 are randomly selected indices in the interval 1 ... NP, and NP represents the population size. Note that the proposed interval of values for the parameter F became established in the DE community, although Price and Storn proposed a slightly different interval, i.e., F ∈ [0.0, 2.0]. DE employs a binomial (bin) or exponential (exp) crossover. The trial vector is built from parameter values copied from either the mutant vector generated by Eq. (2) or the parent at the same index position. Mathematically, this crossover can be expressed as follows:

w_{i,j}^{(t)} =
  u_{i,j}^{(t)}   if rand_j(0, 1) ≤ CR or j = j_rand,
  x_{i,j}^{(t)}   otherwise,   (3)

where CR ∈ [0.0, 1.0] controls the fraction of parameters that are copied to the trial solution. The condition j = j_rand ensures that the trial vector differs from the original solution x_i^{(t)} in at least one element. Mathematically, the selection can be expressed as follows:

x_i^{(t+1)} =
  w_i^{(t)}   if f(w_i^{(t)}) ≤ f(x_i^{(t)}),
  x_i^{(t)}   otherwise.   (4)

Crossover and mutation can be performed in several ways in differential evolution. Therefore, a specific notation was introduced to describe the varieties of these methods (also called strategies). For example, rand/1/bin denotes that the base vector is randomly selected, one vector difference is added to it, and the number of modified parameters in the trial/offspring vector follows a binomial distribution. The other standard DE strategies are listed in Table 1. These strategies also form the ensemble of DE strategies (ES).

Table 1 An ensemble of DE strategies (ES)

No. | Strategy          | Mutation expression                                                                                        | Crossover
1   | Best/1/Exp        | x_{i,j}^{(t+1)} = best_j^{(t)} + F (x_{r1,j}^{(t)} − x_{r2,j}^{(t)})                                       | Exponential
2   | Rand/1/Exp        | x_{i,j}^{(t+1)} = x_{r1,j}^{(t)} + F (x_{r2,j}^{(t)} − x_{r3,j}^{(t)})                                     | Exponential
3   | RandToBest/1/Exp  | x_{i,j}^{(t+1)} = x_{i,j}^{(t)} + F (best_j^{(t)} − x_{i,j}^{(t)}) + F (x_{r1,j}^{(t)} − x_{r2,j}^{(t)})   | Exponential
4   | Best/2/Exp        | x_{i,j}^{(t+1)} = best_j^{(t)} + F (x_{r1,j}^{(t)} + x_{r2,j}^{(t)} − x_{r3,j}^{(t)} − x_{r4,j}^{(t)})     | Exponential
5   | Rand/2/Exp        | x_{i,j}^{(t+1)} = x_{r1,j}^{(t)} + F (x_{r2,j}^{(t)} + x_{r3,j}^{(t)} − x_{r4,j}^{(t)} − x_{r5,j}^{(t)})   | Exponential
6   | Best/1/Bin        | x_{i,j}^{(t+1)} = best_j^{(t)} + F (x_{r1,j}^{(t)} − x_{r2,j}^{(t)})                                       | Binomial
7   | Rand/1/Bin        | x_{i,j}^{(t+1)} = x_{r1,j}^{(t)} + F (x_{r2,j}^{(t)} − x_{r3,j}^{(t)})                                     | Binomial
8   | RandToBest/1/Bin  | x_{i,j}^{(t+1)} = x_{i,j}^{(t)} + F (best_j^{(t)} − x_{i,j}^{(t)}) + F (x_{r1,j}^{(t)} − x_{r2,j}^{(t)})   | Binomial
9   | Best/2/Bin        | x_{i,j}^{(t+1)} = best_j^{(t)} + F (x_{r1,j}^{(t)} + x_{r2,j}^{(t)} − x_{r3,j}^{(t)} − x_{r4,j}^{(t)})     | Binomial
10  | Rand/2/Bin        | x_{i,j}^{(t+1)} = x_{r1,j}^{(t)} + F (x_{r2,j}^{(t)} + x_{r3,j}^{(t)} − x_{r4,j}^{(t)} − x_{r5,j}^{(t)})   | Binomial
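Read together, Eqs. (2)-(4) describe one DE generation. The sketch below is a minimal Python/NumPy rendering of the rand/1/bin strategy for a minimization problem; the function name and population layout are our own choices, not the paper's code:

```python
import numpy as np

def de_rand_1_bin_step(pop, fitness, f_obj, F=0.5, CR=0.9,
                       rng=np.random.default_rng()):
    """One generation of DE/rand/1/bin (Eqs. 2-4) over pop of shape (NP, D)."""
    NP, D = pop.shape
    for i in range(NP):
        # Eq. (2): mutant vector from three mutually distinct random members
        r0, r1, r2 = rng.choice([k for k in range(NP) if k != i], 3,
                                replace=False)
        mutant = pop[r0] + F * (pop[r1] - pop[r2])
        # Eq. (3): binomial crossover; j_rand forces at least one mutant gene
        mask = rng.random(D) <= CR
        mask[rng.integers(D)] = True
        trial = np.where(mask, mutant, pop[i])
        # Eq. (4): the trial replaces the parent if it is at least as good
        f_trial = f_obj(trial)
        if f_trial <= fitness[i]:
            pop[i], fitness[i] = trial, f_trial
    return pop, fitness
```

For an exponential crossover, the copied positions would instead form a contiguous block starting at j_rand.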
2.3 An evolution of DE algorithms

Since its introduction in 1995, many variants of the DE algorithm have been developed. Zhang et al. [55] combined differential evolution with particle swarm optimization, and the result was a new algorithm called DEPSO. Fan and Lampinen [13] added a new trigonometric mutation operator to DE. Lin et al. [31] developed a co-evolutionary hybrid DE. Chakraborty et al. [6] proposed an improved variant of the original DE/best/1 scheme by utilizing the concept of the local topological neighborhood of each vector; the scheme tries to balance the exploration and exploitation abilities of DE without using additional fitness evaluations. Qin et al. [42] proposed a differential evolution algorithm with strategy adaptation. In 2006, Brest [4] proposed the concept of self-adaptation of control parameters; the resulting new algorithm, jDE, was proposed in [4]. Rahnamayan et al. [43] incorporated an opposition-based mechanism into DE, while Das et al. [9] proposed DE with a neighborhood-based mutation operator. Piotrowski [41] combined several well-known DE approaches and gathered them together in one framework as a new adaptive memetic DE with global and local neighborhood-based mutation operators. Han et al. [22] created a dynamic group-based differential evolution using a self-adaptive strategy to cope with global optimization problems, while Cai and Wang [5] developed differential evolution with neighborhood and direction information. Neri et al. [38-40] proposed compact differential evolution (cDE), which can run even on very limited hardware.

Differential evolution has been used to solve practical problems such as electromagnetics [44], economic emission load dispatch problems [2], crop planning models [1], the unit commitment problem [10], short-term electrical power generation scheduling [52], ANN design [20], protein structure prediction [47], and many more.

Ensemble DE methods

In the literature, several ensemble DE methods have been proposed. Mallipeddi et al. [34-36] proposed the EPSDE algorithm (differential evolution algorithm with an ensemble of parameters and mutation strategies), where a pool of distinct mutation strategies, along with a pool of values for each control parameter, coexists throughout the evolution process and competes to produce offspring. Mallipeddi and Suganthan [32] also proposed a differential evolution algorithm with an ensemble of populations to deal with global numerical optimization. Fister et al. [17] applied an ensemble of DE strategies to the hybrid bat algorithm [18]. Elsayed et al. [12] introduced an algorithm framework that uses multiple search operators in each generation; an appropriate mix of search operators is determined adaptively. LaTorre [29] explored the use of a hybrid memetic algorithm based on the multiple offspring framework; their algorithm combines the explorative/exploitative strengths of two heuristic search methods that separately obtain very competitive results. Vrugt et al. [53] proposed a concept where different search algorithms run concurrently and learn from each other through information exchange using a common population.

The jDE algorithm

In 2006, Brest et al. [4] proposed an effective DE variant (jDE), where the control parameters are changed during the run. In this case, two parameters, namely the scale factor F and the crossover rate CR, are changed during the run. In jDE, every individual is extended with F and CR:

x_i^{(t)} = (x_{i,1}^{(t)}, x_{i,2}^{(t)}, ..., x_{i,D}^{(t)}, F_i^{(t)}, CR_i^{(t)}).

jDE changes the parameters F and CR according to the following equations:

F_i^{(t+1)} = F_l + rand_1 · (F_u − F_l)   if rand_2 < τ_1,   F_i^{(t)} otherwise,   (5)

CR_i^{(t+1)} = rand_3   if rand_4 < τ_2,   CR_i^{(t)} otherwise,   (6)

where rand_k ∈ [0, 1], k = 1 ... 4, are randomly generated values, τ_1 and τ_2 are learning steps, and F_l and F_u are the lower and upper bounds for the parameter F.
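In code, the self-adaptation of Eqs. (5) and (6) amounts to a few lines executed before an individual is mutated; the sketch below is our own Python rendering, with the parameter names taken from the text:

```python
import numpy as np

def jde_update(F_i, CR_i, F_l=0.1, F_u=1.0, tau1=0.1, tau2=0.1,
               rng=np.random.default_rng()):
    """Self-adapt the F and CR of one individual (Eqs. 5 and 6)."""
    r1, r2, r3, r4 = rng.random(4)
    F_new = F_l + r1 * (F_u - F_l) if r2 < tau1 else F_i   # Eq. (5)
    CR_new = r3 if r4 < tau2 else CR_i                     # Eq. (6)
    return F_new, CR_new
```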
3 The proposed algorithm

The proposed ANN regression on an ensemble of DE strategies (nnDE; pseudo-code in Algorithm 1) modifies the generation of the trial vector in the original DE algorithm. The trial vector y_i is produced from the original vector x_i by the default DE mutation strategy. A local search step (lines 5-10 in Algorithm 1) is then performed with probability p_r. The regression probability has a great impact on the performance of the algorithm, because it controls the number of local search steps to be launched. Therefore, it influences the exploration and exploitation of the evolutionary search process. The higher the probability of regression, the more local search steps are initiated. On the other hand, each local search demands additional processor time, which may degrade the performance of the algorithm. As a result, the proper value of this parameter needs to be found for each specific problem on a case-by-case basis.

During each local search, a regression ANN is trained using a training set T = {(t^{(k)}, x_best); k = 1, ..., P}, where each training pattern consists of an input vector t^{(k)} of dimension D and a vector of network target outputs, which is x_best in all cases. The input vectors t^{(k)} are derived from the currently processed population member x_i using randomly selected DE strategies from the ensemble of strategies (ES) collected in Table 1. The neural network thus has D inputs, 1 + log2 D hidden neurons, and D output neurons. The trained ANN performs the regression of the best found solution from the set of trial solutions provided by the ensemble of DE strategies.

When the strategies agree and the trial solutions are similar, the nonlinear transformation represented by the trained ANN performs a narrowly directed local search. When this is not the case, a random search ensues. After training, the regression vector r_i is obtained by introducing the trial vector y_i to the network. The regression vector replaces the trial vector and is used in its place for the subsequent fitness comparison with the original vector x_i. Note that only one fitness evaluation is spent during this local search step, because the generation of the regression vector is performed in the genotypic and not in the phenotypic search space.
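The local search step (lines 5-10 of Algorithm 1 below) can be sketched as follows. This is not the paper's implementation, which uses the OpenCV MLP with back-propagation; we substitute scikit-learn's MLPRegressor as a stand-in, and the training-set size P and the strategy callables are our own illustrative assumptions:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor  # stand-in for the OpenCV MLP

def ann_local_search(x_i, x_best, y_trial, strategies, P=10, epochs=1000,
                     rng=np.random.default_rng()):
    """Train a regression ANN on ensemble trials and regress y_trial to r_i.

    strategies -- callables, one per DE strategy of Table 1; each maps the
                  current member x_i to a trial vector t^(k)
    """
    D = len(x_i)
    # Training set T: ensemble trials as inputs, x_best as the common target
    inputs = np.array([strategies[rng.integers(len(strategies))](x_i)
                       for _ in range(P)])
    targets = np.tile(x_best, (P, 1))
    # D inputs, 1 + log2(D) hidden neurons, D outputs, as described above
    net = MLPRegressor(hidden_layer_sizes=(1 + int(np.log2(D)),),
                       max_iter=epochs, tol=1e-6)
    net.fit(inputs, targets)
    # The regression vector r_i replaces the trial vector y_i
    return net.predict(y_trial.reshape(1, -1))[0]
```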

Algorithm 1 The nnDE algorithm
1: Initialize the DE population x_i = (x_{i,1}, ..., x_{i,D}) for i = 1 ... NP, where NP is the population size.
2: repeat
3:   for i = 1 to NP do
4:     Generate the trial vector y_i from x_i using the default DE strategy;
5:     if rand() < p_r then
6:       Create a training set T from the vector x_i using the ES and the target vector x_best;
7:       Train the ANN using T until the number of training epochs epoch is exceeded;
8:       Build the regression vector r_i from y_i using the trained ANN;
9:       y_i = r_i;
10:    end if
11:    if f(y_i) < f(x_i) then
12:      x_i = y_i;
13:      if f(y_i) < f(x_best) then
14:        x_best = y_i;
15:      end if
16:    end if
17:  end for
18: until the DE termination condition is met

In general, the efficiency of the local search depends on two factors [24]: how often it is launched, and how extensively the local search process explores the search space. The former is controlled by the parameter p_r, while the latter depends on the number of epochs needed for training the ANN. Typically, a designer needs to choose between an often-launched short-term and a rarely launched long-term local search. The short-term local search demands a smaller number of ANN training epochs, while the long-term one demands a larger number. In this study, a rarely launched long-term local search heuristic is tested.
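A direct, runnable reading of Algorithm 1 is sketched below (our own Python, not the paper's code; de_trial condenses Eqs. (2)-(3) for a single member, and any callable with the shape of ann_local_search above can serve as the local search):

```python
import numpy as np

def de_trial(pop, i, F=0.5, CR=0.9, rng=None):
    """Default DE/rand/1/bin trial vector for member i (line 4)."""
    NP, D = pop.shape
    r0, r1, r2 = rng.choice([k for k in range(NP) if k != i], 3, replace=False)
    mutant = pop[r0] + F * (pop[r1] - pop[r2])
    mask = rng.random(D) <= CR
    mask[rng.integers(D)] = True
    return np.where(mask, mutant, pop[i])

def nnde(f_obj, pop, local_search, p_r=0.01, max_fes=100_000,
         rng=np.random.default_rng()):
    """Skeleton of the nnDE generation loop (Algorithm 1), minimization."""
    fitness = np.array([f_obj(x) for x in pop])
    b = int(np.argmin(fitness))
    x_best, f_best = pop[b].copy(), fitness[b]
    fes = len(pop)
    while fes < max_fes:                       # line 18: termination condition
        for i in range(len(pop)):
            y = de_trial(pop, i, rng=rng)      # line 4: default DE strategy
            if rng.random() < p_r:             # lines 5-10: ANN local search
                y = local_search(pop[i], x_best, y)
            f_y = f_obj(y); fes += 1           # one evaluation per candidate
            if f_y < fitness[i]:               # lines 11-16: greedy selection
                pop[i], fitness[i] = y, f_y
                if f_y < f_best:
                    x_best, f_best = y.copy(), f_y
    return x_best, f_best
```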
4 Experimental results

The goal of our experimental work is to show that using the ANN-based regression within DE (nnDE) and within the self-adaptive jDE [4] (nnjDE) can improve the results of the original DE and jDE algorithms. In line with this, a comparative study of the mentioned algorithms was performed by solving the CEC 2014 test suite. The SaDE [42] method was also included in this comparative study. In order to analyze the impact of the ANN regression on the original DE and jDE algorithms, the following issues were investigated: the impact of the regression probability p_r and the number of ANN training epochs epoch, the impact of the fitness function evaluations, and the impact of the problem dimensionality. A comparative efficiency study of the methods with and without the use of the ANN local search was performed, and their convergence graphs were analyzed.

The control parameters of the DE and nnDE algorithms were set as follows: F = 0.5, CR = 0.9, and NP = 100. The population size NP was the same for all compared algorithms; the proper value of this parameter mainly depends on the problem to be solved and was determined after extensive experimentation. The jDE and nnjDE algorithm parameters were set as follows: F ∈ [0.1, 1.0], CR ∈ [0.0, 1.0], τ_1 = τ_2 = 0.1. As a termination condition, the number of fitness function evaluations was used, as specified in the CEC 2014 benchmark suite, i.e., T_max = 10,000 · D. Each function was optimized 25 times. The ANN training in nnDE and nnjDE was terminated at 1000 epochs or when the training MSE dropped below 10^-6, whichever occurred first. The ANN implementation from the OpenCV library was used, which supports a back-propagation method of training; this was used in our tests with the learning rate and momentum scale set to 0.5 and 0.1, respectively.

4.1 Test suite

The CEC 2014 test suite (Table 2) consists of 30 benchmark functions that are divided into four classes: unimodal functions (1-3), simple multimodal functions (4-16), hybrid functions (17-22), and composition functions (23-30). Unimodal functions have a single global optimum and no local optima; the unimodal functions in this suite are non-separable and rotated. The multimodal functions are either separable or non-separable; in addition, they are rotated and/or shifted. To develop the hybrid functions, the variables are randomly divided into subcomponents, and then different basic functions are used for the different subcomponents [26]. Composition functions consist of a sum of two or more basic functions. In this suite, hybrid functions are used as the basic functions to construct the composition functions. The characteristics of these hybrid and composition functions depend on the characteristics of the basic functions. The functions of dimensions D = 10, D = 20, and D = 30 were used in our experiments.

Table 2 Summary of the CEC'14 test functions (F_i* = F_i(x*))

No. | Function | F_i*
Unimodal functions
1 | Rotated high conditioned elliptic function | 100
2 | Rotated bent cigar function | 200
3 | Rotated discus function | 300
Simple multimodal functions
4 | Shifted and rotated Rosenbrock's function | 400
5 | Shifted and rotated Ackley's function | 500
6 | Shifted and rotated Weierstrass function | 600
7 | Shifted and rotated Griewank's function | 700
8 | Shifted Rastrigin's function | 800
9 | Shifted and rotated Rastrigin's function | 900
10 | Shifted Schwefel's function | 1000
11 | Shifted and rotated Schwefel's function | 1100
12 | Shifted and rotated Katsuura function | 1200
13 | Shifted and rotated HappyCat function | 1300
14 | Shifted and rotated HGBat function | 1400
15 | Shifted and rotated expanded Griewank's plus Rosenbrock's function | 1500
16 | Shifted and rotated expanded Scaffer's F6 function | 1600
Hybrid functions
17 | Hybrid function 1 (N = 3) | 1700
18 | Hybrid function 2 (N = 3) | 1800
19 | Hybrid function 3 (N = 4) | 1900
20 | Hybrid function 4 (N = 4) | 2000
21 | Hybrid function 5 (N = 5) | 2100
22 | Hybrid function 6 (N = 5) | 2200
Composition functions
23 | Composition function 1 (N = 5) | 2300
24 | Composition function 2 (N = 3) | 2400
25 | Composition function 3 (N = 3) | 2500
26 | Composition function 4 (N = 5) | 2600
27 | Composition function 5 (N = 5) | 2700
28 | Composition function 6 (N = 5) | 2800
29 | Composition function 7 (N = 3) | 2900
30 | Composition function 8 (N = 3) | 3000

The search range of the problem variables is limited to x ∈ [−100, 100].

4.2 Impacts of the regression probability and the number of ANN training epochs

The goal of this experiment is twofold. Firstly, we aim to discover how the parameter p_r affects the results of the nnDE and nnjDE algorithms on the CEC 2014 test suite, and secondly, we want to explore how the number of ANN training epochs influences the results of the nnDE algorithm on the same test suite. In the first experiment, the probability of local search application was varied in the interval p_r ∈ [0.005, 0.05] in steps of 0.005, resulting in ten launch configurations per problem size D. The results of each configuration according to five statistical measures (i.e., minimum, maximum, average, median, and standard deviation), accumulated over 25 runs for each function, were aggregated into a statistical classifier consisting of 30 × 5 = 150 variables that served as input to the Friedman statistical test. The Friedman test [11] calculates the average method ranks over all problem instances (i.e., benchmark functions) for each of the test configurations.
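For readers who want to reproduce the ranking analysis, SciPy exposes the Friedman test directly; the sketch below uses toy data in place of the paper's measurements (the shapes, not the numbers, match the described setup of 150 variables per configuration):

```python
import numpy as np
from scipy.stats import friedmanchisquare

rng = np.random.default_rng(1)
scores = rng.random((150, 10))   # rows: variables, columns: 10 p_r settings

# Null hypothesis: all configurations perform equally well
stat, p_value = friedmanchisquare(*scores.T)
print(f"Friedman chi-square = {stat:.2f}, p = {p_value:.4f}")

# Average rank per configuration over all variables (lower is better for
# error measures; ties are ignored in this toy ranking)
avg_ranks = (np.argsort(np.argsort(scores, axis=1), axis=1) + 1).mean(axis=0)
print(avg_ranks)
```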

Fig. 2 Average rank differences of the nnDE (a) and nnjDE (b) algorithms achieved over all problem instances with D = 10 for different settings of p_r. The rank difference is expressed as bar height and direction. As a result, all bars higher than zero indicate that the corresponding hybrid DE algorithm outperformed the original DE algorithm under the particular parameter setting

For the case D = 10, Fig. 2 illustrates the differences of the average ranks achieved by nnDE and nnjDE in comparison with the original DE and jDE methods, respectively. In the figure, each positive average rank difference means that the corresponding instance of nnDE or nnjDE outperformed the original DE or jDE algorithm, and vice versa: negative average rank differences indicate that the original DE and jDE algorithms outperformed the nnDE and nnjDE algorithms. Two conclusions can be deduced from the experimental results. Firstly, the obtained results strongly depend on the probability of regression p_r. Secondly, the best results for D = 10 are obtained when p_r = 0.01, which complies with Piotrowski [41], who proposed performing the local search with probability p_r = … when 100 · D fitness function evaluations are spent per launch.

In the second experiment, the ANN training epochs in the nnDE algorithm were varied over epoch ∈ {100, 500, 1000, 2000, 5000} for benchmark functions of dimensions D = 10, D = 20, and D = 30. Extensive experiments showed that the setting p_r = 0.01, where one local search step is launched on average per generation when using the population size NP = 100, is not optimal. It turns out that the optimal value of this parameter lies in the range p_r ∈ (0.0, 0.01]. Therefore, it was varied as p_r ∈ {0.01, 0.005, 0.003} in our tests, which corresponds to one call of the local search heuristic every one, two, and three generations, respectively.

The average ranks obtained by the Friedman nonparametric tests for the experimental results obtained by nnDE with different problem dimensions are illustrated in Fig. 3. Each graph plots the average rank against the number of ANN training epochs, and each line corresponds to one of the tested values of p_r. Two facts can be concluded from the figure: the smaller the probability of regression, the better the results; and the higher the dimension of the problem, the smaller the number of epochs required. In order to obtain the best results when optimizing the benchmark functions of dimension D = 10, epoch = 2000 ANN training epochs are needed, while epoch = 100 is enough to obtain the best results for dimension D = 30. For the functions of dimension D = 20, the required number of ANN training epochs depends on the probability of regression, i.e., epoch ∈ {100, 500, 1000, 2000, 5000} and p_r ∈ {0.01, 0.005, 0.003}. Although infrequently executed, the local search step significantly influences the results of the optimization. On the other hand, the ANN training starts to dominate the optimization runtime with a growing number of training epochs. Fortunately, the smaller number of required epochs and the infrequent launching of local search steps keep the optimization tractable in higher dimensions.

Fig. 3 Influence of the epoch number on the results of the optimization obtained by nnDE, where each diagram consists of three curves representing the average ranks according to the specific probability of regression p_r. a D = 10, b D = 20, c D = 30

4.3 Impact of the fitness function evaluations

One of the more reliable indicators of search stagnation is when the best result is not improved for a long time. Alternatively, this can also mean that the search process is stuck in a local optimum. In order to detect these undesirable situations during the run of nnDE, the fitness values were monitored at three different phases of the search process, i.e., at 1/25, 1/5, and at the final fitness function evaluation. The results of this test are collated in Tables 3 and 4 for functions of dimension D = 20. As can be seen from Tables 3 and 4, nnjDE successfully progressed toward the global optimum on all benchmark functions, i.e., no stagnation of the search process was detected.

4.4 Impact of the problem dimensionality

The goal of this experiment is to discover how the quality of the results obtained by nnDE depends on the dimension of the problem. In line with this, three different dimensions of the benchmark functions, D ∈ {10, 20, 30}, were taken into account. The results of the tests according to the five measures are presented in Tables 5 and 6.

Table 3 Results of nnDE with p_r = … and epoch = 2000, showing the impact of the fitness function evaluations measured after 1/25, 1/5, and 1/1 of the maximum fitness function evaluations for dimension D = 20. Part 1/2. Columns: Func., FEs, Best, Worst, Mean, Median, Std. (Numeric entries lost in transcription.)

Table 3 continued. (Numeric entries lost in transcription.)

Table 4 Results of nnDE with p_r = … and epoch = 2000, showing the impact of the fitness function evaluations measured after 1/25, 1/5, and 1/1 of the maximum fitness function evaluations for dimension D = 20. Part 2/2. Columns: Func., FEs, Best, Worst, Mean, Median, Std. (Numeric entries lost in transcription.)

Table 4 continued. (Numeric entries lost in transcription.)

Table 5 Results of nnDE with p_r = … and epoch ∈ {2000, 2000, 100}, showing the impact of the dimensionality of the problem measured for function dimensions D ∈ {10, 20, 30}, respectively. Part 1/2. Columns: Func., Dim., Best, Worst, Mean, Median, Std. (Numeric entries lost in transcription.)

Table 5 continued. (Numeric entries lost in transcription.)

In this experiment, it was expected that the functions of higher dimensions would be harder to optimize, and that the obtained results would therefore be worse. As a matter of fact, this assumption holds in general, except for functions f7, f11, and f12, where nnDE achieved better results when optimizing the functions of dimension D = 20 than for the other dimensions. Interestingly, the results for the function f26 are equal for all algorithms in the test.

4.5 A comparative study

In order to show that the hybridization with the ANN regression improves the results of the original DE and jDE algorithms, a comparative study was performed. In this study, the results of the DE and jDE algorithms were compared with the results of nnDE and nnjDE, i.e., the hybridized versions of the DE algorithms with p_r = … and epoch = …. All the mentioned algorithms used the parameters as reported at the beginning of this section. The comparative study was widened by an additional self-adaptive DE algorithm, SaDE, with the following parameter settings: F is randomly selected from the normal distribution N(0.5, 0.3), while CR is randomly drawn from a normal distribution N(CRm_k, 0.1). The variable CRm_k denotes the average value of the parameter CR for the k-th strategy (four strategies were used in the ensemble of strategies) during the last LP = 20 (i.e., the learning rate parameter) generations.

The results according to the mean and standard deviation obtained by solving the CEC 2014 benchmark functions of dimension D = 30 are illustrated in Table 7, with the best results presented in bold. As can be seen from Table 7, nnjDE outperforms the other observed algorithms twelve times, SaDE eight times, nnDE and DE four times each, and jDE once. All algorithms achieved the same results for functions f23 and f26. Note that nnDE and SaDE obtained the same results for the function f9.

Table 6 Results of nnDE with p_r = … and epoch ∈ {2000, 2000, 100}, showing the impact of the dimensionality of the problem measured for function dimensions D ∈ {10, 20, 30}, respectively. Part 2/2. Columns: Func., Dim., Best, Worst, Mean, Median, Std. (Numeric entries lost in transcription.)

Table 6 continued. (Numeric entries lost in transcription.)

Table 7 Comparative study of the algorithms DE, nnDE, jDE, nnjDE, and SaDE with respect to the mean and standard deviation for function dimension D = 30. Columns: Func. (f1-f30), Meas. (Mean/StDev), DE, nnDE, jDE, nnjDE, SaDE. (Numeric entries lost in transcription.)

Table 7 continued. (Numeric entries lost in transcription.)

Fig. 4 Results of the Friedman nonparametric test, where each diagram illustrates the normalized average rank of the algorithms in the test for the specified dimensions of the benchmark functions. The closer to one the value of the algorithm's rank, the more significant the specific algorithm. a D = 10, b D = 20, c D = 30

Fig. 5 Convergence graphs for six selected functions from the benchmark function suite. a f6 (D = 10), b f8 (D = 10), c f12 (D = 20), d f18 (D = 20), e f22 (D = 30), f f28 (D = 30)

However, the statistical analysis also takes into account the minimum, maximum, median, and standard deviation values; this comparison is therefore more accurate. In summary, nnDE is better for solving problems of higher dimensions (i.e., D = 30), while nnjDE is better for solving problems of lower dimensions (i.e., D = 10 and D = 20).

In order to evaluate the quality of the results statistically, Friedman tests [19] were conducted, which compare the average ranks of the compared algorithms. A null hypothesis is posed, stating that all algorithms are equivalent and therefore their ranks should be equal. When the null hypothesis is rejected, the Nemenyi post hoc test [11] is performed, where the critical difference between the average ranks is calculated for each pair of algorithms. Three Friedman tests were performed on the values of the five measures obtained by optimizing the 30 functions of three different dimensions. As a result, each algorithm in the tests was compared with respect to 150 variables. The tests were conducted at the significance level ….

The results of the Friedman nonparametric test can be seen in Fig. 4, which is divided into three diagrams. Each diagram shows the ranks and confidence intervals (critical differences) for the algorithms under consideration with regard to each problem dimensionality. Note that a significant difference between two algorithms is observed if their confidence intervals, denoted as thickened lines in Fig. 4, do not overlap. Figure 4a-c shows that the original DE algorithm was significantly outperformed by all other algorithms in the test for all problem dimensions. The nnjDE algorithm exhibits the best results in dimensions D = 10 (Fig. 4a) and D = 20 (Fig. 4b), while nnDE dominates the competitors for D = 30 (Fig. 4c). As demonstrated, the proposed local search heuristic significantly improves the results of both the original DE and jDE algorithms, with the exception of the case nnjDE vs. jDE, where the advantage of nnjDE is not conclusive. Thereby, the assertion set out at the beginning of the section has been confirmed.

4.6 Convergence analysis

Convergence graphs were analyzed for functions f6 and f8 of dimension D = 10, functions f12 and f18 of dimension D = 20, and functions f22 and f28 of dimension D = 30. The best of the 25 optimization runs was analyzed. The convergence graphs are illustrated in Fig. 5, with two diagrams per problem dimension. The following observations can be made from these graphs: nnjDE outperforms the original jDE on all presented functions except f28; nnDE outperforms the original DE on all presented functions except f22 and f28; and all algorithms achieved similar results on the function f28. In summary, the presented results confirm that hybridizing the original DE and jDE algorithms with the ANN regression can improve the results of both.

5 Conclusion

Recently, hybridizing nature-inspired algorithms in order to expand their applicability and improve their performance has become a popular trend in computational intelligence [15,16]. This paper proposes the hybridization of a DE algorithm with an ANN-based regression as a way to apply a local search heuristic. The ANN functions as a predictor of the best solution from a training set of trial vectors produced by an ensemble of DE strategies. As a result, two hybrid DE algorithms were developed: nnDE, representing the hybridization of the original DE algorithm with the ANN, and nnjDE, representing the hybridization of the original jDE algorithm with the ANN.
The results of experiments conducted on the CEC 2014 test suite consisting of 30 benchmark functions have shown that the proposed hybrids substantially outperform their original predecessors. Moreover, the performance gap broadened when the dimensionality of the problem was increased. The experiments suggest that the quality of the results highly depends on the value of the parameter p_r, which determines the probability of local search execution; lower values of p_r are generally required for higher problem dimensions. These preliminary results advocate further investigation of the proposed hybridization in the future. As the first next step, however, we would like to expand …


A Hybrid Differential Evolution Algorithm Game Theory for the Berth Allocation Problem A Hybrd Dfferental Evoluton Algorthm ame Theory for the Berth Allocaton Problem Nasser R. Sabar, Sang Yew Chong, and raham Kendall The Unversty of Nottngham Malaysa Campus, Jalan Broga, 43500 Semenyh,

More information

Statistics for Managers Using Microsoft Excel/SPSS Chapter 13 The Simple Linear Regression Model and Correlation

Statistics for Managers Using Microsoft Excel/SPSS Chapter 13 The Simple Linear Regression Model and Correlation Statstcs for Managers Usng Mcrosoft Excel/SPSS Chapter 13 The Smple Lnear Regresson Model and Correlaton 1999 Prentce-Hall, Inc. Chap. 13-1 Chapter Topcs Types of Regresson Models Determnng the Smple Lnear

More information

Determining Transmission Losses Penalty Factor Using Adaptive Neuro Fuzzy Inference System (ANFIS) For Economic Dispatch Application

Determining Transmission Losses Penalty Factor Using Adaptive Neuro Fuzzy Inference System (ANFIS) For Economic Dispatch Application 7 Determnng Transmsson Losses Penalty Factor Usng Adaptve Neuro Fuzzy Inference System (ANFIS) For Economc Dspatch Applcaton Rony Seto Wbowo Maurdh Hery Purnomo Dod Prastanto Electrcal Engneerng Department,

More information

CIS526: Machine Learning Lecture 3 (Sept 16, 2003) Linear Regression. Preparation help: Xiaoying Huang. x 1 θ 1 output... θ M x M

CIS526: Machine Learning Lecture 3 (Sept 16, 2003) Linear Regression. Preparation help: Xiaoying Huang. x 1 θ 1 output... θ M x M CIS56: achne Learnng Lecture 3 (Sept 6, 003) Preparaton help: Xaoyng Huang Lnear Regresson Lnear regresson can be represented by a functonal form: f(; θ) = θ 0 0 +θ + + θ = θ = 0 ote: 0 s a dummy attrbute

More information

Comparison of the Population Variance Estimators. of 2-Parameter Exponential Distribution Based on. Multiple Criteria Decision Making Method

Comparison of the Population Variance Estimators. of 2-Parameter Exponential Distribution Based on. Multiple Criteria Decision Making Method Appled Mathematcal Scences, Vol. 7, 0, no. 47, 07-0 HIARI Ltd, www.m-hkar.com Comparson of the Populaton Varance Estmators of -Parameter Exponental Dstrbuton Based on Multple Crtera Decson Makng Method

More information

UCLA STAT 13 Introduction to Statistical Methods for the Life and Health Sciences. Chapter 11 Analysis of Variance - ANOVA. Instructor: Ivo Dinov,

UCLA STAT 13 Introduction to Statistical Methods for the Life and Health Sciences. Chapter 11 Analysis of Variance - ANOVA. Instructor: Ivo Dinov, UCLA STAT 3 ntroducton to Statstcal Methods for the Lfe and Health Scences nstructor: vo Dnov, Asst. Prof. of Statstcs and Neurology Chapter Analyss of Varance - ANOVA Teachng Assstants: Fred Phoa, Anwer

More information

BOOTSTRAP METHOD FOR TESTING OF EQUALITY OF SEVERAL MEANS. M. Krishna Reddy, B. Naveen Kumar and Y. Ramu

BOOTSTRAP METHOD FOR TESTING OF EQUALITY OF SEVERAL MEANS. M. Krishna Reddy, B. Naveen Kumar and Y. Ramu BOOTSTRAP METHOD FOR TESTING OF EQUALITY OF SEVERAL MEANS M. Krshna Reddy, B. Naveen Kumar and Y. Ramu Department of Statstcs, Osmana Unversty, Hyderabad -500 007, Inda. nanbyrozu@gmal.com, ramu0@gmal.com

More information

The Convergence Speed of Single- And Multi-Objective Immune Algorithm Based Optimization Problems

The Convergence Speed of Single- And Multi-Objective Immune Algorithm Based Optimization Problems The Convergence Speed of Sngle- And Mult-Obectve Immune Algorthm Based Optmzaton Problems Mohammed Abo-Zahhad Faculty of Engneerng, Electrcal and Electroncs Engneerng Department, Assut Unversty, Assut,

More information

Speeding up Computation of Scalar Multiplication in Elliptic Curve Cryptosystem

Speeding up Computation of Scalar Multiplication in Elliptic Curve Cryptosystem H.K. Pathak et. al. / (IJCSE) Internatonal Journal on Computer Scence and Engneerng Speedng up Computaton of Scalar Multplcaton n Ellptc Curve Cryptosystem H. K. Pathak Manju Sangh S.o.S n Computer scence

More information

8 Derivation of Network Rate Equations from Single- Cell Conductance Equations

8 Derivation of Network Rate Equations from Single- Cell Conductance Equations Physcs 178/278 - Davd Klenfeld - Wnter 2015 8 Dervaton of Network Rate Equatons from Sngle- Cell Conductance Equatons We consder a network of many neurons, each of whch obeys a set of conductancebased,

More information

Linear Approximation with Regularization and Moving Least Squares

Linear Approximation with Regularization and Moving Least Squares Lnear Approxmaton wth Regularzaton and Movng Least Squares Igor Grešovn May 007 Revson 4.6 (Revson : March 004). 5 4 3 0.5 3 3.5 4 Contents: Lnear Fttng...4. Weghted Least Squares n Functon Approxmaton...

More information

9 Derivation of Rate Equations from Single-Cell Conductance (Hodgkin-Huxley-like) Equations

9 Derivation of Rate Equations from Single-Cell Conductance (Hodgkin-Huxley-like) Equations Physcs 171/271 - Chapter 9R -Davd Klenfeld - Fall 2005 9 Dervaton of Rate Equatons from Sngle-Cell Conductance (Hodgkn-Huxley-lke) Equatons We consder a network of many neurons, each of whch obeys a set

More information

Linear Feature Engineering 11

Linear Feature Engineering 11 Lnear Feature Engneerng 11 2 Least-Squares 2.1 Smple least-squares Consder the followng dataset. We have a bunch of nputs x and correspondng outputs y. The partcular values n ths dataset are x y 0.23 0.19

More information

Chapter 3 Describing Data Using Numerical Measures

Chapter 3 Describing Data Using Numerical Measures Chapter 3 Student Lecture Notes 3-1 Chapter 3 Descrbng Data Usng Numercal Measures Fall 2006 Fundamentals of Busness Statstcs 1 Chapter Goals To establsh the usefulness of summary measures of data. The

More information

Durban Watson for Testing the Lack-of-Fit of Polynomial Regression Models without Replications

Durban Watson for Testing the Lack-of-Fit of Polynomial Regression Models without Replications Durban Watson for Testng the Lack-of-Ft of Polynomal Regresson Models wthout Replcatons Ruba A. Alyaf, Maha A. Omar, Abdullah A. Al-Shha ralyaf@ksu.edu.sa, maomar@ksu.edu.sa, aalshha@ksu.edu.sa Department

More information

Support Vector Machines. Vibhav Gogate The University of Texas at dallas

Support Vector Machines. Vibhav Gogate The University of Texas at dallas Support Vector Machnes Vbhav Gogate he Unversty of exas at dallas What We have Learned So Far? 1. Decson rees. Naïve Bayes 3. Lnear Regresson 4. Logstc Regresson 5. Perceptron 6. Neural networks 7. K-Nearest

More information

Numerical Heat and Mass Transfer

Numerical Heat and Mass Transfer Master degree n Mechancal Engneerng Numercal Heat and Mass Transfer 06-Fnte-Dfference Method (One-dmensonal, steady state heat conducton) Fausto Arpno f.arpno@uncas.t Introducton Why we use models and

More information

2E Pattern Recognition Solutions to Introduction to Pattern Recognition, Chapter 2: Bayesian pattern classification

2E Pattern Recognition Solutions to Introduction to Pattern Recognition, Chapter 2: Bayesian pattern classification E395 - Pattern Recognton Solutons to Introducton to Pattern Recognton, Chapter : Bayesan pattern classfcaton Preface Ths document s a soluton manual for selected exercses from Introducton to Pattern Recognton

More information

THE general problem tackled using an optimization

THE general problem tackled using an optimization Hgh-Dmensonal Real-Parameter Optmzaton usng Self-Adaptve Dfferental Evoluton Algorthm wth Populaton Sze Reducton Janez Brest, Member, IEEE, Aleš Zamuda, Student Member, IEEE, BorkoBoškovć, Student Member,

More information

Correlation and Regression. Correlation 9.1. Correlation. Chapter 9

Correlation and Regression. Correlation 9.1. Correlation. Chapter 9 Chapter 9 Correlaton and Regresson 9. Correlaton Correlaton A correlaton s a relatonshp between two varables. The data can be represented b the ordered pars (, ) where s the ndependent (or eplanator) varable,

More information

Pop-Click Noise Detection Using Inter-Frame Correlation for Improved Portable Auditory Sensing

Pop-Click Noise Detection Using Inter-Frame Correlation for Improved Portable Auditory Sensing Advanced Scence and Technology Letters, pp.164-168 http://dx.do.org/10.14257/astl.2013 Pop-Clc Nose Detecton Usng Inter-Frame Correlaton for Improved Portable Audtory Sensng Dong Yun Lee, Kwang Myung Jeon,

More information

1 Derivation of Rate Equations from Single-Cell Conductance (Hodgkin-Huxley-like) Equations

1 Derivation of Rate Equations from Single-Cell Conductance (Hodgkin-Huxley-like) Equations Physcs 171/271 -Davd Klenfeld - Fall 2005 (revsed Wnter 2011) 1 Dervaton of Rate Equatons from Sngle-Cell Conductance (Hodgkn-Huxley-lke) Equatons We consder a network of many neurons, each of whch obeys

More information

Statistics Chapter 4

Statistics Chapter 4 Statstcs Chapter 4 "There are three knds of les: les, damned les, and statstcs." Benjamn Dsrael, 1895 (Brtsh statesman) Gaussan Dstrbuton, 4-1 If a measurement s repeated many tmes a statstcal treatment

More information

ANSWERS. Problem 1. and the moment generating function (mgf) by. defined for any real t. Use this to show that E( U) var( U)

ANSWERS. Problem 1. and the moment generating function (mgf) by. defined for any real t. Use this to show that E( U) var( U) Econ 413 Exam 13 H ANSWERS Settet er nndelt 9 deloppgaver, A,B,C, som alle anbefales å telle lkt for å gøre det ltt lettere å stå. Svar er gtt . Unfortunately, there s a prntng error n the hnt of

More information

Chapter 9: Statistical Inference and the Relationship between Two Variables

Chapter 9: Statistical Inference and the Relationship between Two Variables Chapter 9: Statstcal Inference and the Relatonshp between Two Varables Key Words The Regresson Model The Sample Regresson Equaton The Pearson Correlaton Coeffcent Learnng Outcomes After studyng ths chapter,

More information

NP-Completeness : Proofs

NP-Completeness : Proofs NP-Completeness : Proofs Proof Methods A method to show a decson problem Π NP-complete s as follows. (1) Show Π NP. (2) Choose an NP-complete problem Π. (3) Show Π Π. A method to show an optmzaton problem

More information

The Minimum Universal Cost Flow in an Infeasible Flow Network

The Minimum Universal Cost Flow in an Infeasible Flow Network Journal of Scences, Islamc Republc of Iran 17(2): 175-180 (2006) Unversty of Tehran, ISSN 1016-1104 http://jscencesutacr The Mnmum Unversal Cost Flow n an Infeasble Flow Network H Saleh Fathabad * M Bagheran

More information

Real-Time Systems. Multiprocessor scheduling. Multiprocessor scheduling. Multiprocessor scheduling

Real-Time Systems. Multiprocessor scheduling. Multiprocessor scheduling. Multiprocessor scheduling Real-Tme Systems Multprocessor schedulng Specfcaton Implementaton Verfcaton Multprocessor schedulng -- -- Global schedulng How are tasks assgned to processors? Statc assgnment The processor(s) used for

More information

An Improved multiple fractal algorithm

An Improved multiple fractal algorithm Advanced Scence and Technology Letters Vol.31 (MulGraB 213), pp.184-188 http://dx.do.org/1.1427/astl.213.31.41 An Improved multple fractal algorthm Yun Ln, Xaochu Xu, Jnfeng Pang College of Informaton

More information

Atmospheric Environmental Quality Assessment RBF Model Based on the MATLAB

Atmospheric Environmental Quality Assessment RBF Model Based on the MATLAB Journal of Envronmental Protecton, 01, 3, 689-693 http://dxdoorg/10436/jep0137081 Publshed Onlne July 01 (http://wwwscrporg/journal/jep) 689 Atmospherc Envronmental Qualty Assessment RBF Model Based on

More information

Internet Engineering. Jacek Mazurkiewicz, PhD Softcomputing. Part 3: Recurrent Artificial Neural Networks Self-Organising Artificial Neural Networks

Internet Engineering. Jacek Mazurkiewicz, PhD Softcomputing. Part 3: Recurrent Artificial Neural Networks Self-Organising Artificial Neural Networks Internet Engneerng Jacek Mazurkewcz, PhD Softcomputng Part 3: Recurrent Artfcal Neural Networks Self-Organsng Artfcal Neural Networks Recurrent Artfcal Neural Networks Feedback sgnals between neurons Dynamc

More information

Particle Swarm Optimization with Adaptive Mutation in Local Best of Particles

Particle Swarm Optimization with Adaptive Mutation in Local Best of Particles 1 Internatonal Congress on Informatcs, Envronment, Energy and Applcatons-IEEA 1 IPCSIT vol.38 (1) (1) IACSIT Press, Sngapore Partcle Swarm Optmzaton wth Adaptve Mutaton n Local Best of Partcles Nanda ulal

More information

Chapter 8 Indicator Variables

Chapter 8 Indicator Variables Chapter 8 Indcator Varables In general, e explanatory varables n any regresson analyss are assumed to be quanttatve n nature. For example, e varables lke temperature, dstance, age etc. are quanttatve n

More information

We present the algorithm first, then derive it later. Assume access to a dataset {(x i, y i )} n i=1, where x i R d and y i { 1, 1}.

We present the algorithm first, then derive it later. Assume access to a dataset {(x i, y i )} n i=1, where x i R d and y i { 1, 1}. CS 189 Introducton to Machne Learnng Sprng 2018 Note 26 1 Boostng We have seen that n the case of random forests, combnng many mperfect models can produce a snglodel that works very well. Ths s the dea

More information

Note 10. Modeling and Simulation of Dynamic Systems

Note 10. Modeling and Simulation of Dynamic Systems Lecture Notes of ME 475: Introducton to Mechatroncs Note 0 Modelng and Smulaton of Dynamc Systems Department of Mechancal Engneerng, Unversty Of Saskatchewan, 57 Campus Drve, Saskatoon, SK S7N 5A9, Canada

More information

Online Classification: Perceptron and Winnow

Online Classification: Perceptron and Winnow E0 370 Statstcal Learnng Theory Lecture 18 Nov 8, 011 Onlne Classfcaton: Perceptron and Wnnow Lecturer: Shvan Agarwal Scrbe: Shvan Agarwal 1 Introducton In ths lecture we wll start to study the onlne learnng

More information

A Bayes Algorithm for the Multitask Pattern Recognition Problem Direct Approach

A Bayes Algorithm for the Multitask Pattern Recognition Problem Direct Approach A Bayes Algorthm for the Multtask Pattern Recognton Problem Drect Approach Edward Puchala Wroclaw Unversty of Technology, Char of Systems and Computer etworks, Wybrzeze Wyspanskego 7, 50-370 Wroclaw, Poland

More information

DO NOT OPEN THE QUESTION PAPER UNTIL INSTRUCTED TO DO SO BY THE CHIEF INVIGILATOR. Introductory Econometrics 1 hour 30 minutes

DO NOT OPEN THE QUESTION PAPER UNTIL INSTRUCTED TO DO SO BY THE CHIEF INVIGILATOR. Introductory Econometrics 1 hour 30 minutes 25/6 Canddates Only January Examnatons 26 Student Number: Desk Number:...... DO NOT OPEN THE QUESTION PAPER UNTIL INSTRUCTED TO DO SO BY THE CHIEF INVIGILATOR Department Module Code Module Ttle Exam Duraton

More information

Chapter - 2. Distribution System Power Flow Analysis

Chapter - 2. Distribution System Power Flow Analysis Chapter - 2 Dstrbuton System Power Flow Analyss CHAPTER - 2 Radal Dstrbuton System Load Flow 2.1 Introducton Load flow s an mportant tool [66] for analyzng electrcal power system network performance. Load

More information

Multi-Robot Formation Control Based on Leader-Follower Optimized by the IGA

Multi-Robot Formation Control Based on Leader-Follower Optimized by the IGA IOSR Journal of Computer Engneerng (IOSR-JCE e-issn: 2278-0661,p-ISSN: 2278-8727, Volume 19, Issue 1, Ver. III (Jan.-Feb. 2017, PP 08-13 www.osrjournals.org Mult-Robot Formaton Control Based on Leader-Follower

More information

The Chaotic Robot Prediction by Neuro Fuzzy Algorithm (2) = θ (3) = ω. Asin. A v. Mana Tarjoman, Shaghayegh Zarei

The Chaotic Robot Prediction by Neuro Fuzzy Algorithm (2) = θ (3) = ω. Asin. A v. Mana Tarjoman, Shaghayegh Zarei The Chaotc Robot Predcton by Neuro Fuzzy Algorthm Mana Tarjoman, Shaghayegh Zare Abstract In ths paper an applcaton of the adaptve neurofuzzy nference system has been ntroduced to predct the behavor of

More information

Transient Stability Assessment of Power System Based on Support Vector Machine

Transient Stability Assessment of Power System Based on Support Vector Machine ransent Stablty Assessment of Power System Based on Support Vector Machne Shengyong Ye Yongkang Zheng Qngquan Qan School of Electrcal Engneerng, Southwest Jaotong Unversty, Chengdu 610031, P. R. Chna Abstract

More information

Multilayer neural networks

Multilayer neural networks Lecture Multlayer neural networks Mlos Hauskrecht mlos@cs.ptt.edu 5329 Sennott Square Mdterm exam Mdterm Monday, March 2, 205 In-class (75 mnutes) closed book materal covered by February 25, 205 Multlayer

More information

FUZZY GOAL PROGRAMMING VS ORDINARY FUZZY PROGRAMMING APPROACH FOR MULTI OBJECTIVE PROGRAMMING PROBLEM

FUZZY GOAL PROGRAMMING VS ORDINARY FUZZY PROGRAMMING APPROACH FOR MULTI OBJECTIVE PROGRAMMING PROBLEM Internatonal Conference on Ceramcs, Bkaner, Inda Internatonal Journal of Modern Physcs: Conference Seres Vol. 22 (2013) 757 761 World Scentfc Publshng Company DOI: 10.1142/S2010194513010982 FUZZY GOAL

More information

Basic Business Statistics, 10/e

Basic Business Statistics, 10/e Chapter 13 13-1 Basc Busness Statstcs 11 th Edton Chapter 13 Smple Lnear Regresson Basc Busness Statstcs, 11e 009 Prentce-Hall, Inc. Chap 13-1 Learnng Objectves In ths chapter, you learn: How to use regresson

More information