Bounding the Population Size of IPOP-CMA-ES on the Noiseless BBOB Testbed
Tianjun Liao, IRIDIA, CoDE, Université Libre de Bruxelles (ULB), Brussels, Belgium
Thomas Stützle, IRIDIA, CoDE, Université Libre de Bruxelles (ULB), Brussels, Belgium

ABSTRACT
A variant of CMA-ES that uses occasional restarts coupled with an increasing population size, called IPOP-CMA-ES, has been shown to be a top-performing algorithm on the BBOB benchmark set. In this paper, we test a mechanism that bounds the maximum size that the population may reach in IPOP-CMA-ES, and we experimentally explore the impact of such a maximum population size on the BBOB benchmark set. In the proposed bounding mechanism, we use a maximum population size of 10·D², where D is the problem dimension. Once the maximum population size is reached or surpassed, the population size is reset to its default starting value λ, defined by λ = 4 + ⌊3 ln(D)⌋. Our experimental results show that this population-size update scheme can lead to improved performance on separable and weakly structured multi-modal functions.

Categories and Subject Descriptors
G.1.6 [Numerical Analysis]: Optimization—global optimization, unconstrained optimization; F.2.1 [Analysis of Algorithms and Problem Complexity]: Numerical Algorithms and Problems

General Terms
Algorithms

Keywords
Benchmarking, Black-box optimization

1. INTRODUCTION
IPOP-CMA-ES [1] is a variant of CMA-ES [11, 10] that uses occasional restarts, triggered when the search process is deemed to stagnate, combined with an increasing population size. IPOP-CMA-ES and several of its variants [2, 3, 4, 12] have shown very good results on the BBOB Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page.
To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. GECCO Companion, July 6–10, 2013, Amsterdam, The Netherlands. Copyright 2013 ACM. benchmark. In this paper, we base our analysis on IPOP-CMA-ES using its default parameter settings. In particular, the initial population size in IPOP-CMA-ES is set to λ = 4 + ⌊3 ln(D)⌋, where D is the dimension of the problem being tackled. At each restart, IPOP-CMA-ES increases the population size by a factor of two. This setting leads to an exponential increase of the population size in IPOP-CMA-ES in the number of restarts. In particular, on difficult, multi-modal functions many restarts may occur and, thus, very large population sizes may result if IPOP-CMA-ES does not find a solution better than the optimal threshold or a given target value. In this paper, we use a mechanism to bound the maximum population size that IPOP-CMA-ES may use. Bounding the maximum population size is motivated by the fact that very large populations may, at least theoretically, decrease performance [5]. However, CMA-ES may occasionally benefit from large populations [9], which is also the motivation for increasing the population size in IPOP-CMA-ES. Thus, we do not want our bound on the population size to be too restrictive: we set the upper bound to 10·D², which still allows rather large populations on higher-dimensional problems (e.g., 16000 for D = 40). Once this upper bound is reached, we reset the population size to its initial value, given by λ = 4 + ⌊3 ln(D)⌋. This reset also adds some robustness with respect to the particular maximum bound we use. We label the resulting IPOP-CMA-ES variant IPOP-10DDr; the original IPOP-CMA-ES is labeled IPOP.

2. EXPERIMENTAL PROCEDURE
We used the C version of IPOP-CMA-ES from Hansen's webpage, lri.fr/~hansen/cmaesintro.html.
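The population-size schedule described in the introduction (double at each restart, cap at 10·D², then reset to the default λ) can be sketched as follows. This is a minimal illustration under one plausible reading of the reset rule (reset as soon as doubling would reach or surpass the bound); the function names and the `bound_factor` parameter are ours, not identifiers from the authors' C implementation.

```python
import math

def default_lambda(dim):
    # Default initial CMA-ES population size: lambda = 4 + floor(3 * ln(D)).
    return 4 + int(3 * math.log(dim))

def next_population_size(current, dim, bound_factor=10):
    # IPOP-CMA-ES doubles the population size at each restart.
    doubled = 2 * current
    # Bounded variant (IPOP-10DDr): once the bound 10 * D^2 is reached
    # or surpassed, reset to the default starting value.
    if doubled >= bound_factor * dim * dim:
        return default_lambda(dim)
    return doubled

# Trace the schedule over nine restarts for a 10-dimensional problem.
dim = 10
sizes = [default_lambda(dim)]
for _ in range(9):
    sizes.append(next_population_size(sizes[-1], dim))
print(sizes)  # the population grows to 640, then resets to 10
```

For D = 10 the bound is 1000, so the sequence 10, 20, 40, ..., 640 resets before the next doubling to 1280 would exceed it; the unbounded original IPOP simply keeps doubling.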
To ensure that the final best solution is inside the bounds, the bound constraints are enforced by clamping each generated solution that violates a bound constraint to the nearest solution on the bounds. The default parameter settings of IPOP-CMA-ES were used. A maximum of 10^5·D function evaluations was used for the experiments.

3. RESULTS
The results from the experiments, which follow the experimental protocol [7] on the benchmark functions given in [6, 8], are presented in Figures 1 to 4 and in Tables 1 and 2. The expected running time (ERT), used in the figures and tables, depends on a given target function value,
f_t = f_opt + Δf, and it is computed across all relevant trials as the number of function evaluations executed during each trial while the best function value did not reach f_t, summed over all trials and divided by the number of trials that actually reached f_t [7, 13]. Statistical significance is tested with the rank-sum test for a given target f_t (10^-8 as in Figure 1) using, for each trial, either the number of function evaluations needed to reach f_t (inverted and multiplied by -1), or, if the target was not reached, the best f-value achieved, measured only up to the smallest number of overall function evaluations for any unsuccessful trial under consideration.

In the experiments, we found that IPOP-10DDr reaches solutions below the optimal threshold of 10^-8 in various cases where the default version of IPOP-CMA-ES, here labeled IPOP, could not find such solutions. This was the case for functions f (D = , ), f (D = 0), f (D = 0) and f (D = , ). Compared to IPOP, IPOP-10DDr uses fewer function evaluations to reach the optimal threshold on functions f (D = , , 0), f (D = ), f (D = , , , 0, 0), f (D = , , ) and f (D = 0, 0); only on functions f9 (D = 0) and f0 (D = , 0, 0) does IPOP-10DDr use slightly more function evaluations to reach the optimal threshold than IPOP.

We next examine the impact of the specific choice of the maximum population size. To do so, we explore another bounding mechanism, in which the maximum population size is set to a constant value of 100; any larger population size is truncated to 100. We label the resulting algorithm IPOP-100. Figure 1 shows that IPOP-10DDr clearly performs better than IPOP-100. To better situate the performance of IPOP-10DDr with respect to other variants of IPOP-CMA-ES, Figure 2 shows comparisons between IPOP-10DDr and the performance data of the IPOP-CMA-ES variants CMA_mah [2] and IPOP-saACM [12] on the aforementioned functions f, f, f, f9, f0, f, f, f, f.
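The ERT computation defined at the start of this section can be sketched as follows; the list-based data layout is a hypothetical illustration for clarity, not the BBOB post-processing code.

```python
def expected_running_time(evals, reached):
    # evals[i]: number of function evaluations spent in trial i while the
    # best function value had not yet reached the target f_t (for an
    # unsuccessful trial, this is its entire evaluation budget).
    # reached[i]: True if trial i eventually reached f_t.
    n_success = sum(reached)
    if n_success == 0:
        return float("inf")  # no successful trial: ERT is infinite
    # Sum over ALL trials, divide by the number of successful ones.
    return sum(evals) / n_success

# Three trials: two successes (1000 and 2000 evals) and one failure
# that exhausted a 5000-evaluation budget.
ert = expected_running_time([1000, 5000, 2000], [True, False, True])
print(ert)  # -> 4000.0
```

Note that the failed trial's budget inflates the ERT of the successful ones, which is exactly why ERT rewards both speed and reliability.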
We find that IPOP-10DDr reaches the optimal threshold on functions f (D = ), f9 (D = 0), f (D = 0), f (D = 0) and f (D = ), where neither CMA_mah nor IPOP-saACM can reach it. On functions f (D = , , 0), f (D = , ), f (D = ), f (D = , ) and f (D = , , 0), IPOP-10DDr uses fewer function evaluations to reach the optimal threshold than CMA_mah and IPOP-saACM.

4. CPU TIMING EXPERIMENT
IPOP-10DDr was run on f8 until at least 30 seconds had passed. These experiments were conducted with an Intel Xeon CPU on Linux. The results were .e 0, .E 0, .E 0, 9.E 0, .E 0 and .0e 0 seconds per function evaluation in dimensions 2, 3, 5, 10, 20 and 40, respectively.

5. CONCLUSIONS
In this paper, we have studied the impact of bounding the population size in IPOP-CMA-ES, combined with a re-initialization of the population size once the bound is hit. Obviously, using a maximum population size of 10·D² does not worsen results on functions that are easy for IPOP-CMA-ES, that is, on functions where IPOP-CMA-ES finds the optimum within the first trial or after very few restarts; in such cases the bound never takes effect. However, on various difficult, multi-modal functions we observed improved performance of our new IPOP-CMA-ES variant over the default IPOP-CMA-ES. Hence, these results encourage us to explore bounds on the maximum population size also for other IPOP-CMA-ES variants such as BIPOP-CMA-ES. Finally, one may further explore different settings for the bound on the maximum population size, which may lead to further improvements in performance.

6. ACKNOWLEDGMENTS
We would like to thank the BBOB team for their great and hard work. This work was supported by the Meta-X project, funded by the Scientific Research Directorate of the French Community of Belgium. Thomas Stützle acknowledges support from the Belgian F.R.S.-FNRS, of which he is a Research Associate. Tianjun Liao acknowledges a fellowship from the China Scholarship Council.

7. REFERENCES
[1] A. Auger and N. Hansen. A restart CMA evolution strategy with increasing population size.
In Proceedings of the IEEE Congress on Evolutionary Computation (CEC 2005), pages 1769–1776. IEEE Press, 2005.
[2] D. Brockhoff, A. Auger, and N. Hansen. On the impact of active covariance matrix adaptation in the CMA-ES with mirrored mutations and small initial population size on the noiseless BBOB testbed. In GECCO 2012 (Companion). ACM, 2012.
[3] D. Brockhoff, A. Auger, and N. Hansen. Comparing mirrored mutations and active covariance matrix adaptation in the IPOP-CMA-ES on the noiseless BBOB testbed. In T. Soule, editor, GECCO 2012 (Companion). ACM, 2012.
[4] D. Brockhoff, A. Auger, and N. Hansen. On the impact of a small initial population size in the IPOP active CMA-ES with mirrored mutations on the noiseless BBOB testbed. In T. Soule, editor, GECCO 2012 (Companion). ACM, 2012.
[5] T. Chen, K. Tang, G. Chen, and X. Yao. A large population size can be unhelpful in evolutionary algorithms. Theoretical Computer Science, 436:54–70, 2012.
[6] S. Finck, N. Hansen, R. Ros, and A. Auger. Real-parameter black-box optimization benchmarking 2009: Presentation of the noiseless functions. Technical Report 2009/20, Research Center PPE, 2009. Updated February 2010.
[7] N. Hansen, A. Auger, S. Finck, and R. Ros. Real-parameter black-box optimization benchmarking 2012: Experimental setup. Technical report, INRIA, 2012.
[8] N. Hansen, S. Finck, R. Ros, and A. Auger. Real-parameter black-box optimization benchmarking 2009: Noiseless functions definitions. Technical Report RR-6829, INRIA, 2009. Updated February 2010.
[9] N. Hansen and S. Kern. Evaluating the CMA evolution strategy on multimodal test functions. In Parallel Problem Solving from Nature – PPSN VIII, volume 3242 of Lecture Notes in Computer Science, pages 282–291. Springer, Heidelberg, Germany, 2004.
[10] N. Hansen, S. Müller, and P. Koumoutsakos. Reducing the time complexity of the derandomized evolution
[Figure 1: 24 ERT scaling panels for IPOP-10DDr, IPOP and IPOP-100, one per noiseless BBOB function: Sphere; Ellipsoid separable; Rastrigin separable; Skew Rastrigin-Bueche separable; Linear slope; Attractive sector; Step-ellipsoid; Rosenbrock original; Rosenbrock rotated; Ellipsoid; Discus; Bent cigar; Sharp ridge; Sum of different powers; Rastrigin; Weierstrass; Schaffer F7 condition 10; Schaffer F7 condition 1000; Griewank-Rosenbrock F8F2; Schwefel x·sin(x); Gallagher 101 peaks; Gallagher 21 peaks; Katsuuras; Lunacek bi-Rastrigin.]

Figure 1: Expected running time (ERT in number of f-evaluations) divided by dimension for target function value 10^-8, as log10 values versus dimension. Different symbols correspond to different algorithms given in the legend of f1 and f24. Light symbols give the maximum number of function evaluations from the longest trial divided by dimension. Horizontal lines give linear scaling, slanted dotted lines give quadratic scaling. Black stars indicate a statistically better result compared to all other algorithms, with p < 0.01 and Bonferroni correction by the number of dimensions (six). Legend: IPOP-10DDr, IPOP, IPOP-100.
[Figure 2: ERT panels for the nine multi-modal functions compared: Rastrigin separable; Skew Rastrigin-Bueche separable; Weierstrass; Griewank-Rosenbrock F8F2; Schwefel x·sin(x); Gallagher 101 peaks; Gallagher 21 peaks; Katsuuras; Lunacek bi-Rastrigin.]

Figure 2: Comparing IPOP-10DDr to CMA_mah and IPOP-saACM. CMA_mah is the active covariance matrix adaptation version of IPOP-CMA-ES with mirrored mutations and a small initial population size. IPOP-saACM is the self-adaptive surrogate-assisted version of IPOP-CMA-ES. Expected running time (ERT) divided by dimension for target function value 10^-8, as log10 values. Different symbols correspond to different algorithms given in the legend. Light symbols give the maximum number of function evaluations from all trials divided by dimension. Horizontal lines give linear scaling, slanted dotted lines give quadratic scaling.

strategy with covariance matrix adaptation (CMA-ES). Evolutionary Computation, 11(1):1–18, 2003.
[11] N. Hansen and A. Ostermeier. Completely derandomized self-adaptation in evolution strategies. Evolutionary Computation, 9(2):159–195, 2001.
[12] I. Loshchilov, M. Schoenauer, and M. Sebag. Black-box optimization benchmarking of IPOP-saACM-ES and BIPOP-saACM-ES on the BBOB-2012 noiseless testbed. In T. Soule, editor, GECCO 2012 (Companion). ACM, 2012.
[13] K. Price. Differential evolution vs. the functions of the second ICEO. In Proceedings of the IEEE International Congress on Evolutionary Computation, pages 153–157. IEEE Press, 1997.
[Figure 3: six ECDF panels in 5-D, one per function subgroup (separable fcts f1–f5, moderate fcts f6–f9, ill-conditioned fcts f10–f14, multi-modal fcts f15–f19, weakly structured multi-modal fcts f20–f24, all functions f1–f24); x-axes give log10 of (# f-evals / dimension); each panel compares best 2009, IPOP-10DDr, IPOP and IPOP-100.]

Figure 3: Bootstrapped empirical cumulative distribution of the number of objective function evaluations divided by dimension (FEvals/D) for 50 targets in 10^[-8..2] for all functions and subgroups in 5-D. The best 2009 line corresponds to the best ERT observed during BBOB 2009 for each single target.
[Figure 4: six ECDF panels in 20-D, one per function subgroup (separable fcts f1–f5, moderate fcts f6–f9, ill-conditioned fcts f10–f14, multi-modal fcts f15–f19, weakly structured multi-modal fcts f20–f24, all functions f1–f24); x-axes give log10 of (# f-evals / dimension); each panel compares best 2009, IPOP-10DDr, IPOP and IPOP-100.]

Figure 4: Bootstrapped empirical cumulative distribution of the number of objective function evaluations divided by dimension (FEvals/D) for 50 targets in 10^[-8..2] for all functions and subgroups in 20-D. The best 2009 line corresponds to the best ERT observed during BBOB 2009 for each single target.
[Table 1: per-function ERT entries for IPOP-10DDr, IPOP and IPOP-100 on f1–f24 in dimension 5.]

Table 1: Expected running time (ERT in number of function evaluations) divided by the respective best ERT measured during BBOB 2009 (given in the respective first row) for different Δf values in dimension 5. The central 80% range divided by two is given in braces. The median number of conducted function evaluations is additionally given in italics if ERT(10^-7) = ∞. #succ is the number of trials that reached the final target f_opt + 10^-8. Best results are printed in bold. -10DD in the table denotes IPOP-10DDr.
[Table 2: per-function ERT entries for IPOP-10DDr, IPOP and IPOP-100 on f1–f24 in dimension 20.]

Table 2: Expected running time (ERT in number of function evaluations) divided by the respective best ERT measured during BBOB 2009 (given in the respective first row) for different Δf values in dimension 20. The central 80% range divided by two is given in braces. The median number of conducted function evaluations is additionally given in italics if ERT(10^-7) = ∞. #succ is the number of trials that reached the final target f_opt + 10^-8. Best results are printed in bold. -10DD in the table denotes IPOP-10DDr.
A comparative introduction to two optimization classics: the Nelder-Mead and the CMA-ES algorithms Rodolphe Le Riche 1,2 1 Ecole des Mines de Saint Etienne, France 2 CNRS LIMOS France March 2018 MEXICO
More informationStochastic optimization and a variable metric approach
The challenges for stochastic optimization and a variable metric approach Microsoft Research INRIA Joint Centre, INRIA Saclay April 6, 2009 Content 1 Introduction 2 The Challenges 3 Stochastic Search 4
More informationTesting Gaussian Process Surrogates on CEC 2013 multi-modal benchmark
ITAT 2016 Proceedings, CEUR Workshop Proceedings Vol. 1649, pp. 138 146 http://ceur-ws.org/vol-1649, Series ISSN 1613-0073, c 2016 N. Orekhov, L. Bajer, M. Holeňa Testing Gaussian Process Surrogates on
More informationTutorial CMA-ES Evolution Strategies and Covariance Matrix Adaptation
Tutorial CMA-ES Evolution Strategies and Covariance Matrix Adaptation Anne Auger & Nikolaus Hansen INRIA Research Centre Saclay Île-de-France Project team TAO University Paris-Sud, LRI (UMR 8623), Bat.
More informationEfficient Natural Evolution Strategies
Efficient Natural Evolution Strategies Evolution Strategies and Evolutionary Programming Track ABSTRACT Yi Sun Manno 698, Switzerland yi@idsiach Tom Schaul Manno 698, Switzerland tom@idsiach Efficient
More informationMulti-objective Optimization with Unbounded Solution Sets
Multi-objective Optimization with Unbounded Solution Sets Oswin Krause Dept. of Computer Science University of Copenhagen Copenhagen, Denmark oswin.krause@di.ku.dk Tobias Glasmachers Institut für Neuroinformatik
More informationTransitional Particle Swarm Optimization
International Journal of Electrical and Computer Engineering (IJECE) Vol. 7, No. 3, June 7, pp. 6~69 ISSN: 88-878, DOI:.59/ijece.v7i3.pp6-69 6 Transitional Particle Swarm Optimization Nor Azlina Ab Aziz,
More informationCrossover and the Different Faces of Differential Evolution Searches
WCCI 21 IEEE World Congress on Computational Intelligence July, 18-23, 21 - CCIB, Barcelona, Spain CEC IEEE Crossover and the Different Faces of Differential Evolution Searches James Montgomery Abstract
More informationParticle Swarm Optimization with Velocity Adaptation
In Proceedings of the International Conference on Adaptive and Intelligent Systems (ICAIS 2009), pp. 146 151, 2009. c 2009 IEEE Particle Swarm Optimization with Velocity Adaptation Sabine Helwig, Frank
More informationThe CMA Evolution Strategy: A Tutorial
The CMA Evolution Strategy: A Tutorial Nikolaus Hansen November 6, 205 Contents Nomenclature 3 0 Preliminaries 4 0. Eigendecomposition of a Positive Definite Matrix... 5 0.2 The Multivariate Normal Distribution...
More informationNew Instances for the Single Machine Total Weighted Tardiness Problem
HELMUT-SCHMIDT-UNIVERSITÄT UNIVERSITÄT DER BUNDESWEHR HAMBURG LEHRSTUHL FÜR BETRIEBSWIRTSCHAFTSLEHRE, INSBES. LOGISTIK-MANAGEMENT Prof. Dr. M. J. Geiger Arbeitspapier / Research Report RR-10-03-01 March
More informationThree Steps toward Tuning the Coordinate Systems in Nature-Inspired Optimization Algorithms
Three Steps toward Tuning the Coordinate Systems in Nature-Inspired Optimization Algorithms Yong Wang and Zhi-Zhong Liu School of Information Science and Engineering Central South University ywang@csu.edu.cn
More informationThree Steps toward Tuning the Coordinate Systems in Nature-Inspired Optimization Algorithms
Three Steps toward Tuning the Coordinate Systems in Nature-Inspired Optimization Algorithms Yong Wang and Zhi-Zhong Liu School of Information Science and Engineering Central South University ywang@csu.edu.cn
More informationStochastic Local Search in Continuous Domains
Stochastic Local Search in Continuous Domains Questions to be Answered When Designing A Novel Algorithm Petr Pošík Czech Technical University in Prague Faculty of Electrical Engineering, Department of
More informationDynamic Optimization using Self-Adaptive Differential Evolution
Dynamic Optimization using Self-Adaptive Differential Evolution IEEE Congress on Evolutionary Computation (IEEE CEC 2009), Trondheim, Norway, May 18-21, 2009 J. Brest, A. Zamuda, B. Bošković, M. S. Maučec,
More informationTest Generation for Designs with Multiple Clocks
39.1 Test Generation for Designs with Multiple Clocks Xijiang Lin and Rob Thompson Mentor Graphics Corp. 8005 SW Boeckman Rd. Wilsonville, OR 97070 Abstract To improve the system performance, designs with
More informationInteger Least Squares: Sphere Decoding and the LLL Algorithm
Integer Least Squares: Sphere Decoding and the LLL Algorithm Sanzheng Qiao Department of Computing and Software McMaster University 28 Main St. West Hamilton Ontario L8S 4L7 Canada. ABSTRACT This paper
More informationA CMA-ES for Mixed-Integer Nonlinear Optimization
A CMA-ES for Mixed-Integer Nonlinear Optimization Nikolaus Hansen To cite this version: Nikolaus Hansen. A CMA-ES for Mixed-Integer Nonlinear Optimization. [Research Report] RR-, INRIA.. HAL Id: inria-
More informationCenter-based initialization for large-scale blackbox
See discussions, stats, and author profiles for this publication at: http://www.researchgate.net/publication/903587 Center-based initialization for large-scale blackbox problems ARTICLE FEBRUARY 009 READS
More informationAdaptive Differential Evolution and Exponential Crossover
Proceedings of the International Multiconference on Computer Science and Information Technology pp. 927 931 ISBN 978-83-60810-14-9 ISSN 1896-7094 Adaptive Differential Evolution and Exponential Crossover
More informationTotal Ordering on Subgroups and Cosets
Total Ordering on Subgroups and Cosets Alexander Hulpke Department of Mathematics Colorado State University 1874 Campus Delivery Fort Collins, CO 80523-1874 hulpke@math.colostate.edu Steve Linton Centre
More informationIEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION 1
IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION 1 Population-Based Algorithm Portfolios for Numerical Optimization Fei Peng, Student Member, IEEE, Ke Tang, Member, IEEE, Guoliang Chen, and Xin Yao, Fellow,
More informationEgocentric Particle Swarm Optimization
Egocentric Particle Swarm Optimization Foundations of Evolutionary Computation Mandatory Project 1 Magnus Erik Hvass Pedersen (971055) February 2005, Daimi, University of Aarhus 1 Introduction The purpose
More informationParameter Sensitivity Analysis of Social Spider Algorithm
Parameter Sensitivity Analysis of Social Spider Algorithm James J.Q. Yu, Student Member, IEEE and Victor O.K. Li, Fellow, IEEE Department of Electrical and Electronic Engineering The University of Hong
More informationReducing Delay Uncertainty in Deeply Scaled Integrated Circuits Using Interdependent Timing Constraints
Reducing Delay Uncertainty in Deeply Scaled Integrated Circuits Using Interdependent Timing Constraints Emre Salman and Eby G. Friedman Department of Electrical and Computer Engineering University of Rochester
More informationMulti-start JADE with knowledge transfer for numerical optimization
Multi-start JADE with knowledge transfer for numerical optimization Fei Peng, Ke Tang,Guoliang Chen and Xin Yao Abstract JADE is a recent variant of Differential Evolution (DE) for numerical optimization,
More informationAn Enhanced Aggregation Pheromone System for Real-Parameter Optimization in the ACO Metaphor
An Enhanced Aggregation Pheromone System for Real-Parameter ptimization in the AC Metaphor Shigeyoshi Tsutsui Hannan University, Matsubara, saka 58-852, Japan tsutsui@hannan-u.ac.jp Abstract. In previous
More informationBeta Damping Quantum Behaved Particle Swarm Optimization
Beta Damping Quantum Behaved Particle Swarm Optimization Tarek M. Elbarbary, Hesham A. Hefny, Atef abel Moneim Institute of Statistical Studies and Research, Cairo University, Giza, Egypt tareqbarbary@yahoo.com,
More informationBlock Diagonal Natural Evolution Strategies
Block Diagonal Natural Evolution Strategies Giuseppe Cuccu and Faustino Gomez IDSIA USI-SUPSI 6928 Manno-Lugano, Switzerland {giuse,tino}@idsia.ch http://www.idsia.ch/{~giuse,~tino} Abstract. The Natural
More informationTHE objective of global optimization is to find the
Large Scale Global Optimization Using Differential Evolution With Self-adaptation and Cooperative Co-evolution Aleš Zamuda, Student Member, IEEE, Janez Brest, Member, IEEE, Borko Bošković, Student Member,
More informationBlame and Coercion: Together Again for the First Time
Blame and Coercion: Together Again for the First Time Supplementary Material Jeremy Siek Indiana University, USA jsiek@indiana.edu Peter Thiemann Universität Freiburg, Germany thiemann@informatik.uni-freiburg.de
More informationNoisy Optimization: A Theoretical Strategy Comparison of ES, EGS, SPSA & IF on the Noisy Sphere
Noisy Optimization: A Theoretical Strategy Comparison of ES, EGS, SPSA & IF on the Noisy Sphere S. Finck Vorarlberg University of Applied Sciences Hochschulstrasse 1 Dornbirn, Austria steffen.finck@fhv.at
More informationPARTICLE swarm optimization (PSO) is one powerful and. A Competitive Swarm Optimizer for Large Scale Optimization
IEEE TRANSACTIONS ON CYBERNETICS, VOL. XX, NO. X, XXXX XXXX 1 A Competitive Swarm Optimizer for Large Scale Optimization Ran Cheng and Yaochu Jin, Senior Member, IEEE Abstract In this paper, a novel competitive
More informationEvolutionary Ensemble Strategies for Heuristic Scheduling
0 International Conference on Computational Science and Computational Intelligence Evolutionary Ensemble Strategies for Heuristic Scheduling Thomas Philip Runarsson School of Engineering and Natural Science
More informationDynamic Search Fireworks Algorithm with Covariance Mutation for Solving the CEC 2015 Learning Based Competition Problems
Dynamic Search Fireworks Algorithm with Covariance Mutation for Solving the CEC 05 Learning Based Competition Problems Chao Yu, Ling Chen Kelley,, and Ying Tan, The Key Laboratory of Machine Perception
More informationB-Positive Particle Swarm Optimization (B.P.S.O)
Int. J. Com. Net. Tech. 1, No. 2, 95-102 (2013) 95 International Journal of Computing and Network Technology http://dx.doi.org/10.12785/ijcnt/010201 B-Positive Particle Swarm Optimization (B.P.S.O) Muhammad
More informationREMEDA: Random Embedding EDA for optimising functions with intrinsic dimension
REMEDA: Random Embedding EDA for optimising functions with intrinsic dimension Momodou L. Sanyang 1,2 and Ata Kabán 1 1 School of Computer Science, University of Birmingham, Edgbaston, UK, B15 2TT., {M.L.Sanyang,
More informationWhite Hole-Black Hole Algorithm
White Hole-Black Hole Algorithm Suad Khairi Mohammed Faculty of Electrical and Electronics Engineering University Malaysia Pahang Pahang, Malaysia suad.khairim@gmail.com Zuwairie Ibrahim Faculty of Electrical
More informationEvolving cognitive and social experience in Particle Swarm Optimization through Differential Evolution
Evolving cognitive and social experience in Particle Swarm Optimization through Differential Evolution Michael G. Epitropakis, Member, IEEE, Vassilis P. Plagianakos and Michael N. Vrahatis Abstract In
More informationCLASSICAL gradient methods and evolutionary algorithms
IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION, VOL. 2, NO. 2, JULY 1998 45 Evolutionary Algorithms and Gradient Search: Similarities and Differences Ralf Salomon Abstract Classical gradient methods and
More informationOn the Impact of Objective Function Transformations on Evolutionary and Black-Box Algorithms
On the Impact of Objective Function Transformations on Evolutionary and Black-Box Algorithms [Extended Abstract] Tobias Storch Department of Computer Science 2, University of Dortmund, 44221 Dortmund,
More informationAdaptive Coordinate Descent
Aaptive Coorinate Descent Ilya Loshchilov TAO, INRIA Saclay U. Paris Su, F-9 Orsay Marc Schoenauer TAO, INRIA Saclay U. Paris Su, F-9 Orsay firstname.lastname@inria.fr Michèle Sebag CNRS, LRI UMR 86 U.
More informationMethodology to Achieve Higher Tolerance to Delay Variations in Synchronous Circuits
Methodology to Achieve Higher Tolerance to Delay Variations in Synchronous Circuits Emre Salman and Eby G. Friedman Department of Electrical and Computer Engineering University of Rochester Rochester,
More informationToward Effective Initialization for Large-Scale Search Spaces
Toward Effective Initialization for Large-Scale Search Spaces Shahryar Rahnamayan University of Ontario Institute of Technology (UOIT) Faculty of Engineering and Applied Science 000 Simcoe Street North
More informationA Study of the Dirichlet Priors for Term Frequency Normalisation
A Study of the Dirichlet Priors for Term Frequency Normalisation ABSTRACT Ben He Department of Computing Science University of Glasgow Glasgow, United Kingdom ben@dcs.gla.ac.uk In Information Retrieval
More informationIMPROVEMENT OF CLASSICAL EVOLUTIONARY PROGRAMMING USING STATE FEEDBACK CONTROLLER. Received October 2012; revised October 2013
International Journal of Innovative Computing, Information and Control ICIC International c 2014 ISSN 1349-4198 Volume 10, Number 4, August 2014 pp. 1413 1433 IMPROVEMENT OF CLASSICAL EVOLUTIONARY PROGRAMMING
More informationStochastic Search using the Natural Gradient
Keywords: stochastic search, natural gradient, evolution strategies Sun Yi Daan Wierstra Tom Schaul Jürgen Schmidhuber IDSIA, Galleria 2, Manno 6928, Switzerland yi@idsiach daan@idsiach tom@idsiach juergen@idsiach
More informationParameter Estimation of Complex Chemical Kinetics with Covariance Matrix Adaptation Evolution Strategy
MATCH Communications in Mathematical and in Computer Chemistry MATCH Commun. Math. Comput. Chem. 68 (2012) 469-476 ISSN 0340-6253 Parameter Estimation of Complex Chemical Kinetics with Covariance Matrix
More informationEfficient Covariance Matrix Update for Variable Metric Evolution Strategies
Efficient Covariance Matrix Update for Variable Metric Evolution Strategies Thorsten Suttorp, Nikolaus Hansen, Christian Igel To cite this version: Thorsten Suttorp, Nikolaus Hansen, Christian Igel. Efficient
More informationMajority Rule with Differential Latency: An Absorbing Markov Chain to Model Consensus
Université Libre de Bruxelles Institut de Recherches Interdisciplinaires et de Développements en Intelligence Artificielle Majority Rule with Differential Latency: An Absorbing Markov Chain to Model Consensus
More informationLarge Scale Continuous EDA Using Mutual Information
Large Scale Continuous EDA Using Mutual Information Qi Xu School of Computer Science University of Birmingham Email: qxx506@student.bham.ac.uk Momodou L. Sanyang School of Computer Science University of
More informationTwo Adaptive Mutation Operators for Optima Tracking in Dynamic Optimization Problems with Evolution Strategies
Two Adaptive Mutation Operators for Optima Tracking in Dynamic Optimization Problems with Evolution Strategies Claudio Rossi Claudio.Rossi@upm.es Antonio Barrientos Antonio.Barrientos@upm.es Jaime del
More information