Mirrored Variants of the (1,4)-CMA-ES Compared on the Noiseless BBOB-2010 Testbed


[Black-Box Optimization Benchmarking Workshop]

Anne Auger, Dimo Brockhoff, and Nikolaus Hansen
Projet TAO, INRIA Saclay Ile-de-France
LRI, Bât 490, Univ. Paris-Sud, 91405 Orsay Cedex, France
firstname.lastname@inria.fr

© ACM 2010. This is the authors' version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published at GECCO 2010, July 7-11, 2010, Portland, OR, USA.

ABSTRACT

Derandomization by means of mirrored samples has recently been introduced to enhance the performance of (1,λ)-Evolution Strategies (ESs), with the aim of designing fast and robust stochastic local search algorithms. This paper compares, on the BBOB-2010 noiseless benchmark testbed, two variants of the (1,4)-CMA-ES in which mirrored samples are used. Independent restarts are conducted up to a total budget of 10^4 D function evaluations, where D is the dimension of the search space. The results show that the improved variants are significantly faster than the baseline (1,4)-CMA-ES on several functions in 20-D (respectively on additional functions when sequential selection is used as well), with the largest speed-up observed on the attractive sector function. In no case is the (1,4)-CMA-ES significantly faster on any tested target function value in 5-D and 20-D. Moreover, the algorithm employing both mirroring and sequential selection is significantly better than the algorithm without sequential selection on five functions in 20-D, with noticeably smaller expected running times.

Categories and Subject Descriptors

G.1.6 [Numerical Analysis]: Optimization: global optimization, unconstrained optimization; F.2.1 [Analysis of Algorithms and Problem Complexity]: Numerical Algorithms and Problems

General Terms

Algorithms

1. INTRODUCTION

Evolution Strategies (ESs) are robust stochastic search algorithms for numerical optimization where the function to be minimized, f, maps the continuous search space R^D into R, where D is the dimension of the search space. Recently, a new derandomization technique that replaces the independent sampling of new solutions by mirrored sampling has been introduced to enhance the performance of ESs [1]. In this paper, we quantitatively assess the improvements that can be obtained when using mirrored samples instead of independent ones. To do so, we compare on the BBOB-2010 noiseless testbed the (1,4)-Covariance-Matrix-Adaptation Evolution Strategy (CMA-ES) with two variants: first, the (1,4_m)-CMA-ES, in which mirrored samples are used, and second, the (1,4^s_m)-CMA-ES, which in addition to the mirrored samples uses the concept of sequential selection [1]. Both variants are described in Sec. 2.

2. THE ALGORITHMS TESTED

The three algorithms tested are variants of the well-known CMA-ES [10, 9, 8], where at each iteration n, λ new solutions are generated by independently sampling λ random vectors N_i(0, C_n), i = 1, ..., λ, following a multivariate normal distribution with zero mean vector and covariance matrix C_n. The vectors are added to the current solution X_n to create the λ new solutions or offspring X_n^i = X_n + σ_n N_i(0, C_n), where σ_n is a strictly positive parameter called the step-size. In the simple (1,4)-CMA-ES, the number of offspring λ equals 4 and X_{n+1} is set to the best solution among X_n^1, ..., X_n^4, i.e., X_{n+1} = argmin{f(X_n^1), ..., f(X_n^4)}.
In the mirrored variant, denoted (1,4_m)-CMA-ES, the second and fourth offspring are replaced by the offspring symmetric to the first and third with respect to X_n, namely X_n^2 = X_n - σ_n N_1(0, C_n) and X_n^4 = X_n - σ_n N_3(0, C_n), where σ_n N_1(0, C_n) and σ_n N_3(0, C_n) are the random vectors added to X_n to obtain X_n^1 and X_n^3. The first/second and third/fourth offspring are thus negatively correlated. The update of X_{n+1} is then identical to the (1,4)-CMA-ES, namely X_{n+1} = argmin{f(X_n^1), ..., f(X_n^4)}. In the (1,4^s_m)-CMA-ES, sequential selection is implemented in addition. The four offspring solutions are generated with mirrored sampling, and the evaluations are carried out sequentially: after evaluating solution X_n^i, it is compared to X_n, and if f(X_n^i) ≤ f(X_n), the sequence of evaluations is stopped and X_{n+1} = X_n^i. In case all four offspring are worse than X_n, X_{n+1} = argmin{f(X_n^1), ..., f(X_n^4)} according to the comma selection. With sequential selection, the number of offspring evaluated is itself a random variable ranging from 1 to λ = 4, which allows the number of evaluated offspring to be reduced adaptively as long as improvements are easy to achieve [1]. A minimal code sketch of this sampling and selection scheme is given below.
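To make the scheme concrete, the following Python sketch implements one iteration of the sampling and selection variants described above. It is an illustration only, not the code used for the experiments: the covariance matrix C and the step-size sigma are held fixed here, whereas the CMA-ES additionally adapts both from the selected steps, and the function name es_step is ours.

```python
import numpy as np

def es_step(f, x, fx, sigma, C, lam=4, mirrored=True, sequential=False):
    """One iteration of the (1,4)-, (1,4_m)-, or (1,4^s_m)-selection scheme.

    Sketch only: C and sigma are kept fixed, while CMA-ES adapts both.
    """
    A = np.linalg.cholesky(C)          # A @ z ~ N(0, C) for z ~ N(0, I)
    ys, fs = [], []
    step = None
    for i in range(lam):
        if not mirrored or i % 2 == 0:  # offspring 1 and 3: fresh samples
            step = A @ np.random.randn(len(x))
        else:                           # offspring 2 and 4: mirrored samples
            step = -step
        y = x + sigma * step
        fy = f(y)
        if sequential and fy <= fx:     # sequential selection: stop as soon
            return y, fy                # as the parent is improved upon
        ys.append(y)
        fs.append(fy)
    best = int(np.argmin(fs))           # comma selection among all offspring
    return ys[best], fs[best]

# example: one step of the (1,4^s_m) scheme on the sphere function in 5-D
x = np.random.uniform(-4, 4, 5)
x_next, f_next = es_step(lambda z: float(z @ z), x, float(x @ x),
                         sigma=2.0, C=np.eye(5), sequential=True)
```

Note that with mirrored=False and sequential=False the same routine reduces to the baseline (1,4) comma selection.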

Covariance matrix and step-size are updated using the selected steps [9, 10].

Independent Restarts. Similar to [3], we independently restarted all algorithms as long as function evaluations were left, where maximally 10^4 D function evaluations were used.

Parameter setting. We used the default parameter and termination settings (cf. [10, 9, 8]) found in the source code on the WWW (cmaes.m, available at http://www.lri.fr/~hansen/cmaes_inmatlab.html), with two exceptions. First, we rectified the learning rate of the rank-one update of the covariance matrix for small values of λ, setting c_1 = min(2, λ/3) / ((D + 1.3)^2 + µ_eff). The original value was not designed to work for λ < 6. Second, we modified the damping parameter for the step-size to d_σ = 0.3 + µ_eff/λ + c_σ. The setting was found by performing experiments on the sphere function, f_1: d_σ was set as large as possible while still showing close to optimal performance, but at least as large such that decreasing it by a factor of two did not lead to unacceptable performance. For µ_eff/λ = 0.7 and µ_eff ≤ D + 2 the former setting of d_σ is recovered. For a smaller ratio µ_eff/λ, or for µ_eff > D + 2, the new setting allows larger (i.e., faster) changes of σ. Here, µ_eff = 1. For large λ, the new setting might be harmful in a noisy or too rugged landscape. Finally, the step-size multiplier was clamped from above at exp(1), while we do not believe this had any effect in the presented experiments.

Each initial solution X_0 was uniformly sampled in [-4, 4]^D and the step-size σ_0 was initialized to 2. The source code used for the experiments is available in the BBOB-2010 results collection. As the same parameter setting has been used for all test functions, the crafting effort CrE of all algorithms is 0.

3. CPU TIMING EXPERIMENTS

For the timing experiment, all three algorithms were run on f_8 with a maximum of 10^4 D function evaluations and restarted until at least 30 seconds had passed (according to [6]). The experiments were conducted on an 8-core Intel Xeon machine under Ubuntu Linux and MATLAB R2008a. The time per function evaluation was on the order of 10^-4 seconds for all three algorithms in dimensions 2, 3, 5, 10, 20, and 40. Note that MATLAB distributes the computations over all 8 cores only in the two largest dimensions.

4. RESULTS

In this section, experiments according to [6] on the benchmark functions given in [4, 7] are presented. The expected running time (ERT), used in the figures and tables, depends on a given target function value, f_t = f_opt + Δf, and is computed over all relevant trials as the number of function evaluations executed during each trial while the best function value did not reach f_t, summed over all trials and divided by the number of trials that actually reached f_t [6, 11]. Statistical significance is tested with the rank-sum test for a given target Δf_t using, for each trial, either the number of function evaluations needed to reach Δf_t (inverted and multiplied by -1), or, if the target was not reached, the best Δf-value achieved, measured only up to the smallest number of overall function evaluations for any unsuccessful trial under consideration.
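For illustration, the ERT for a single target can be computed from per-trial records as follows. This is a minimal sketch, not the official BBOB post-processing code; the helper name and argument layout are ours.

```python
def ert(hits, budgets):
    """ERT for one target: evaluations spent while the target was not yet
    reached, summed over all trials, divided by the number of successes.

    hits:    per trial, evaluations until the target was reached, or None
    budgets: per trial, total evaluations used (counts for failed trials)
    """
    total, succ = 0, 0
    for hit, budget in zip(hits, budgets):
        if hit is None:
            total += budget      # unsuccessful trial: its full budget counts
        else:
            total += hit         # successful trial: evaluations until the hit
            succ += 1
    return total / succ if succ else float('inf')

# 3 trials: targets reached after 120 and 200 evaluations, one failure
# within a budget of 1000 evaluations -> ERT = (120 + 200 + 1000) / 2
print(ert([120, 200, None], [500, 500, 1000]))   # 660.0
```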
4.1 Comparing (1,4_m)- With (1,4)-CMA-ES

The (1,4_m)-CMA-ES is compared with the (1,4)-CMA-ES in Figures 1 and 2 and in Table 1. Mirroring within the (1,4_m)-CMA-ES has a notable positive impact compared to the baseline (1,4)-CMA-ES. In 20-D, we observe a slight worsening only on the Gallagher functions f21 and f22, and these differences are not statistically significant.

On the other hand, the (1,4_m)-CMA-ES outperforms the (1,4)-CMA-ES on most of the functions, several of which show statistically significant differences in 20-D: the expected running time is lower on the sphere function (f1), on the separable ellipsoid (f2), and on the non-separable ellipsoid (f10), and on the attractive sector function (f6) the expected running times differ the most. These differences are less significant in 5-D (Table 1), but Fig. 1 shows similar improvements in the smaller dimensions as well.

4.2 Comparing (1,4^s_m)- With (1,4_m)-CMA-ES

Results comparing the (1,4_m)-CMA-ES and the (1,4^s_m)-CMA-ES are presented in Figures 3 and 4 and in Table 2. The results show that the (1,4^s_m)-CMA-ES is statistically significantly better than the (1,4_m)-CMA-ES on five functions in 20-D, with noticeably smaller expected running times. In contrast, there are only two functions on which the (1,4^s_m)-CMA-ES is worse than the (1,4_m)-CMA-ES, and these differences are not statistically significant. To conclude, the (1,4^s_m)-CMA-ES should be clearly preferred over the (1,4_m)-CMA-ES on the BBOB-2010 testbed and, according to Sec. 4.1, also over the (1,4)-CMA-ES. Worth mentioning is also the fact that the (1,4^s_m)-CMA-ES shows better performance than the best algorithm from the BBOB-2009 benchmarking on the attractive sector function (f6) in 20-D, and slightly better performance on the ellipsoid (f10); see [2].

4.3 Comparing (1,4^s_m)- With (1,4)-CMA-ES

Regarding this comparison, we refrain from showing the plots and tables due to space limitations. Not surprisingly in view of Sec. 4.2, the results show a statistically significant improvement for the (1,4^s_m)-CMA-ES over the (1,4)-CMA-ES on seven functions in 20-D. Only on one of the Gallagher functions can an increase of the expected running time be observed, which is, however, not statistically significant due to the low number of successful runs.

5. CONCLUSIONS

The idea behind derandomization by means of mirroring, introduced in [1], is to use only one random sample from a multivariate normal distribution to create two (negatively correlated or mirrored) offspring. Thereby, one offspring is generated by adding the random sample to the parent solution, and the second offspring equals the solution that is symmetric to the first one with respect to the parent (obtained by adding the negative sample to the parent). Here, this concept of mirroring has been integrated within two variants of a simple (1,4)-CMA-ES, of which the (1,4^s_m)-CMA-ES uses sequential selection [1] in addition and the (1,4_m)-CMA-ES does not, and both were compared on the noiseless BBOB-2010 testbed. The results show that using mirroring gives a clear improvement on a number of functions (statistically significant in 20-D). Combining mirroring with sequential selection is even better, showing further statistically significant improvements over the (1,4_m)-CMA-ES on several functions.

Acknowledgments

This work was supported by the French national research agency (ANR) within the SYSCOMM and COSINUS programs.

[Figure 1 shows one ERT scatter plot per benchmark function: 1 Sphere, 2 Ellipsoid separable, 3 Rastrigin separable, 4 Skew Rastrigin-Bueche, 5 Linear slope, 6 Attractive sector, 7 Step-ellipsoid, 8 Rosenbrock original, 9 Rosenbrock rotated, 10 Ellipsoid, 11 Discus, 12 Bent cigar, 13 Sharp ridge, 14 Sum of diff. powers, 15 Rastrigin, 16 Weierstrass, 17 Schaffer F7 cond. 10, 18 Schaffer F7 cond. 1000, 19 Griewank-Rosenbrock, 20 Schwefel x*sin(x), 21 Gallagher 101 peaks, 22 Gallagher 21 peaks, 23 Katsuuras, 24 Lunacek bi-Rastrigin.]

Figure 1: Expected running time (ERT in log10 of number of function evaluations) of (1,4_m)-CMA-ES versus (1,4)-CMA-ES for target values Δf ∈ [10^-8, 10^2] in each dimension for functions f1-f24. Markers on the upper or right edge indicate that the target value was never reached by (1,4_m)-CMA-ES or (1,4)-CMA-ES, respectively. Markers represent dimension: 2:+, 3:▽, 5:⋆, 10:◦, 20:□, 40:♢.

[Figure 2 residue: ECDF panels for the function groups "all functions" (f1-f24), "separable fcts" (f1-f5), "moderate fcts" (f6-f9), "ill-conditioned fcts" (f10-f14), "multi-modal fcts" (f15-f19), and "weak structure fcts" (f20-f24), each in 5-D and 20-D.]

Figure 2: Empirical cumulative distributions (ECDF) of run lengths and speed-up ratios in 5-D (left) and 20-D (right). Left sub-columns: ECDF of the number of function evaluations divided by dimension D (FEvals/D) to reach a target value f_opt + Δf with Δf = 10^k, where k ∈ {1, -1, -4, -8} is given by the first value in the legend, for (1,4_m)-CMA-ES (solid) and (1,4)-CMA-ES (dashed). Light beige lines show the ECDF of FEvals for target value Δf = 10^-8 of all algorithms benchmarked during BBOB-2009. Right sub-columns: ECDF of FEval ratios of (1,4_m)-CMA-ES divided by (1,4)-CMA-ES, all trial pairs for each function. Pairs where both trials failed are disregarded; pairs where one trial failed are visible in the limits being > 0 or < 1. The legends indicate the number of functions that were solved in at least one trial ((1,4_m)-CMA-ES first).
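For reference, the ECDF of run lengths shown in the left sub-columns of such a figure can be computed from raw data roughly as follows. This is a sketch under the assumption that only successful trials contribute a finite run length; the function and variable names are ours.

```python
import numpy as np

def ecdf_run_lengths(success_fevals, dim, n_trials, grid):
    """Fraction of trials that reached the target within x * dim
    evaluations, for each x in grid (FEvals/D on the horizontal axis).

    success_fevals: evaluation counts of the successful trials only;
    failed trials lower the curve simply by inflating n_trials.
    """
    sorted_fevals = np.sort(np.asarray(success_fevals, dtype=float))
    return np.searchsorted(sorted_fevals, np.asarray(grid) * dim,
                           side='right') / n_trials

# example: 15 trials in 10-D, 12 of which reached the target
rng = np.random.default_rng(1)
fevals = rng.integers(100, 20000, size=12)
grid = np.logspace(0, 4, 50)                 # FEvals/D from 1 to 10^4
curve = ecdf_run_lengths(fevals, dim=10, n_trials=15, grid=grid)
```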

[Figure 3 shows the same per-function scatter plot layout as Figure 1, with one panel for each of f1-f24.]

Figure 3: Expected running time (ERT in log10 of number of function evaluations) of (1,4^s_m)-CMA-ES versus (1,4_m)-CMA-ES for target values Δf ∈ [10^-8, 10^2] in each dimension for functions f1-f24. Markers on the upper or right edge indicate that the target value was never reached by (1,4^s_m)-CMA-ES or (1,4_m)-CMA-ES, respectively. Markers represent dimension: 2:+, 3:▽, 5:⋆, 10:◦, 20:□, 40:♢.

[Table 1 residue: the per-function numerical entries (ERT ratios for Δf ∈ {1e+1, 1e+0, 1e-1, 1e-3, 1e-5, 1e-7} and success counts, in 5-D and 20-D) are not reproduced here.]

Table 1: Expected running time (ERT in number of function evaluations) divided by the best ERT measured during BBOB-2009 (given in the respective first row) for the algorithms (1,4)-CMA-ES and (1,4_m)-CMA-ES, for different Δf values and for functions f1-f24. The median number of conducted function evaluations is additionally given in italics if ERT(10^-7) = ∞. #succ is the number of trials that reached the final target f_opt + 10^-8. Bold entries are statistically significantly better compared to the other algorithm, with p = 0.05 or p = 10^-k where k > 1 is the number following the ⋆ symbol, with Bonferroni correction of 48.
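The significance testing summarized in this caption can be sketched as follows, using SciPy's two-sided rank-sum test. This is a simplified illustration: the per-trial measure construction follows the description in Sec. 4, the Bonferroni factor matches the caption, and the helper name is ours.

```python
from scipy.stats import ranksums

def significant(measures_a, measures_b, n_comparisons=48, alpha=0.05):
    """Two-sided rank-sum test on per-trial measures with Bonferroni
    correction. Following Sec. 4, a successful trial contributes
    -1/evaluations (so fewer evaluations rank lower, i.e., better) and
    an unsuccessful trial contributes the best Delta f it achieved."""
    _, p = ranksums(measures_a, measures_b)
    return p < alpha / n_comparisons, p

# example: algorithm A succeeded in all 15 trials, B failed in two
a = [-1.0 / e for e in (120, 150, 180, 200, 210, 230, 260, 280,
                        300, 320, 350, 380, 400, 450, 500)]
b = [-1.0 / e for e in (400, 450, 500, 560, 600, 640, 700, 750,
                        800, 900, 1000, 1200, 1500)] + [1e-6, 1e-5]
print(significant(a, b))
```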

[Figure 4 residue: ECDF panels for the same function groups as in Figure 2, in 5-D and 20-D.]

Figure 4: Empirical cumulative distributions of run lengths of (1,4^s_m)-CMA-ES (solid) and (1,4_m)-CMA-ES (dashed), and of the speed-up ratios of the former divided by the latter, in 5-D (left) and 20-D (right), displayed as in Fig. 2.

[Table 2 residue: the per-function numerical entries are not reproduced here.]

Table 2: Expected running time (ERT in number of function evaluations) divided by the best ERT measured during BBOB-2009, as in Table 1, but now comparing the (1,4_m)-CMA-ES and the (1,4^s_m)-CMA-ES.
6. REFERENCES

[1] A. Auger, D. Brockhoff, and N. Hansen. Mirrored sampling and sequential selection for evolution strategies. Rapport de Recherche RR-7249, INRIA Saclay Île-de-France, April 2010.
[2] A. Auger, S. Finck, N. Hansen, and R. Ros. BBOB 2009: Comparison tables of all algorithms on all noiseless functions. Technical Report RT-0383, INRIA, April 2010.
[3] A. Auger and N. Hansen. Performance evaluation of an advanced local search evolutionary algorithm. In Proceedings of the IEEE Congress on Evolutionary Computation (CEC 2005), pages 1777-1784, 2005.
[4] S. Finck, N. Hansen, R. Ros, and A. Auger. Real-parameter black-box optimization benchmarking 2009: Presentation of the noiseless functions. Technical Report 2009/20, Research Center PPE, 2009. Updated February 2010.
[5] N. Hansen. Benchmarking a BI-population CMA-ES on the BBOB-2009 function testbed. In F. Rothlauf, editor, GECCO (Companion), pages 2389-2396. ACM, 2009.
[6] N. Hansen, A. Auger, S. Finck, and R. Ros. Real-parameter black-box optimization benchmarking 2010: Experimental setup. Technical Report RR-7215, INRIA, 2010.
[7] N. Hansen, S. Finck, R. Ros, and A. Auger. Real-parameter black-box optimization benchmarking 2009: Noiseless functions definitions. Technical Report RR-6829, INRIA, 2009. Updated February 2010.
[8] N. Hansen and S. Kern. Evaluating the CMA evolution strategy on multimodal test functions. In X. Yao et al., editors, Parallel Problem Solving from Nature (PPSN VIII), volume 3242 of LNCS, pages 282-291. Springer, 2004.
[9] N. Hansen, S. D. Müller, and P. Koumoutsakos. Reducing the time complexity of the derandomized evolution strategy with covariance matrix adaptation (CMA-ES). Evolutionary Computation, 11(1):1-18, 2003.
[10] N. Hansen and A. Ostermeier. Completely derandomized self-adaptation in evolution strategies. Evolutionary Computation, 9(2):159-195, 2001.
[11] K. Price. Differential evolution vs. the functions of the second ICEO. In Proceedings of the IEEE International Congress on Evolutionary Computation, pages 153-157, 1997.
