Investigating the Impact of Adaptation Sampling in Natural Evolution Strategies on Black-box Optimization Testbeds


Tom Schaul
Courant Institute of Mathematical Sciences, New York University
Broadway, New York, USA
schaul@cims.nyu.edu

ABSTRACT
Natural Evolution Strategies (NES) are a recent member of the class of real-valued optimization algorithms that are based on adapting search distributions. Exponential NES (xNES) is the most common instantiation of NES, and particularly appropriate for the BBOB benchmarks, given that many of them are non-separable and that the problem dimensions are relatively small. The technique of adaptation sampling, which adapts learning rates online, further improves the algorithm's performance. This report provides an extensive empirical comparison to study the impact of adaptation sampling in xNES, both on the noise-free and noisy BBOB testbeds.

Categories and Subject Descriptors
G.1.6 [Numerical Analysis]: Optimization -- global optimization, unconstrained optimization; F.2.1 [Analysis of Algorithms and Problem Complexity]: Numerical Algorithms and Problems

General Terms
Algorithms

Keywords
Evolution Strategies, Natural Gradient, Benchmarking

1. INTRODUCTION
Evolution strategies (ES), in contrast to traditional evolutionary algorithms, aim at repeating the type of mutation that led to good individuals. Such mutations can be characterized by an explicitly parameterized search distribution from which new candidate samples are drawn, akin to estimation of distribution algorithms (EDA). Covariance matrix adaptation ES (CMA-ES []) innovated the field by introducing a parameterization that includes the full covariance matrix, allowing them to solve highly non-separable problems.

A more recent variant, natural evolution strategies (NES [,,,]), aims at a higher level of generality, providing a procedure to update the search distribution's parameters for any type of distribution, by ascending the gradient towards higher expected fitness. Further, it has been shown [,] that following the natural gradient to adapt the search distribution is highly beneficial, because it appropriately normalizes the update step with respect to its uncertainty and makes the algorithm scale-invariant.

Exponential NES (xNES), the most common instantiation of NES, uses a search distribution parameterized by a mean vector and a full covariance matrix, and is thus most similar to CMA-ES (in fact, the precise relation is described in [] and []). Given the relatively small problem dimensions of the BBOB benchmarks, and the fact that many of them are non-separable, it is also among the most appropriate NES variants for the task.

Adaptation sampling (AS) is a technique for the online adaptation of the learning rate, designed to speed up convergence. It may benefit algorithms like xNES because the optimization traverses qualitatively different phases, during which different learning rates may be optimal.
In this report, we retain the original formulation of xNES (including all parameter settings, except for an added stopping criterion), and compare this baseline algorithm with an AS-augmented variant. We describe the empirical performance on all benchmark functions (both noise-free and noisy) of the BBOB workshop.

2. NATURAL EVOLUTION STRATEGIES
Natural evolution strategies (NES) maintain a search distribution π and adapt the distribution parameters θ by following the natural gradient [] of expected fitness J, that is, maximizing

    J(θ) = E_θ[f(z)] = ∫ f(z) π(z | θ) dz

Just like their close relative CMA-ES [], NES algorithms are invariant under monotone transformations of the fitness function and linear transformations of the search space. Each iteration the algorithm produces n samples z_i ~ π(z | θ), i ∈ {1, ..., n}, i.i.d. from its search distribution, which is parameterized by θ. The gradient w.r.t. the parameters θ can be rewritten (see []) as

    ∇_θ J(θ) = ∇_θ ∫ f(z) π(z | θ) dz = E_θ[f(z) ∇_θ log π(z | θ)]

from which we obtain the Monte Carlo estimate

    ∇_θ J(θ) ≈ (1/n) Σ_{i=1}^{n} f(z_i) ∇_θ log π(z_i | θ)

of the search gradient. The key step then consists in replacing this gradient by the natural gradient, defined as F⁻¹ ∇_θ J(θ), where

    F = E[∇_θ log π(z | θ) ∇_θ log π(z | θ)ᵀ]

is the Fisher information matrix. The search distribution is iteratively updated using natural gradient ascent

    θ ← θ + η F⁻¹ ∇_θ J(θ)

with learning rate parameter η.
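To make the estimator concrete, here is a minimal NumPy sketch (not from the paper; names and constants are illustrative) of natural gradient ascent for a toy search distribution, an isotropic Gaussian N(µ, σ²I) with fixed σ. For this family the Fisher matrix is I/σ², so the natural gradient is simply the plain search gradient rescaled by σ².

    import numpy as np

    rng = np.random.default_rng(0)

    def nes_step(f, mu, sigma=0.3, n=50, eta=1.0):
        """One natural-gradient ascent step on J(mu) = E[f(z)], z ~ N(mu, sigma^2 I)."""
        s = rng.standard_normal((n, mu.size))
        z = mu + sigma * s                    # samples from pi(z | mu)
        fz = np.array([f(zi) for zi in z])
        fz -= fz.mean()                       # baseline: reduces variance, keeps estimate unbiased
        score = s / sigma                     # grad_mu log pi(z | mu) = (z - mu) / sigma^2
        grad = fz @ score / n                 # Monte Carlo search gradient
        return mu + eta * sigma**2 * grad     # F^{-1} = sigma^2 I for this family

    # toy usage: maximize f(z) = -||z||^2, optimum at the origin
    mu = np.ones(5)
    for _ in range(200):
        mu = nes_step(lambda z: -z @ z, mu)
    print(np.round(mu, 2))                    # mu has contracted towards the origin

Note that the actual xNES replaces raw fitness values by rank-based utilities u_k (see Table 1 below), which is what makes the update invariant to monotone transformations of the fitness function.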

2.1 Exponential NES
While the NES formulation is applicable to arbitrary parameterizable search distributions [,], the most common variant employs multinormal search distributions. For that case, two helpful techniques were introduced in [], namely an exponential parameterization of the covariance matrix, which guarantees positive-definiteness, and a novel method for changing the coordinate system into a natural one, which makes the algorithm computationally efficient. The resulting algorithm, NES with a multivariate Gaussian search distribution and using both these techniques, is called xNES, and its pseudocode is given in Algorithm 1.

Algorithm 1: Exponential NES (xNES)
  input: f, µ_init, η_σ, η_B, u_k
  initialize µ ← µ_init, σ ← 1, B ← I
  repeat
      for k = 1 ... n do
          draw sample s_k ~ N(0, I)
          z_k ← µ + σ B s_k
          evaluate the fitness f(z_k)
      end
      sort {(s_k, z_k)} with respect to f(z_k) and assign utilities u_k to each sample
      compute gradients:
          ∇_δJ ← Σ_{k=1}^{n} u_k · s_k
          ∇_MJ ← Σ_{k=1}^{n} u_k · (s_k s_kᵀ − I)
          ∇_σJ ← tr(∇_MJ) / d
          ∇_BJ ← ∇_MJ − ∇_σJ · I
      update parameters:
          µ ← µ + σ B · ∇_δJ
          σ ← σ · exp(η_σ/2 · ∇_σJ)
          B ← B · exp(η_B/2 · ∇_BJ)
  until stopping criterion is met

2.2 Adaptation Sampling
First introduced in [], adaptation sampling is a new meta-learning technique [] that can adapt hyper-parameters online, in an economical way that is grounded in a measure of statistical improvement. Here, we apply it to the learning rate of the global step-size, η_σ.

The idea is to consider whether a larger learning rate η′_σ > η_σ would have been more likely to generate the good samples in the current batch. For this we determine the (hypothetical) search distribution that would have resulted from such a larger update, π(· | θ′). Then we compute importance weights

    w_k = π(z_k | θ′) / π(z_k | θ)

for each of the n samples z_k in our current population, which was generated from the actual search distribution π(· | θ). We then conduct a weighted Mann-Whitney test [] (appendix A) to determine whether the set {rank(z_k)} is inferior to its reweighted counterpart {w_k · rank(z_k)} (corresponding to the larger learning rate), with statistical significance ρ. If so, we increase the learning rate by a factor of 1 + c′, up to at most η_σ = 1 (where c′ = 0.1). Otherwise it decays towards its initial value:

    η_σ ← (1 − c′) · η_σ + c′ · η_σ,init

The procedure is summarized in Algorithm 2 (for details and derivations, see []). The combination of xNES with adaptation sampling is dubbed xNES-as. One interpretation of why adaptation sampling is helpful is that half-way into the search (after a local attractor has been found, e.g., towards the end of the valley on the Rosenbrock benchmarks f8 and f9), the convergence speed can be boosted by an increased learning rate. For such situations, an online adaptation of hyper-parameters is inherently well-suited.

Algorithm 2: Adaptation sampling
  input: η_σ,t, η_σ,init, θ_t, θ_{t−1}, {(z_k, f(z_k))}, c′, ρ
  output: η_σ,t+1
  compute the hypothetical θ′, given θ_{t−1} and using the larger learning rate η′_σ
  for k = 1 ... n do
      w_k ← π(z_k | θ′) / π(z_k | θ)
  end
  S ← {rank(z_k)}
  S′ ← {w_k · rank(z_k)}
  if weighted-Mann-Whitney(S, S′) < ρ then
      return (1 − c′) · η_σ + c′ · η_σ,init
  else
      return min((1 + c′) · η_σ, 1)
  end
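The bookkeeping of Algorithm 2 is compact in code. The sketch below is illustrative, not the paper's implementation: the function name, the default ρ, and the data layout are assumptions, and the weighted Mann-Whitney test derived in the cited thesis appendix is passed in as a stand-in callable.

    import numpy as np
    from scipy.stats import multivariate_normal as mvn

    def adaptation_sampling(eta, eta_init, z, fz, theta, theta_prime,
                            weighted_mann_whitney, c_prime=0.1, rho=0.3):
        """Sketch of Algorithm 2: returns the adapted learning rate eta_sigma.

        theta, theta_prime: (mean, cov) of the actual and of the hypothetical
        (larger-learning-rate) search distributions; z, fz: samples and fitness.
        """
        # importance weights w_k = pi(z_k | theta') / pi(z_k | theta)
        w = mvn.pdf(z, *theta_prime) / mvn.pdf(z, *theta)
        ranks = np.argsort(np.argsort(fz)) + 1     # rank(z_k) within the batch
        S, S_w = ranks, w * ranks                  # original vs. reweighted ranks
        if weighted_mann_whitney(S, S_w) < rho:    # branch structure as in Algorithm 2
            return (1 - c_prime) * eta + c_prime * eta_init   # decay towards init
        return min((1 + c_prime) * eta, 1.0)       # otherwise increase, capped at 1

Here θ′ is the distribution that the previous iteration would have produced under the larger learning rate η′_σ, so the comparison directly measures whether the bolder update would have ranked the observed samples better.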
3. EXPERIMENTAL SETTINGS
We use identical default hyper-parameter values for all benchmarks (both noisy and noise-free functions), taken from [,]; Table 1 summarizes all the hyper-parameters used. In addition, we make use of the provided target fitness f_opt to trigger independent algorithm restarts, using a simple ad-hoc procedure: a restart is triggered if the log-progress during the recent past is too small, i.e., if the decrease of log10 |f_opt − f_t| over a window of evaluations proportional to d falls below a threshold of the form (r + 1) · m / [log10 |f_opt − f_t| + c], where m is the remaining budget of evaluations divided by d, f_t is the best fitness encountered until evaluation t, r is the number of restarts so far, and c is a small constant (see the sketch below). The total budget grows as d^(3/2) evaluations. (It turns out that this use of f_opt is technically not permitted by the BBOB guidelines, so strictly speaking a different restart strategy should be employed, for example the one described in [].)

Implementations of this and other NES algorithm variants are available in Python through the PyBrain machine learning library [], as well as in other languages at www.idsia.ch/~tom/nes.html.
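For concreteness, the following sketch shows the shape of such a restart trigger. The window length and threshold constants are assumptions for illustration, not the exact values used in the experiments.

    import math

    def should_restart(best_so_far, f_opt, evals_left, d, r,
                       window_per_d=100, scale=1e-2):
        """Hypothetical log-progress restart test (all constants assumed).

        best_so_far: best fitness after each evaluation; r: restarts so far.
        """
        window = window_per_d * d
        if len(best_so_far) <= window:
            return False
        m = evals_left / d                       # remaining budget in units of d
        old = abs(f_opt - best_so_far[-window - 1])
        new = abs(f_opt - best_so_far[-1])
        if new == 0:                             # target reached: no restart needed
            return False
        progress = math.log10(old) - math.log10(new)
        return progress < scale * (r + 1) * m    # too slow for the remaining budget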

Table 1: Default parameter values for xNES (including the utility function and adaptation sampling) as a function of problem dimension d.

    parameter     default value
    n             4 + ⌊3 log(d)⌋
    η_σ = η_B     (3/5) · (3 + log(d)) / (d · √d)
    u_k           max(0, log(n/2 + 1) − log(k)) / Σ_{j=1}^{n} max(0, log(n/2 + 1) − log(j)) − 1/n
    ρ             1/2 − 1/(3(d + 1))
    c′            0.1

4. RESULTS
Results from experiments according to [] on the benchmark functions given in [,,,] are presented in Figures 1, 2 and 3, and in Tables 2, 3 and 4. The expected running time (ERT), used in the figures and tables, depends on a given target function value, f_t = f_opt + Δf. It is computed over all relevant trials as the number of function evaluations executed during each trial while the best function value did not reach f_t, summed over all trials and divided by the number of trials that actually reached f_t [,]. Statistical significance is tested with the rank-sum test for a given target Δf_t using, for each trial, either the number of function evaluations needed to reach Δf_t (inverted and multiplied by −1), or, if the target was not reached, the best Δf-value achieved, measured only up to the smallest number of overall function evaluations for any unsuccessful trial under consideration. Some of the result plots (such as performance scaling with dimension), as well as the CPU-timing results, were omitted here, but they are available in the stand-alone benchmarking reports [,].
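As a self-contained illustration of the ERT definition just given (the trial data layout is assumed: one best-so-far value per function evaluation), ERT for a target f_t can be computed as:

    def expected_running_time(trials, f_target):
        """ERT: evaluations summed while the target was not yet reached,
        divided by the number of trials that did reach it."""
        total_evals, successes = 0, 0
        for best_so_far in trials:                 # one list per trial
            hit = next((i for i, v in enumerate(best_so_far) if v <= f_target), None)
            if hit is None:
                total_evals += len(best_so_far)    # unsuccessful trial: count full run
            else:
                total_evals += hit + 1             # evaluations until first success
                successes += 1
        return total_evals / successes if successes else float("inf")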
5. DISCUSSION
Naturally, the core algorithm (xNES) being identical in both variants presented here, we do not expect drastically different results. In fact, the direct comparisons in Figures 1 and 2 show remarkably similar results on the majority of functions, which we see from the points clearly scattered along the diagonal. This indicates that on most benchmarks, adaptation sampling is not used (much), and xNES-as operates with the same conservative learning rates as xNES.

The functions where we observe differences are telling, however. Most notably, on the sphere function (f1, and its noisy siblings), adaptation sampling drastically improves performance. Adaptation sampling still improves performance significantly on many other functions (see Table 2), but the speed-ups are less striking. There is a single function where adaptation sampling significantly hurts performance, namely the sharp ridge (f13) in one of the tested dimensions: apparently the learning rates are increased to an overly aggressive level.

In conclusion, we can recommend using adaptation sampling for xNES in general, and particularly so if the fitness landscapes are smooth.

Acknowledgements
The author wants to thank the organizers of the BBOB workshop for providing such a well-designed benchmark setup, and especially such high-quality post-processing utilities. This work was funded in part through an AFR postdoc grant of the National Research Fund Luxembourg.

[Figure 1: Expected running time (ERT, in log10 of the number of function evaluations) of xNES (x-axis) versus xNES-as (y-axis) for a range of target values in each dimension, on the 24 noiseless functions: Sphere, Ellipsoid separable, Rastrigin separable, Skew Rastrigin-Bueche separable, Linear slope, Attractive sector, Step-ellipsoid, Rosenbrock original, Rosenbrock rotated, Ellipsoid, Discus, Bent cigar, Sharp ridge, Sum of different powers, Rastrigin, Weierstrass, Schaffer F7 (two condition numbers), Griewank-Rosenbrock F8F2, Schwefel x·sin(x), Gallagher 101 peaks, Gallagher 21 peaks, Katsuuras, Lunacek bi-Rastrigin. Markers on the upper or right edge indicate that the target value was never reached; marker symbols encode the problem dimension.]

[Figure 2: Expected running time (ERT, in log10 of the number of function evaluations) of xNES (x-axis) versus xNES-as (y-axis) for a range of target values in each dimension, on the 30 noisy functions: Sphere, Rosenbrock, Step-ellipsoid, Ellipsoid, Sum of different powers, Schaffer F7, Griewank-Rosenbrock and Gallagher variants, each with Gaussian, uniform and Cauchy noise (including the moderate-noise variants). Markers on the upper or right edge indicate that the target value was never reached; marker symbols encode the problem dimension.]

[Table 2: ERT in number of function evaluations divided by the best ERT measured during BBOB 2009 (given in the respective first row), with the central range divided by two in brackets, for different Δf values, on the noiseless functions in 5-D and 20-D. #succ is the number of trials that reached the final target f_opt + 10⁻⁸. "1:xnes" is xNES and "2:+as" is xNES-as. Bold entries are statistically significantly better compared to the other algorithm, with p = 0.05 or p = 10⁻ᵏ where k is the number following the significance symbol, with Bonferroni correction; a separate symbol indicates the same test against the best algorithm of BBOB 2009.]

[Figure 3: Empirical cumulative distributions (ECDF) of run lengths and of speed-up ratios in 5-D (left) and 20-D (right), for the noiseless (top) and noisy (bottom) functions. Left sub-columns: ECDF of the number of function evaluations divided by dimension D (FEvals/D) needed to reach a target value f_opt + Δf, with Δf = 10^k where k is given by the first value in the legend, for xNES and xNES-as; light beige lines show the ECDF of FEvals for the final target value of all algorithms benchmarked during BBOB 2009. Right sub-columns: ECDF of FEval ratios of xNES divided by xNES-as, over all trial pairs for each function; pairs where both trials failed are disregarded, and pairs where one trial failed are visible in the limits. The legends indicate the number of functions that were solved in at least one trial (xNES first).]

[Table 3: Relative ERT in number of f-evaluations on the first part of the noisy functions in 5-D and 20-D; see Table 2 for details.]

[Table 4: Relative ERT in number of f-evaluations on the remaining noisy functions in 5-D and 20-D; see Table 2 for details.]

6. REFERENCES
[] S. I. Amari. Natural gradient works efficiently in learning. Neural Computation, 1998.
[] S. Finck, N. Hansen, R. Ros, and A. Auger. Real-parameter black-box optimization benchmarking 2009: Presentation of the noiseless functions. Technical Report 2009/20, Research Center PPE, 2009. Updated February 2010.
[] S. Finck, N. Hansen, R. Ros, and A. Auger. Real-parameter black-box optimization benchmarking 2009: Presentation of the noisy functions. Technical Report 2009/21, Research Center PPE, 2009.
[] N. Fukushima, Y. Nagata, S. Kobayashi, and I. Ono. Proposal of distance-weighted exponential natural evolution strategies. In IEEE Congress on Evolutionary Computation. IEEE, 2011.
[] T. Glasmachers, T. Schaul, and J. Schmidhuber. A natural evolution strategy for multi-objective optimization. In Parallel Problem Solving from Nature (PPSN), 2010.
[] T. Glasmachers, T. Schaul, Y. Sun, D. Wierstra, and J. Schmidhuber. Exponential natural evolution strategies. In Genetic and Evolutionary Computation Conference (GECCO), Portland, OR, 2010.
[] N. Hansen, A. Auger, S. Finck, and R. Ros. Real-parameter black-box optimization benchmarking 2012: Experimental setup. Technical report, INRIA, 2012.
[] N. Hansen, S. Finck, R. Ros, and A. Auger. Real-parameter black-box optimization benchmarking 2009: Noiseless functions definitions. Technical Report RR-6829, INRIA, 2009. Updated February 2010.
[] N. Hansen, S. Finck, R. Ros, and A. Auger. Real-parameter black-box optimization benchmarking 2009: Noisy functions definitions. Technical Report RR-6869, INRIA, 2009. Updated February 2010.
[] N. Hansen and A. Ostermeier. Completely derandomized self-adaptation in evolution strategies. Evolutionary Computation, 2001.
[] K. Price. Differential evolution vs. the functions of the second ICEO. In Proceedings of the IEEE International Congress on Evolutionary Computation, 1997.
[] T. Schaul. Studies in Continuous Black-box Optimization. Ph.D. thesis, Technische Universität München, 2011.
[] T. Schaul. Benchmarking exponential natural evolution strategies on the noiseless and noisy black-box optimization testbeds. In Black-box Optimization Benchmarking Workshop, Genetic and Evolutionary Computation Conference, Philadelphia, PA, 2012.
[] T. Schaul. Benchmarking natural evolution strategies with adaptation sampling on the noiseless and noisy black-box optimization testbeds. In Black-box Optimization Benchmarking Workshop, Genetic and Evolutionary Computation Conference, Philadelphia, PA, 2012.
[] T. Schaul. Natural evolution strategies converge on sphere functions. In Genetic and Evolutionary Computation Conference (GECCO), Philadelphia, PA, 2012.
[] T. Schaul, J. Bayer, D. Wierstra, Y. Sun, M. Felder, F. Sehnke, T. Rückstieß, and J. Schmidhuber. PyBrain. Journal of Machine Learning Research, 2010.
[] T. Schaul and J. Schmidhuber. Metalearning. Scholarpedia, 2010.
[] Y. Sun, D. Wierstra, T. Schaul, and J. Schmidhuber. Stochastic search using the natural gradient. In International Conference on Machine Learning (ICML), 2009.
[] D. Wierstra, T. Schaul, T. Glasmachers, Y. Sun, and J. Schmidhuber. Natural Evolution Strategies. Technical report, 2011.
[] D. Wierstra, T. Schaul, J. Peters, and J. Schmidhuber. Natural evolution strategies. In Proceedings of the IEEE Congress on Evolutionary Computation (CEC), Hong Kong, China, 2008.
