Covariance Matrix Adaptation in Multiobjective Optimization

Covariance Matrix Adaptation in Multiobjective Optimization. Dimo Brockhoff, INRIA Lille - Nord Europe. October 30, 2014, PGMO-COPI 2014, Ecole Polytechnique, France.

Slide 2: Scenario: Multiobjective Optimization. Most problems are multiobjective in nature... [figure: Pareto front in a cost vs. power consumption trade-off]. Set-based optimization view [Zitzler et al. 2010]: we are interested in finding a set of solutions (a Pareto front approximation); a human decision maker can learn about the problem; mathematically, a unary quality indicator I transforms the problem into a single-objective set problem. Randomized, set-based algorithms are well suited for difficult (black-box) multiobjective problems: the field of Evolutionary Multiobjective Optimization (EMO). Goal of my talk: introduce the idea behind one algorithm (class): the MO-CMA-ES.
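The unary quality indicator I on this slide is, for the MO-CMA-ES, typically the hypervolume indicator. Purely as an illustration (not code from the talk), here is a minimal Python/NumPy sketch of the 2-D hypervolume of a point set under minimization; the function name, signature, and the reference point in the example are mine.

```python
import numpy as np

def hypervolume_2d(points, ref):
    """Area dominated by `points` w.r.t. reference point `ref` (both objectives minimized)."""
    P = np.asarray(points, dtype=float)
    P = P[np.all(P < ref, axis=1)]         # keep only points strictly better than ref
    if len(P) == 0:
        return 0.0
    P = P[np.argsort(P[:, 0])]             # sweep in increasing first objective
    hv, best_f2 = 0.0, ref[1]
    for f1, f2 in P:
        if f2 < best_f2:                   # point is non-dominated among those already swept
            hv += (ref[0] - f1) * (best_f2 - f2)
            best_f2 = f2
    return hv

# tiny usage example with a made-up front and reference point
front = np.array([[1.0, 4.0], [2.0, 2.0], [4.0, 1.0]])
print(hypervolume_2d(front, ref=np.array([5.0, 5.0])))   # -> 11.0
```

A set's hypervolume is a single number, which is what turns the search for a Pareto front approximation into a single-objective set problem.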

Slide 3: The Multiobjective CMA-ES. CMA-ES [remember the talk of Nikolaus Hansen yesterday]: the Covariance Matrix Adaptation Evolution Strategy [Hansen and Ostermeier 1996, 2001], the state-of-the-art numerical black-box optimizer for large budgets and difficult functions [Hansen et al. 2010]. CMA-ES for multiobjective optimization: THE MO-CMA-ES does not exist; there are the original one of [Igel et al. 2007] in ECJ, an improved success definition [Voß et al. 2010] at GECCO 2010, and recombination between solutions [Voß et al. 2009] at EMO 2009. All are based on a combination of µ single (1+1)-CMA-ES instances.

Slide 4: (1+λ)-CMA-ES. [slide content not transcribed]

Slide 5: (1+λ)-CMA-ES: Updates. [update equations not transcribed]
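The update equations on this slide were not transcribed. As a rough orientation only, the following sketch shows a success-based step-size rule of the kind used in each (1+1)-CMA-ES building block (in the spirit of [Igel et al. 2007]); the constants and names are illustrative assumptions, not values from the talk.

```python
import numpy as np

def step_size_update(sigma, p_succ_bar, success,
                     p_target=2/11, c_p=1/12, d=2.0):
    """One success-based (1+1)-ES step-size update; all constants are illustrative."""
    # exponentially smooth the 0/1 success indicator into an average success rate
    p_succ_bar = (1 - c_p) * p_succ_bar + c_p * float(success)
    # grow sigma when succeeding more often than the target rate, shrink it otherwise
    sigma *= np.exp((p_succ_bar - p_target) / (d * (1 - p_target)))
    return sigma, p_succ_bar
```

In the MO-CMA-ES this kind of update is applied to the step sizes of both parent and offspring, while the covariance matrix is updated only for the offspring (see the baseline-algorithm slide below).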

Slide 6: MO-CMA-ES: Basic Idea. Illustration in objective space; the intuition is not accurate! [figure copyright by Ilya Loshchilov]

Slides 7-11: Concrete MO-CMA-ES Baseline Algorithm. [animation in objective space showing µ parallel (1+1)-CMA-ES instances; axes: Objective 1, Objective 2]

Slide 12: MO-CMA-ES Baseline Algorithm. µ x (1+1)-CMA-ES with hypervolume-based selection; the CMA strategy parameters are updated based on different success notions. Update of parameters: the step size of both parent and offspring is updated based on success; the covariance matrix is updated only for the offspring. Success definitions: original success [Igel et al. 2007]: the offspring dominates its parent; improved success [Voß et al. 2010]: the offspring is selected into the new population.
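To make the baseline loop concrete, here is a heavily simplified, runnable sketch of one generation: every parent produces one offspring, success is recorded according to one of the two notions above, and µ individuals survive via non-dominated sorting with ties broken by 2-D hypervolume contribution. This is my illustration under simplifying assumptions (two minimized objectives, an isotropic per-individual step size instead of a full covariance matrix, one-shot instead of iterative hypervolume-based truncation); all names are mine.

```python
import numpy as np

def dominates(a, b):
    """Pareto dominance for minimization."""
    return np.all(a <= b) and np.any(a < b)

def hv_contribution_2d(F, ref):
    """Hypervolume contribution of each point of a mutually non-dominated 2-D set.
    Assumes `ref` is worse than every point in both objectives."""
    order = np.argsort(F[:, 0])
    S = F[order]
    contrib = np.empty(len(S))
    for i in range(len(S)):
        right = S[i + 1, 0] if i + 1 < len(S) else ref[0]
        upper = S[i - 1, 1] if i > 0 else ref[1]
        contrib[i] = (right - S[i, 0]) * (upper - S[i, 1])
    out = np.empty(len(S))
    out[order] = contrib
    return out

def select_mu(F, mu, ref):
    """Indices of mu survivors: non-dominated sorting, then hypervolume contribution."""
    remaining, selected = list(range(len(F))), []
    while len(selected) < mu:
        front = [i for i in remaining
                 if not any(dominates(F[j], F[i]) for j in remaining if j != i)]
        if len(selected) + len(front) <= mu:
            selected += front
        else:                                  # critical front: keep the largest contributors
            k = mu - len(selected)
            contrib = hv_contribution_2d(F[front], ref)
            selected += [front[i] for i in np.argsort(-contrib)[:k]]
        remaining = [i for i in remaining if i not in front]
    return selected

def one_generation(X, sigmas, f, ref, improved_success=True, rng=None):
    rng = rng or np.random.default_rng()
    mu, n = X.shape
    # each parent does one (1+1) sampling step (isotropic here; the real algorithm
    # samples with an individually adapted covariance matrix)
    Y = X + sigmas[:, None] * rng.standard_normal((mu, n))
    P = np.vstack([X, Y])                      # parents first, then their offspring
    F = np.array([f(x) for x in P])
    survivors = select_mu(F, mu, ref)
    if improved_success:                       # GECCO 2010 notion: offspring was selected
        success = np.array([mu + i in survivors for i in range(mu)])
    else:                                      # original notion: offspring dominates parent
        success = np.array([dominates(F[mu + i], F[i]) for i in range(mu)])
    # the step-size (and, in the full algorithm, covariance-matrix) updates driven
    # by `success` are omitted here; see the step-size sketch above
    return P[survivors], success
```

The only difference between the two algorithm variants in this sketch is the flag improved_success, which mirrors the two success definitions listed on the slide.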

Slide 13: Available Implementations. Baseline implementation in the Shark machine learning library (http://image.diku.dk/shark/), C++. Now also available in MATLAB: easy prototyping of new ideas, visualization of the algorithm's state variables (similar to CMA-ES).

Slide 14: MO-CMA-ES (GECCO 2010 version): Output. [plot; annotation: the original MO-CMA-ES is worse]

Slide 15: Comparison with Other Well-Known Algorithms. Exemplary problem: f10 (Ellipsoid) + f21 (Gallagher 101 peaks). Performance measure: hypervolume difference of all non-dominated solutions found to a reference set. Shark version.
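The performance measure used from here on compares the hypervolume of a reference set with the hypervolume of all non-dominated solutions an algorithm has found so far. A small self-contained sketch of the archive-filtering step (my code, minimized objectives assumed) is:

```python
import numpy as np

def nondominated_filter(F):
    """Return the rows of F (one objective vector per row) that no other row dominates."""
    F = np.asarray(F, dtype=float)
    keep = [i for i, fi in enumerate(F)
            if not any(np.all(fj <= fi) and np.any(fj < fi)
                       for j, fj in enumerate(F) if j != i)]
    return F[keep]

# schematically, the reported measure is then
#   hv(reference_set) - hv(nondominated_filter(all_solutions_found))
# for a hypervolume routine hv such as the 2-D sketch shown after the scenario slide.
```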

Slide 16: More Problems. [plots not transcribed]

Slide 17: Aggregation Over All BBOB Problems (300 in total). [plots for dimensions 5-D and 20-D]

Slide 18: Aggregation Over Function Groups. The only groups where the MO-CMA-ES is always outperformed contain separable functions.

Slide 19: Other MO-CMA-ES Variants. Strategy parameter recombination in the MO-CMA-ES (EMO 2009): the learning of the probability distribution is also based on neighbors, where a neighbor's influence is weighted by Mahalanobis distance; the performance difference to the original MO-CMA-ES is less pronounced than that between the original and the improved success criterion. Towards integrating comma-strategies: the (1+1)-CMA-ES is not optimal for noisy problems; the problem with integrating the more robust (µ/µ,λ)-CMA-ES is how to rank with respect to the parent. First ideas exist, but the algorithm is still work in progress.
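As a purely illustrative sketch of the neighbor weighting mentioned for the EMO 2009 recombination variant: the snippet below measures the Mahalanobis distance of neighbors in a parent's own metric sigma^2 * C and turns it into normalized weights with a Gaussian kernel. The kernel choice and all names are my assumptions; the exact scheme of [Voß et al. 2009] may differ.

```python
import numpy as np

def mahalanobis_weights(x, neighbors, C, sigma):
    """Influence weights for `neighbors` of `x`, decreasing with Mahalanobis distance
    under the parent's sampling covariance sigma**2 * C."""
    Cinv = np.linalg.inv(C) / sigma**2            # inverse of sigma^2 * C
    d2 = np.array([(y - x) @ Cinv @ (y - x) for y in neighbors])
    w = np.exp(-0.5 * d2)                         # Gaussian-kernel weighting (illustrative)
    return w / w.sum()
```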

Slide 20: Conclusions. The MO-CMA-ES is a multiobjective extension of the prominent single-objective CMA-ES; several variants exist; Shark (C++) and MATLAB implementations are available; it shows superiority for large(r) budgets on non-separable biobjective functions with respect to the archive of non-dominated solutions found; it is therefore probably a good alternative to NSGA-II if you have difficult multiobjective problems to solve in practice. Questions?

Slide 21: Publications.
[Hansen and Ostermeier 1996] N. Hansen and A. Ostermeier. Adapting Arbitrary Normal Mutation Distributions in Evolution Strategies: The Covariance Matrix Adaptation. In Congress on Evolutionary Computation (CEC 1996), pages 312–317. IEEE, 1996.
[Hansen and Ostermeier 2001] N. Hansen and A. Ostermeier. Completely Derandomized Self-Adaptation in Evolution Strategies. Evolutionary Computation, 9(2):159–195, 2001.
[Hansen et al. 2010] N. Hansen, A. Auger, R. Ros, S. Finck, and P. Posik. Comparing Results of 31 Algorithms from the Black-Box Optimization Benchmarking BBOB-2009. In Genetic and Evolutionary Computation Conference (GECCO 2010), pages 1689–1696. ACM, 2010.
[Igel et al. 2007] C. Igel, N. Hansen, and S. Roth. Covariance Matrix Adaptation for Multi-objective Optimization. Evolutionary Computation, 15(1):1–28, 2007.
[Voß et al. 2009] T. Voß, N. Hansen, and C. Igel. Recombination for Learning Strategy Parameters in the MO-CMA-ES. In Evolutionary Multi-Criterion Optimization (EMO 2009), pages 155–168. Springer, 2009.
[Voß et al. 2010] T. Voß, N. Hansen, and C. Igel. Improved Step Size Adaptation for the MO-CMA-ES. In J. Branke et al., editors, Genetic and Evolutionary Computation Conference (GECCO 2010), pages 487–494. ACM, 2010.