Benchmarking Projection-Based Real Coded Genetic Algorithm on BBOB-2013 Noiseless Function Testbed

Benchmarking Projection-Based Real Coded Genetic Algorithm on BBOB-2013 Noiseless Function Testbed
Babatunde Sawyerr (1), Aderemi Adewumi (2), Montaz Ali (3)
(1) University of Lagos, Lagos, Nigeria; (2) University of KwaZulu-Natal, Durban, South Africa; (3) University of the Witwatersrand, Johannesburg, South Africa
Workshop on Black Box Optimization Benchmarking, 2013

Outline
1 Introduction: Problem Statement; Genetic Algorithms
2 Projection
3 Experimental Settings
4 Empirical Results
5 Conclusion

Problem Statement: The Global Optimization Problem (Real Parameter Optimization)
The task is to minimize an objective function f. Given $f : S \to \mathbb{R}$ with $S \subseteq \mathbb{R}^n$, find $x^* \in S$ such that $f(x^*) \le f(x)$ for all $x \in S$. (1)
Black-box approach: gradients are not known or are not useful; problem domains are rugged and ill-conditioned.
Goal: find the global optimum $x^*$ quickly and with the least search cost (number of function evaluations).

Genetic Algorithms
Developed by John Holland in 1975. Goal: develop robust and adaptive systems.
Solutions are represented internally as genetic encodings of points.
Offspring are produced via mutation and recombination.
Selection methods: initially fitness-proportional selection.
Model: generational or steady state.

Real Coded Genetic Algorithms
Real-valued representations are used as genetic encodings of points.
They are better adapted to numerical optimization of continuous problems.
They can also be easily hybridized with other search methods.

Projection: Orthogonal Projection of a vector x on a vector y
For any two n-dimensional vectors, the projection of a vector x onto another vector y generates a vector defined by
$\hat{y} = \frac{x^T y}{y^T y}\, y = \frac{x^T y}{\|y\|^2}\, y = \frac{\|x\|\cos(\theta)}{\|y\|}\, y.$ (2)
Note that the projected vector $\hat{y}$ (the offspring) points in the same direction as y unless $\pi/2 < \theta < 3\pi/2$, where $\theta$ is the angle between the two vectors; in that case $\cos(\theta) < 0$ and the projected vector points in the opposite direction (the reflection of y about the origin).

Figure: Projection of vector x on vector y.
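To make Eq. (2) concrete, here is a minimal NumPy sketch of the vector projection. The function name `project` and the use of Python/NumPy are illustrative choices, not taken from the authors' MATLAB implementation.

```python
import numpy as np

def project(x, y):
    """Orthogonal projection of vector x onto vector y, as in Eq. (2).

    Returns (x.y / y.y) * y. The result points along y when the angle
    between x and y is acute, and opposite to y when cos(theta) < 0.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    return (x @ y) / (y @ y) * y

# Example: projecting (1, 2) onto (3, 0) gives (1, 0).
print(project([1.0, 2.0], [3.0, 0.0]))
```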

PRCGA
PRCGA was first introduced as RCGA-P in [6, 7]. The incorporated projection operator showed promising exploratory search capability on some search problems. PRCGA is an enhanced version of RCGA-P.
Inputs: fitness function f; parameters.
Outputs: the best solution x_best and its fitness value f(x_best).

PRCGA pseudocode:
1  Initialize P_{t=0}: P_t = {x_{1,t}, x_{2,t}, ..., x_{N,t}} sampled from S
2  f(x_{i,t}) = evaluate(P_t), 1 ≤ i ≤ N
3  While the stopping condition is not met, do steps 4-12
4    ζ_t = σ(f(P_t)); if ζ_t ≤ ε do step 5, else do step 6
5    P̂_t = perturb(P_t)
6    P̂_t = tournamentSelection(P_t)
7    C_t = blend-α-crossover(P̂_t, p_c)
8    M_t = non-uniform-mutation(C_t, p_m)
9    Φ_t = projection(M_t)
10   f(x_{i,t}) = evaluate(Φ_t)
11   P_{t+1} = replace(P_t, Φ_t)
12   t = t + 1
13 end while
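For readers who prefer code over pseudocode, below is a hedged Python/NumPy sketch of steps 1-13. The operator details (the perturbation in step 5, the BLX-α crossover, the non-uniform mutation, how the projection in step 9 pairs each individual with the current best, and the elitist replacement) are plausible textbook variants assumed for illustration; they are not the authors' MATLAB code, and names such as `prcga` are hypothetical.

```python
import numpy as np

def prcga(f, dim, lower, upper, pop_size=100, max_evals=10_000,
          p_c=0.8, p_m=0.15, beta=15, tour_size=3, eps=1e-12, rng=None):
    """Sketch of the PRCGA loop (steps 1-13); operator details are assumptions."""
    rng = np.random.default_rng() if rng is None else rng
    lower = np.full(dim, lower, dtype=float)
    upper = np.full(dim, upper, dtype=float)

    # Steps 1-2: initial population, drawn uniformly from S, and its evaluation.
    pop = rng.uniform(lower, upper, size=(pop_size, dim))
    fit = np.array([f(x) for x in pop])
    evals, t = pop_size, 0
    max_gen = max(max_evals // pop_size, 1)

    while evals < max_evals and t < max_gen:                 # step 3
        if np.std(fit) <= eps:                               # step 4: stagnation test
            parents = np.clip(pop + rng.normal(0.0, 0.1, pop.shape),
                              lower, upper)                  # step 5: perturb
        else:                                                # step 6: tournament selection
            idx = rng.integers(pop_size, size=(pop_size, tour_size))
            winners = idx[np.arange(pop_size), np.argmin(fit[idx], axis=1)]
            parents = pop[winners]

        # Step 7: blend-alpha (BLX-alpha, alpha = 0.5) crossover on consecutive pairs.
        children = parents.copy()
        for i in range(0, pop_size - 1, 2):
            if rng.random() < p_c:
                lo = np.minimum(parents[i], parents[i + 1])
                hi = np.maximum(parents[i], parents[i + 1])
                span = hi - lo
                children[i] = rng.uniform(lo - 0.5 * span, hi + 0.5 * span)
                children[i + 1] = rng.uniform(lo - 0.5 * span, hi + 0.5 * span)
        children = np.clip(children, lower, upper)

        # Step 8: non-uniform mutation with non-uniformity factor beta.
        frac = t / max_gen
        for i in range(pop_size):
            for j in range(dim):
                if rng.random() < p_m:
                    delta = 1.0 - rng.random() ** ((1.0 - frac) ** beta)
                    if rng.random() < 0.5:
                        children[i, j] += (upper[j] - children[i, j]) * delta
                    else:
                        children[i, j] -= (children[i, j] - lower[j]) * delta

        # Step 9: project each child onto the current best individual (an assumption).
        best = pop[np.argmin(fit)]
        coeff = (children @ best) / (best @ best + 1e-300)
        offspring = np.clip(coeff[:, None] * best, lower, upper)

        # Steps 10-11: evaluate offspring and replace (elitist, per-slot comparison).
        off_fit = np.array([f(x) for x in offspring])
        evals += pop_size
        better = off_fit < fit
        pop[better], fit[better] = offspring[better], off_fit[better]
        t += 1                                               # step 12

    i_best = np.argmin(fit)
    return pop[i_best], fit[i_best]
```

On a simple test such as the sphere function, prcga(lambda x: float(np.sum(x**2)), dim=5, lower=-5, upper=5) should steadily reduce the best fitness, although the exact dynamics will differ from the published PRCGA.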

Computer System and Software
Computer system: HP ProBook 6545b with an AMD Turion(tm) II Ultra Dual-Core Mobile M620 processor; CPU speed 2.5 GHz; RAM 2.75 GB.
Software: Microsoft Windows 7 Professional, Service Pack 1; MATLAB 7.10 (R2010a); COmparing Continuous Optimisers (COCO) software; post-processing scripts in Python.

Experimental Setup
The experiments were carried out according to [3] on the benchmark functions provided in [2, 4].
Two independent restart criteria were employed: (i) a check for stagnation [1]; (ii) the maximum number of generations reached without attaining f_target.
For each restart, the run is initiated with an initial population P_0 drawn uniformly from [-4, 4]^D.

Parameter Settings
Population size = min(100, 100·D), where D is the dimension.
Maximum number of evaluations = 10^5 · D.
Tournament size = 3.
Crossover rate p_c = 0.8.
Mutation rate p_m = 0.15.
Non-uniformity factor for mutation β = 15.
Crafting effort CrE = 0.
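The sketch below shows one way to wire the restart policy and this evaluation budget together: a generic independent-restart wrapper that reruns an optimizer (for example the hypothetical prcga sketch above) until f_target is reached or the 10^5·D budget is exhausted. The evaluation-counting closure and the name run_with_restarts are illustrative assumptions; the specific stagnation test of [1] is not reproduced here.

```python
import numpy as np

def run_with_restarts(optimizer, f, dim, f_target,
                      budget_factor=100_000, seed=None):
    """Independent-restart wrapper: rerun `optimizer` on fresh populations
    until f_target is hit or the 1e5 * D evaluation budget is spent."""
    max_evals = budget_factor * dim
    n_evals = {"n": 0}

    def counted_f(x):                    # count every function evaluation
        n_evals["n"] += 1
        return f(x)

    rng = np.random.default_rng(seed)
    best_x, best_f = None, np.inf
    while n_evals["n"] < max_evals and best_f > f_target:
        # Each restart draws a fresh initial population, uniform on [-4, 4]^D,
        # with the population size listed on the parameter slide.
        x, fx = optimizer(counted_f, dim, lower=-4.0, upper=4.0,
                          pop_size=min(100, 100 * dim),
                          max_evals=max_evals - n_evals["n"], rng=rng)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f, n_evals["n"]
```

For example, run_with_restarts(prcga, lambda x: float(np.sum(x**2)), dim=5, f_target=1e-8) would run restarts of the sketch above on the sphere function within a 5·10^5 evaluation budget.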

CPU Timing Experiment
The CPU timing experiment was conducted with the same independent restart strategies on function f8 for a duration of 30 seconds.
Time per function evaluation:
Dimension           2    3    5    10   20   40
Time (× 10^-5 s)    7.1  7.5  6.9  6.9  7.1  8.0
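A rough sketch of how such a timing figure can be reproduced, reusing the run_with_restarts wrapper above: run the optimizer on a Rosenbrock-like objective for at least 30 seconds and divide the elapsed wall-clock time by the number of evaluations. The plain Rosenbrock below is only a stand-in for BBOB f8 (which is shifted), and the 1,000·D per-restart budget is an arbitrary choice for the timing loop.

```python
import time
import numpy as np

def rosenbrock(x):
    """Plain Rosenbrock function, used here as a stand-in for BBOB f8."""
    x = np.asarray(x, dtype=float)
    return float(np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2))

def time_per_evaluation(optimizer, dim, min_seconds=30.0):
    """Average wall-clock seconds per function evaluation over >= 30 s of runs."""
    total_evals = 0
    start = time.perf_counter()
    while time.perf_counter() - start < min_seconds:
        # f_target = -inf is never reached, so each call simply runs out its budget.
        _, _, used = run_with_restarts(optimizer, rosenbrock, dim,
                                       f_target=-np.inf, budget_factor=1_000)
        total_evals += used
    return (time.perf_counter() - start) / total_evals
```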

Empirical Results: Figure for f2 (Ellipsoid separable), dimensions 2, 3, 5, 10, 20, 40.

Empirical Results: Figure for f3 (Rastrigin separable), dimensions 2, 3, 5, 10, 20, 40.

Empirical Results: Figure for f4 (Skew Rastrigin-Bueche separable), dimensions 2, 3, 5, 10, 20, 40.

Empirical Results: Discussion
PRCGA performed well on the separable functions f1-f4.
PRCGA also solved Gallagher's Gaussian 101-me Peaks function f21, a multi-modal function with weak global structure.
PRCGA showed some encouraging performance on problems f6-f7 in dimensions 2-10.
The unimodal functions with high conditioning, f10-f14, proved difficult for PRCGA to solve to the required level of accuracy.

Comparison of PRCGA with Previous GAs
DBRCGA [1] outperformed PRCGA.
PRCGA performed better than the RCGA of [8].
PRCGA performed better than the simple GA of [5].

Conclusion
Benchmarking PRCGA on the noiseless BBOB function testbed reveals the strengths and weaknesses of the algorithm. Its performance shows that, in its current form, PRCGA cannot compete with state-of-the-art evolutionary algorithms.

Thank You!!!

Appendix: For Further Reading
[1] Y.-C. Chuang and C.-T. Chen. Black-box optimization benchmarking for noiseless function testbed using a direction-based RCGA. In GECCO (Companion), pages 167-174, 2012.
[2] S. Finck, N. Hansen, R. Ros, and A. Auger. Real-parameter black-box optimization benchmarking 2009: Presentation of the noiseless functions. Technical Report 2009/20, Research Center PPE, 2009. Updated February 2010.
[3] N. Hansen, A. Auger, S. Finck, and R. Ros. Real-parameter black-box optimization benchmarking 2012: Experimental setup. Technical report, INRIA, 2012.
[4] N. Hansen, S. Finck, R. Ros, and A. Auger. Real-parameter black-box optimization benchmarking 2009: Noiseless functions definitions. Technical Report RR-6829, INRIA, 2009. Updated February 2010.
[5] M. Nicolau. Application of a simple binary genetic algorithm to a noiseless testbed benchmark. In GECCO (Companion), pages 2473-2478, 2009.
[6] B. A. Sawyerr. Hybrid real coded genetic algorithms with pattern search and projection. PhD thesis, University of Lagos, Lagos, Nigeria, 2010.
[7] B. A. Sawyerr, M. M. Ali, and A. O. Adewumi. A comparative study of some real coded genetic algorithms for unconstrained global optimization. Optimization Methods and Software, 26(6):945-970, 2011.
[8] T.-D. Tran and G.-G. Jin. Real-coded genetic algorithm benchmarked on noiseless black-box optimization testbed. In GECCO (Companion), pages 1731-1738, 2010.