COMPETITIVE DIFFERENTIAL EVOLUTION


Josef Tvrdík
University of Ostrava, Department of Computer Science
30. dubna 22, 701 03 Ostrava, Czech Republic
phone: +420/596160231, fax: +420/596120478, e-mail: tvrdik@osu.cz

Abstract. This paper deals with differential evolution and the adaptation of its control parameters. A competition of different control-parameter settings is proposed and tested on six functions at three levels of search-space dimension. The competitive differential evolution proved to be more reliable and less time-consuming than the standard differential evolution, and it also outperformed the other algorithms that were tested.

Keywords: global optimization, differential evolution, competition, adaptive algorithms.

Introduction

We deal with the global optimization problem: for a given objective function $f : D \to \mathbb{R}$, $D \subset \mathbb{R}^d$, a point $x^*$ is to be found such that $x^* = \arg\min_{x \in D} f(x)$. The point $x^*$ is called the global minimum point and $D$ is the search space. We focus on problems where $D$ is a closed compact set defined as $D = \prod_{i=1}^{d} [a_i, b_i]$, $a_i < b_i$, $i = 1, 2, \ldots, d$, the objective function is continuous, and the function value $f(x)$ can be evaluated at any point $x \in D$.

The authors of many stochastic search algorithms for this problem claim that their algorithms are efficient and reliable in searching for the global minimum. Reliability means that the point with the minimal function value found in the search process is sufficiently close to the global minimum point. However, when we use such algorithms, we face the problem of setting their control parameters. The efficiency and reliability of many algorithms depend strongly on the values of the control parameters, and the recommendations given by the authors are often vague or uncertain, see e.g. [9], [18]. A user is supposed to be able to change the parameter values according to the partial results obtained during the search. Such an approach is not acceptable in tasks where global optimization is only one step on the way to the solution of the user's problem, or when the user has no experience in the fine art of control-parameter tuning.

Adaptive robust algorithms that are sufficiently reliable at reasonable time consumption, without the necessity of fine tuning their input parameters, have been studied in recent years. A proposal of an adaptive generator of robust algorithms is described by Deb [2]. Winter et al. [13] proposed a flexible evolutionary agent for real-coded genetic algorithms. The theoretical analysis by Wolpert and Macready implies that no search algorithm can outperform all the others for all objective functions [14]. In spite of this fact, there is empirical evidence that some algorithms outperform others on a relatively wide class of problems, both in the convergence rate and in the reliability of finding the global minimum. Thus, the way to adaptive algorithms leads through experimental research rather than through a purely theoretical approach.

Differential Evolution and its Control Parameters

Differential evolution (DE) works with two populations P and Q of the same size N. The algorithm in pseudo-code is written as Algorithm 1. A new trial point y is composed of the current point x_i of the old population and a point u obtained by mutation. If f(y) < f(x_i), the point y is inserted into the new population Q instead of x_i. After completion of the new population Q, the old population P is replaced by Q, and the search continues until the stopping condition is fulfilled.

Algorithm 1. Differential evolution

1  generate P = (x_1, x_2, ..., x_N); (N points in D at random)
2  repeat
3    for i := 1 to N do
4      compute a mutant vector u;
5      create a trial point y by the crossover of u and x_i;
6      if f(y) < f(x_i) then insert y into Q
7      else insert x_i into Q
8      endif;
9    endfor;
10   P := Q;
11 until stopping condition;

The most popular variant of DE (called DER in this text) generates the point u by adding the weighted difference of two points,

$u = r_1 + F (r_2 - r_3),$   (1)

where r_1, r_2, and r_3 are three distinct points taken randomly from P (not coinciding with the current x_i) and F > 0 is an input parameter. Another variant, called DEBEST, generates the point u by the rule

$u = x_{min} + F (r_1 + r_2 - r_3 - r_4),$   (2)

where r_1, r_2, r_3, r_4 are four distinct points taken randomly from P (not coinciding with the current x_i), x_min is the point of P with the minimal function value, and F > 0 is an input parameter.

The elements y_j, j = 1, 2, ..., d, of the trial point y are built up by the crossover of its parents x_i and u using the rule

$y_j = \begin{cases} u_j & \text{if } U_j \le C \text{ or } j = l \\ x_{ij} & \text{if } U_j > C \text{ and } j \ne l, \end{cases}$   (3)

where l is a randomly chosen integer from {1, 2, ..., d}, U_1, U_2, ..., U_d are independent random variables uniformly distributed in [0, 1), and C ∈ [0, 1] is an input parameter influencing the number of elements exchanged by the crossover.

Differential evolution has become one of the most popular algorithms for continuous global optimization problems in recent years, see [3]. It is known, however, that the efficiency of the search for the global minimum is very sensitive to the setting of the values F and C. The recommended values are F = 0.8 and C = 0.5, but even Storn and Price in their principal paper [9] used 0.5 ≤ F ≤ 1 and 0 ≤ C ≤ 1. They also set the population size to less than the recommended N = 10 d in many of their test tasks.

Many papers deal with the setting of the control parameters for differential evolution. Ali and Törn [1] suggested adapting the value of the scaling factor F within the search process according to

$F = \begin{cases} \max\left(F_{min},\, 1 - \left|\frac{f_{max}}{f_{min}}\right|\right) & \text{if } \left|\frac{f_{max}}{f_{min}}\right| < 1 \\ \max\left(F_{min},\, 1 - \left|\frac{f_{min}}{f_{max}}\right|\right) & \text{otherwise,} \end{cases}$   (4)

where f_min, f_max are respectively the minimum and maximum function values in the population and F_min is an input parameter ensuring F ∈ [F_min, 1). This adaptive approach is useful only sometimes, and there are tasks where such adaptation does not help, see [12].

Zaharie [15] derived a critical interval for the control parameters of DE: keeping the mean of the population variance non-decreasing results in the relationship

$2pF^2 - \frac{2p}{N} + \frac{p^2}{N} > 0,$   (5)

where p = max(1/d, C) is the probability of differential perturbation according to (3). Relationship (5) implies that the mean of the population variance is non-decreasing if $F > \sqrt{1/N}$, but the practical value of this result is very limited, because it brings no new information when compared with the minimal value of F = 0.5 used in [9] and in other applications of differential evolution.

Some approaches to the adaptation of the DE control parameters have appeared at MENDEL conferences recently, see [4], [5], [8], and [16]. The current state of adaptive parameter control in differential evolution is summarized by Liu and Lampinen [6].
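For illustration, one step of DER can be condensed into a few lines of Matlab. The following is a minimal sketch of the mutation (1) and binomial crossover (3) only; the function and variable names are illustrative, and the paper's own implementation is the one given in Appendix B.

function y = der_trial(P, i, F, C)
% minimal sketch of one DER trial point: mutation, eq. (1), followed by
% binomial crossover, eq. (3); P is the N-by-d population, i the index
% of the current point (illustrative code, not the paper's Appendix B)
[N, d] = size(P);
r = setdiff(randperm(N), i, 'stable');      % random indices, x_i excluded
u = P(r(1),:) + F*(P(r(2),:) - P(r(3),:));  % mutant vector u, eq. (1)
y = P(i,:);                                 % start from the current point x_i
l = randi(d);                               % element l always comes from u
change = (rand(1,d) <= C);                  % U_j <= C, eq. (3)
change(l) = true;                           % ... or j = l
y(change) = u(change);
end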

Competition in Differential Evolution

The setting of the control parameters can be made adaptive through the implementation of a competition into the algorithm. This idea is similar to the competition of local-search heuristics in an evolutionary algorithm [10] or in controlled random search [11]. Let us have h settings (different values of F and C used in the statements on lines 4 and 5 of Algorithm 1) and choose among them at random with probability q_i, i = 1, 2, ..., h. The probabilities are changed according to the success rate of the settings in the preceding steps of the search process. The i-th setting is successful if it generates a trial point y such that f(y) < f(x_i). With n_i denoting the current number of successes of the i-th setting, the probability q_i can be evaluated simply as the relative frequency

$q_i = \frac{n_i + n_0}{\sum_{j=1}^{h} (n_j + n_0)},$   (6)

where n_0 > 0 is a constant. Setting n_0 ≥ 1 prevents a dramatic change in q_i caused by a single random success of the i-th parameter setting. In order to avoid degeneration of the process, the current values of q_i are reset to their starting values (q_i = 1/h) whenever any probability q_i drops below a given limit δ > 0. It is supposed that such a competition of different settings will prefer the successful ones; the competition then serves as an adaptive mechanism that sets the control parameters appropriately for the problem actually being solved.
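The bookkeeping behind this competition is simple. The following Matlab fragment is a condensed sketch of (6) together with the reset rule; the counters mirror those used in Appendix B (where the reset check is performed before the roulette-wheel selection), the remaining names are illustrative.

% sketch of the competition mechanism: success counters ni start at n0,
% so q = ni/sum(ni) equals eq. (6)
h = 18;  n0 = 2;  delta = 1/45;      % settings used in the experiments below
ni = zeros(1,h) + n0;                % ni holds n_i + n0 for each setting
% ... in each step of Algorithm 1, select a setting by roulette wheel:
q = ni / sum(ni);                    % eq. (6)
hh = find(cumsum(q) >= rand(1), 1);  % index of the selected setting
% ... if the trial point generated by setting hh succeeds, f(y) < f(x_i):
ni(hh) = ni(hh) + 1;
if min(ni/sum(ni)) < delta           % degeneration guard: reset q_i to 1/h
    ni = zeros(1,h) + n0;
end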
Experiments and Results

Four variants of competitive differential evolution were implemented and tested. Three values of the control parameter C were used in all the variants, namely C = 0, C = 0.5, and C = 1:

DER9: the mutant vector u is generated according to (1); the nine settings of control parameters are all combinations of three F values (F = 0.5, F = 0.8, and F = 1) with the three values of C given above.
DEBEST9: the mutant vector u is generated according to (2); nine settings of F and C as in DER9.
DERADP3: the mutant vector u is generated according to (1), with F adapted according to (4); three settings of C.
DEBR18: 18 settings, the aggregation of the settings used in DER9 and DEBEST9, implemented due to the good performance of DER9 and DEBEST9 in the test tasks.

The population size was set to N = max(20, 2d), and the parameters controlling the competition were set to n_0 = 2 and δ = 1/45.

These variants of competitive differential evolution were compared with three other algorithms. The first was the standard DER with the recommended values F = 0.8 and C = 0.5 and the same population size as in the competitive variants. The second was CRS8HC with eight competing local-search heuristics, described in [11] including the setting of its control parameters. The third was SOMA (the all-to-one variant), see [17] and [18], with its control parameters set to the recommended values (N = 5d, mass = 1.9, step = 0.3, prt = 0.5).

The search for the global minimum was stopped when f_max - f_min < 1E-07 or when the number of objective function evaluations exceeded the upper limit of 20000 d. The algorithms were tested on six functions commonly used in experimental tests. Two of them are unimodal (first De Jong, Rosenbrock); the other test functions are multimodal. Definitions of the test functions can be found e.g. in [9] or [1] and in Appendix A. The search spaces D in all test tasks were symmetric, with the same interval in each dimension: a_i = -b_i and b_i = b_j for all i, j = 1, 2, ..., d. The values of b_i were set as follows: 2.048 for the Rosenbrock function, 5.12 for the first De Jong and Rastrigin functions, 30 for the Ackley function, 400 for the Griewangk function, and 500 for the Schwefel function.

The tests were performed for all the functions at three levels of the dimension d of the search space, namely d = 5, d = 10, and d = 30. One hundred repeated runs were carried out for each function and each level of d. The accuracy of the result of the search for the global minimum was evaluated by the number of digits duplicated when compared with the correct certified result. The number of duplicated digits λ can be calculated via the log relative error [7] as follows.

If c ≠ 0, λ is evaluated as

$\lambda = \begin{cases} 0 & \text{if } \left|\frac{m - c}{c}\right| \ge 1 \\ 11 & \text{if } \left|\frac{m - c}{c}\right| < 10^{-11} \\ -\log_{10}\left|\frac{m - c}{c}\right| & \text{otherwise,} \end{cases}$   (7)

where c denotes the certified value and m denotes the value obtained by the search. If c = 0, λ is evaluated as

$\lambda = \begin{cases} 0 & \text{if } |m| \ge 1 \\ 11 & \text{if } |m| < 10^{-11} \\ -\log_{10}|m| & \text{otherwise.} \end{cases}$   (8)

Two values of the number of duplicated digits are reported in the results: λf for the function value, and λm, the minimal λ taken over the coordinates of the global minimum point (x_1, x_2, ..., x_d) found by the search.
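The measure defined by (7) and (8) is straightforward to compute. The following Matlab function is a minimal sketch; the function name is illustrative, not part of the paper's code.

function lam = dup_digits(m, c)
% number of duplicated digits lambda, eqs. (7) and (8);
% m ... value found by the search, c ... certified value
if c ~= 0
    relerr = abs((m - c)/c);    % log relative error case, eq. (7)
else
    relerr = abs(m);            % c = 0 case, eq. (8)
end
if relerr >= 1
    lam = 0;
elseif relerr < 1e-11
    lam = 11;
else
    lam = -log10(relerr);
end
end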

The results of the competitive DE variants are presented in Tables 1 and 2. The time consumption is expressed as the average number ne of objective function evaluations needed to reach the stopping condition. The columns of Tables 1 and 3 denoted rne contain the relative change of ne in percent when compared with DEBR18 in Table 2: negative values of rne mean less time consumption than DEBR18 and positive values mean more; rne = 100 means that ne is twice that of DEBR18, while rne = -50 means half. The reliability of the search is reported in the columns λf and λm, which contain the averages over one hundred runs, and in the columns denoted R, which give the percentage of runs with λf > 4.

Table 1: Competitive differential evolution algorithms

Algorithm        DER9                  DEBEST9               DERADP3
Function    d    λf   λm  rne    R     λf   λm  rne    R     λf   λm  rne    R
ackley     30   5.8  5.8  -13  100    6.0  5.9   21  100    6.1  6.0  -35  100
dejong1    30   6.3  3.0  -13  100    6.5  3.1   21  100    6.7  3.1  -12  100
griewangk  30   6.3  1.6  -13  100    6.5  1.7   24  100    6.7  1.8  -12  100
rastrig    30   6.3  4.2  -12  100    6.5  4.2   25  100    6.7  4.3  -14  100
rosen      30   6.2  4.2    1  100    6.4  4.3   28  100    0.0  0.0   56    0
schwefel   30   7.5  5.4  -12  100    7.5  5.5   20  100    7.5  5.5  -38  100
ackley     10   6.1  5.9  -15  100    6.1  5.9   24  100    5.6  5.4  -25   89
dejong1    10   6.6  3.0  -14  100    6.7  3.1   22  100    7.1  3.2    3  100
griewangk  10   6.6  2.1  -18  100    6.8  2.2   37  100    7.0  2.4   -5   93
rastrig    10   6.7  4.2  -13  100    6.6  4.2   25   99    6.8  4.2   -5   96
rosen      10   5.8  3.5  110   95    6.4  4.2   15  100    0.1  0.1  130    2
schwefel   10   7.3  5.3  -14   97    7.4  5.4   21   98    6.8  4.9  -33   89
ackley      5   6.5  6.2  -11  100    6.5  6.2   17  100    6.6  6.3   -7  100
dejong1     5   7.2  3.2  -11  100    7.2  3.2   14  100    7.4  3.3   13  100
griewangk   5   7.2  2.5  -15   99    7.2  2.6   40  100    7.2  2.7   10   93
rastrig     5   7.2  4.4  -13  100    7.2  4.4   18  100    7.4  4.5    1   99
rosen       5   6.7  4.1   47   97    6.8  4.2   14   99    3.1  2.0  139   44
schwefel    5   7.4  5.4  -12   98    7.4  5.4   12   99    7.4  5.4  -28   98

Table 2: Competitive differential evolution DEBR18

Dimension       d = 30                  d = 10                 d = 5
Function    λf   λm      ne    R    λf   λm     ne    R    λf   λm    ne    R
ackley     5.9  5.8  142208  100   6.1  5.9  13569  100   6.4  6.2  6401  100
dejong1    6.4  3.0   78664  100   6.7  3.0   6973  100   7.2  3.2  3176  100
griewangk  6.4  1.6  103095  100   6.6  2.1  13153   99   7.2  2.5  8686  100
rastrig    6.4  4.1  110071  100   6.7  4.2  10711  100   7.2  4.4  4989  100
rosen      6.3  4.3  381972  100   6.3  4.0  20524  100   6.9  4.2  6256  100
schwefel   7.5  5.4  108050  100   7.4  5.4   9964   99   7.4  5.4  4564   98

The results of the non-competitive standard DER, CRS8HC, and SOMA are presented in Table 3.

Table 3: Standard differential evolution, CRS and SOMA

Algorithm        DER                   CRS8HC                 SOMA
Function    d    λf   λm  rne    R     λf   λm  rne    R     λf   λm  rne    R
ackley     30   5.6  5.6  164  100    5.7  5.7  -68   95    0.0  0.0   24    0
dejong1    30   6.1  2.9  141  100    6.5  3.1  -66  100    4.7  2.0   87   71
griewangk  30   6.0  1.5  174  100    5.8  1.5  -67   87    0.4  0.0   68    4
rastrig    30   0.0  0.0  445    0    2.4  1.6  222   37    0.0  0.0  110    0
rosen      30   0.0  0.0   57    0    5.1  3.3  -20   98    0.0  0.0   57    0
schwefel   30   7.5  5.4  206  100    7.5  5.5   12  100    0.8  0.0   99    0
ackley     10   5.9  5.7   14   99    6.3  6.1    0  100    0.1  0.1  150    2
dejong1    10   6.5  3.0    6  100    6.9  3.2    4  100    7.2  3.3   99  100
griewangk  10   5.3  1.6   18   78    6.6  2.1   -8   94    0.3  0.1  557    2
rastrig    10   5.3  3.4  104   82    6.8  4.3  166   99    0.0  0.0  731    0
rosen      10   6.7  4.3  429  100    6.8  4.3   -7  100    0.7  0.8  871    2
schwefel   10   7.3  5.2    9   96    7.5  5.5   30  100    1.3  0.2  557    3
ackley      5   6.3  6.1    1   99    6.3  6.1  -12  100    2.0  2.0  137   28
dejong1     5   7.1  3.2   -3  100    7.0  3.1  -13  100    8.6  3.9   20  100
griewangk   5   5.2  1.7   14   70    5.0  1.7  -20   68    0.2  0.0  900    0
rastrig     5   6.7  4.1   16   95    6.5  4.0    7   93    0.9  0.5  517   10
rosen       5   7.2  4.4  528  100    7.2  4.3  -15  100    8.3  6.3  735   75
schwefel    5   7.4  5.4   -3   98    7.4  5.4  -13   99    2.6  1.3  613   24

Discussion and Conclusions

As can be seen in Tables 1, 2, and 3, the three competitive variants of DE (DEBR18, DER9, and DEBEST9) searched for the global minimum significantly more reliably than the other algorithms. They also outperformed the other algorithms in convergence rate (except DERADP3 and CRS8HC in some test tasks, but those algorithms exhibited lower reliability in some tasks). The performance of DEBR18, DER9, and DEBEST9 is clearly better than the performance of the standard DER. DERADP3 was usually less reliable than the other competitive variants of differential evolution, very significantly so in the case of the Rosenbrock function.

As regards the comparison of DEBR18 and CRS8HC, CRS8HC searched for the global minimum with lower reliability and higher time demand in the case of the Rastrigin function (especially for d = 30); in several other tasks the reliability of CRS8HC was also a bit lower, but with less time consumption. SOMA performed badly in these test tasks, although its control parameters were set to the middle of the recommended intervals [18]. The reliability of SOMA was very poor except for the easiest function (first De Jong), even at higher time consumption compared with the other algorithms under testing.

The winner among the algorithms is DEBR18. Its reliability is the highest (although not significantly different from the reliability of DEBEST9 or DER9). Its time demands are lower than those of DEBEST9 on all the test tasks. Compared with DER9, DEBR18 worked faster in the case of the most time-consuming Rosenbrock function and only slightly slower in the remaining tasks. The source code of DEBR18 in Matlab is presented in Appendix B.
The proposed competitive setting of the control parameters F and C has proved to be a useful idea that can help to solve global optimization tasks without the necessity of fine parameter tuning. However, further research on competitive differential evolution should continue.

References

[1] Ali M. M., Törn A. (2004). Population set based global optimization algorithms: Some modifications and numerical studies. Computers and Operations Research 31, 1703-1725.
[2] Deb K. (2005). A population-based algorithm-generator for real parameter optimization. Soft Computing 9, 236-253.
[3] Lampinen J. (1999). A Bibliography of Differential Evolution Algorithm. Technical Report, Lappeenranta University of Technology, Department of Information Technology, Laboratory of Information Processing. http://www.lut.fi/~jlampine/debiblio.htm
[4] Liu J., Lampinen J. (2002). On Setting the Control Parameter of the Differential Evolution Method. MENDEL 2002, 8th International Conference on Soft Computing (Matoušek R. and Ošmera P., eds). University of Technology, Brno, 11-18.
[5] Liu J., Lampinen J. (2002). Adaptive Parameter Control of Differential Evolution. MENDEL 2002, 8th International Conference on Soft Computing (Matoušek R. and Ošmera P., eds). University of Technology, Brno, 19-26.
[6] Liu J., Lampinen J. (2005). A Fuzzy Adaptive Differential Evolution Algorithm. Soft Computing 9, 448-462.
[7] McCullough B. D., Wilson B. (2005). On the accuracy of statistical procedures in Microsoft Excel 2003. Computational Statistics and Data Analysis 49, 1244-1252.
[8] Šmuc T. (2002). Improving convergence properties of the differential evolution algorithm. MENDEL 2002, 8th International Conference on Soft Computing (Matoušek R. and Ošmera P., eds). University of Technology, Brno, 80-86.
[9] Storn R., Price K. (1997). Differential Evolution - a Simple and Efficient Heuristic for Global Optimization over Continuous Spaces. Journal of Global Optimization 11, 341-359.
[10] Tvrdík J., Mišík L., Křivý I. (2002). Competing Heuristics in Evolutionary Algorithms. 2nd Euro-ISCI, Intelligent Technologies - Theory and Applications (Sinčák P. et al., eds). IOS Press, Amsterdam, 159-165.
[11] Tvrdík J. (2004). Generalized controlled random search and competing heuristics. MENDEL 2004, 10th International Conference on Soft Computing (Matoušek R. and Ošmera P., eds). University of Technology, Brno, 228-233.
[12] Tvrdík J. (2005). Competition and Cooperation in Evolutionary Algorithms: A Comparative Study. MENDEL 2005, 11th International Conference on Soft Computing (Matoušek R. and Ošmera P., eds). University of Technology, Brno, 108-113.
[13] Winter G., Galvan B., Alonso S., Gonzales B., Jimenez J. I., Greiner D. (2005). A flexible evolutionary agent: cooperation and competition among real-coded evolutionary operators. Soft Computing 9, 299-323.
[14] Wolpert D. H., Macready W. G. (1997). No Free Lunch Theorems for Optimization. IEEE Transactions on Evolutionary Computation 1, 67-82.
[15] Zaharie D. (2002). Critical Values for the Control Parameter of the Differential Evolution Algorithms. MENDEL 2002, 8th International Conference on Soft Computing (Matoušek R. and Ošmera P., eds). University of Technology, Brno, 62-67.
[16] Zaharie D. (2003). Control of population diversity and adaptation in Differential Evolution Algorithms. MENDEL 2003, 9th International Conference on Soft Computing (Matoušek R. and Ošmera P., eds). University of Technology, Brno, 41-46.
[17] Zelinka I., Lampinen J. (2000). SOMA - Self-Organizing Migrating Algorithm. MENDEL 2000, 6th International Conference on Soft Computing. University of Technology, Brno, 177-187.
[18] Zelinka I. (2002). Artificial intelligence in global optimization problems (in Czech). BEN, Praha.
Acknowledgment

This work was supported by the grant 201/05/0284 of the Czech Grant Agency and by the research scheme MSM 6198898701 of the Institute for Research and Applications of Fuzzy Modeling.

Appendix A. Test functions

Ackley's function
$f(x) = -20 \exp\left(-0.02 \sqrt{\tfrac{1}{d}\sum_{i=1}^{d} x_i^2}\right) - \exp\left(\tfrac{1}{d}\sum_{i=1}^{d} \cos 2\pi x_i\right) + 20 + \exp(1)$
$x_i \in [-30, 30]$, $x^* = (0, 0, \ldots, 0)$, $f(x^*) = 0$

First De Jong's function
$f(x) = \sum_{i=1}^{d} x_i^2$
$x_i \in [-5.12, 5.12]$, $x^* = (0, 0, \ldots, 0)$, $f(x^*) = 0$

Griewangk's function
$f(x) = \sum_{i=1}^{d} \frac{x_i^2}{4000} - \prod_{i=1}^{d} \cos\left(\frac{x_i}{\sqrt{i}}\right) + 1$
$x_i \in [-400, 400]$, $x^* = (0, 0, \ldots, 0)$, $f(x^*) = 0$

Rastrigin's function
$f(x) = 10d + \sum_{i=1}^{d} \left[x_i^2 - 10 \cos(2\pi x_i)\right]$
$x_i \in [-5.12, 5.12]$, $x^* = (0, 0, \ldots, 0)$, $f(x^*) = 0$

Rosenbrock's function (banana valley)
$f(x) = \sum_{i=1}^{d-1} \left[100(x_i^2 - x_{i+1})^2 + (1 - x_i)^2\right]$
$x_i \in [-2.048, 2.048]$, $x^* = (1, 1, \ldots, 1)$, $f(x^*) = 0$

Schwefel's function
$f(x) = -\sum_{i=1}^{d} x_i \sin\left(\sqrt{|x_i|}\right)$
$x_i \in [-500, 500]$, $x^* = (s, s, \ldots, s)$, $s = 420.97$, $f(x^*) = -418.9829\,d$
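For convenience, the six test functions above can be written as Matlab anonymous functions of a row vector x; this is a sketch with illustrative handle names, using the definitions given in this appendix.

% the six test functions as anonymous functions of a row vector x
ackley    = @(x) -20*exp(-0.02*sqrt(mean(x.^2))) ...
                 - exp(mean(cos(2*pi*x))) + 20 + exp(1);
dejong1   = @(x) sum(x.^2);
griewangk = @(x) sum(x.^2)/4000 - prod(cos(x./sqrt(1:length(x)))) + 1;
rastrig   = @(x) 10*length(x) + sum(x.^2 - 10*cos(2*pi*x));
rosen     = @(x) sum(100*(x(1:end-1).^2 - x(2:end)).^2 ...
                 + (1 - x(1:end-1)).^2);
schwefel  = @(x) -sum(x.*sin(sqrt(abs(x))));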

Appendix B. DEBR18 Algorithm in Matlab

function [x_star, fn_star, func_evals, success, func_near, nrst, cni] = ...
    debr18compet(fn_name, a, b, N, my_eps, max_evals, f_near, n0, delta)
% Tvrdik, March 2006
h = 18;
% initialization
d = length(a);
cni = zeros(1,h);
ni = zeros(1,h) + n0;
nrst = 0;
func_evals = N;
success = 0;
func_near = 0;
P = zeros(N, d+1);
for i = 1:N
    P(i,1:d) = a + (b-a).*rand(1,d);
    P(i,d+1) = feval(fn_name, P(i,1:d));
end
% 0-th generation initialized
fmax = max(P(:,d+1));
[fmin, indmin] = min(P(:,d+1));
Q = P;
while (fmax-fmin > my_eps) && (func_evals < d*max_evals)   % main loop
    for i = 1:N   % in each generation
        [hh, p_min] = roulete(ni);   % select hh-th set of parameters
        if p_min < delta   % reset of the competition
            cni = cni + ni - n0;
            ni = zeros(1,h) + n0;
            nrst = nrst + 1;
        end
        switch hh   % number of the selected heuristic
            case 1,  F=0.5; CR=0;   y = debest(P,F,CR,[i,indmin]);
            case 2,  F=0.5; CR=0.5; y = debest(P,F,CR,[i,indmin]);
            case 3,  F=0.5; CR=1;   y = debest(P,F,CR,[i,indmin]);
            case 4,  F=0.8; CR=0;   y = debest(P,F,CR,[i,indmin]);
            case 5,  F=0.8; CR=0.5; y = debest(P,F,CR,[i,indmin]);
            case 6,  F=0.8; CR=1;   y = debest(P,F,CR,[i,indmin]);
            case 7,  F=1;   CR=0;   y = debest(P,F,CR,[i,indmin]);
            case 8,  F=1;   CR=0.5; y = debest(P,F,CR,[i,indmin]);
            case 9,  F=1;   CR=1;   y = debest(P,F,CR,[i,indmin]);
            case 10, F=0.5; CR=0;   y = derand(P,F,CR,i);
            case 11, F=0.5; CR=0.5; y = derand(P,F,CR,i);
            case 12, F=0.5; CR=1;   y = derand(P,F,CR,i);
            case 13, F=0.8; CR=0;   y = derand(P,F,CR,i);
            case 14, F=0.8; CR=0.5; y = derand(P,F,CR,i);
            case 15, F=0.8; CR=1;   y = derand(P,F,CR,i);
            case 16, F=1;   CR=0;   y = derand(P,F,CR,i);
            case 17, F=1;   CR=0.5; y = derand(P,F,CR,i);

            case 18, F=1;   CR=1;   y = derand(P,F,CR,i);
        end
        y = zrcad(y,a,b);   % perturbation (mirroring into the search space)
        fy = feval(fn_name, y);
        func_evals = func_evals + 1;
        if fy < P(i,d+1)   % trial point y is good for renewing population
            Q(i,:) = [y fy];
            success = success + 1;
            ni(hh) = ni(hh) + 1;   % change of probability q_i, eq. (6)
        end
    end   % of generation
    P = Q;
    fmax = max(P(:,d+1));
    [fmin, indmin] = min(P(:,d+1));
    if func_near == 0 && fmin < f_near
        func_near = func_evals;
    end
end   % of main loop
cni = cni + ni - n0;
x_star = P(indmin,1:d);
fn_star = fmin;
end

function [res, p_min] = roulete(cutpoints)
% returns an integer from [1, length(cutpoints)] with probability
% proportional to cutpoints(i)/sum(cutpoints)
h = length(cutpoints);
ss = sum(cutpoints);
p_min = min(cutpoints)/ss;
cp = zeros(1,h);
cp(1) = cutpoints(1);
for i = 2:h
    cp(i) = cp(i-1) + cutpoints(i);
end
cp = cp/ss;
res = 1 + fix(sum(cp < rand(1)));
end

% debest, binomial crossover
function y = debest(P,F,CR,expt)
% expt = [i, indmin]
N = length(P(:,1));
d = length(P(1,:)) - 1;
y = P(expt(1),1:d);             % current point x_i
best = P(expt(2),1:d);          % best point in P
vyb = nahvyb_expt(N,4,expt);    % four random points without expt
r1 = P(vyb(1),1:d); r2 = P(vyb(2),1:d);
r3 = P(vyb(3),1:d); r4 = P(vyb(4),1:d);
v = best + F*(r1 + r2 - r3 - r4);   % eq. (2); scanned text had r2+r2, corrected
change = find(rand(1,d) < CR);
if isempty(change)   % at least one element is changed
    change = 1 + fix(d*rand(1));
end
y(change) = v(change);
end

% de_rand, binomial crossover
function y = derand(P,F,CR,expt)
N = length(P(:,1));
d = length(P(1,:)) - 1;
y = P(expt(1),1:d);             % current point x_i
vyb = nahvyb_expt(N,3,expt);    % three random points without expt
r1 = P(vyb(1),1:d); r2 = P(vyb(2),1:d); r3 = P(vyb(3),1:d);
v = r1 + F*(r2 - r3);           % eq. (1)
change = find(rand(1,d) < CR);
if isempty(change)   % at least one element is changed
    change = 1 + fix(d*rand(1));
end
y(change) = v(change);
end

% random sample, k of N without repetition;
% numbers given in vector expt are not included
function vyb = nahvyb_expt(N,k,expt)
opora = 1:N;
if nargin == 3
    opora(expt) = [];
end
vyb = zeros(1,k);
for i = 1:k
    index = 1 + fix(rand(1)*length(opora));
    vyb(i) = opora(index);
    opora(index) = [];
end
end

% zrcadleni (mirroring): perturbation of y back into [a,b]
function result = zrcad(y,a,b)
zrc = find(y < a | y > b);
for i = zrc
    while (y(i) < a(i) || y(i) > b(i))
        if y(i) > b(i)
            y(i) = 2*b(i) - y(i);
        elseif y(i) < a(i)
            y(i) = 2*a(i) - y(i);
        end
    end
end
result = y;
end
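A hypothetical call of debr18compet under the experimental settings of this paper might look as follows; the inline test function and the f_near threshold of 1e-4 are illustrative assumptions, not values taken from the paper.

% usage sketch: first De Jong function, d = 10, N = max(20, 2d),
% n0 = 2, delta = 1/45, stopping at fmax - fmin < 1e-7 or at
% 20000*d function evaluations
d = 10;
a = -5.12*ones(1,d);  b = 5.12*ones(1,d);
N = max(20, 2*d);
dejong1 = @(x) sum(x.^2);
[x_star, fn_star, ne] = debr18compet(dejong1, a, b, N, ...
    1e-7, 20000, 1e-4, 2, 1/45);    % f_near = 1e-4 is illustrative
fprintf('f(x*) = %g after %d evaluations\n', fn_star, ne);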