An Empirical Study of Control Parameters for Generalized Differential Evolution
Saku Kukkonen
Kanpur Genetic Algorithms Laboratory (KanGAL)
Indian Institute of Technology Kanpur
Kanpur, PIN 208 016, India
saku@iitk.ac.in, saku.kukkonen@lut.fi

Jouni Lampinen
Department of Information Technology
Lappeenranta University of Technology
P.O. Box 20, FIN-53851 Lappeenranta, Finland
jouni.lampinen@lut.fi

KanGAL Report Number 54

Abstract

In this paper the influence of the control parameters on the search process of the previously introduced Generalized Differential Evolution is studied empirically. Besides the number of generations and the size of the population, Generalized Differential Evolution has two control parameters, CR and F, which have to be set before applying the method to a problem. The effect of these two control parameters is studied with a set of common bi-objective test problems and performance metrics. The objectives of this study are to investigate the sensitivity of the control parameters according to different performance metrics, to better understand the effect and tuning of the control parameters on the search process, and to find rules for the initial settings of the control parameters in the case of multi-objective optimization. It appears that in multi-objective optimization the same kind of relationship exists between the control parameters and the progress of the population variance as in single-objective optimization with Differential Evolution. Based on the results, recommendations for choosing control parameter values are given.
1 Introduction

Many practical problems have multiple objectives, and several aspects impose constraints on them. Generalized Differential Evolution (GDE) [1-4] is an extension of Differential Evolution (DE) [5, 6] for constrained multi-objective (Pareto-)optimization. DE is a relatively new real-coded Evolutionary Algorithm (EA), which has been demonstrated to be efficient for solving many real-world problems. In earlier studies [4, 7, 8] GDE has been compared with other multi-objective EAs (MOEAs). In these studies it has been noted that GDE performs better with control parameter values different from those usually used for the basic DE algorithm in single-objective optimization. It has also been noted that for some problems the control parameter values should be drawn from a rather narrow range. However, more complete tests with different control parameter values have not been performed and reported before.

2 Differential Evolution

The DE algorithm [5, 6] was introduced by Storn and Price in 1995. The design principles of DE are simplicity, efficiency, and the use of floating-point encoding instead of binary numbers. As a typical EA, DE has a random initial population that is then improved using selection, mutation, and crossover operations. Several ways exist to determine a stopping criterion for EAs, but usually a predefined upper limit G_max for the number of generations to be computed provides an appropriate stopping condition. The other control parameters of DE are the crossover control parameter CR, the mutation factor F, and the population size NP.
In each generation G, DE goes through each D-dimensional decision vector x_i,G of the population and creates a corresponding trial vector u_i,G as follows [9]:

    r1, r2, r3 ∈ {1, 2, ..., NP}  (randomly selected, except mutually
                                   different and different from i)
    j_rand = floor(rand_i[0, 1) · D) + 1
    for (j = 1; j ≤ D; j = j + 1)
    {
        if (rand_j[0, 1) < CR ∨ j = j_rand)
            u_j,i,G = x_j,r3,G + F · (x_j,r1,G − x_j,r2,G)
        else
            u_j,i,G = x_j,i,G
    }                                                              (1)

Both CR and F remain fixed during the whole execution of the algorithm. Parameter CR, controlling the crossover operation, represents the probability that an element of the trial vector is chosen from a linear combination of three randomly chosen vectors instead of from the old vector x_i,G. The condition j = j_rand makes sure that at least one element
is different compared to the elements of the old vector. Parameter F is a scaling factor for mutation and its value is typically in the range (0, 1+]. In practice, CR controls the rotational invariance of the search: a small value (e.g., 0.2) is practicable with separable problems, while larger values (e.g., 0.9) are for non-separable problems. Control parameter F controls the speed and robustness of the search, i.e., a lower value for F increases the convergence rate but also the risk of getting stuck in a local optimum. Parameters CR and NP have the same kind of effect on the convergence rate as F has.

After the mutation and crossover operations, trial vector u_i,G is compared to the old vector x_i,G. If the trial vector has an equal or better objective value, it replaces the old vector in the next generation. DE is an elitist method, since the best population member is always preserved and the average objective value of the population never worsens.

3 Generalized Differential Evolution

Several extensions of DE for multi-objective optimization and for constrained optimization have been proposed [11-18]. Generalized Differential Evolution (GDE) [1-4, 7, 8] extends the basic DE algorithm to constrained multi-objective optimization by modifying the selection rule of the basic DE algorithm. Compared to the other DE extensions for multi-objective optimization, GDE makes DE suitable for constrained multi-objective optimization with noticeably fewer changes to the original DE algorithm.
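The trial vector generation of Eq. (1) (the DE/rand/1/bin scheme) can be sketched in Python as follows. This is a minimal illustration assuming NumPy; the function and variable names are our own, not from any reference implementation.

```python
import numpy as np

def de_trial_vector(pop, i, F, CR, rng):
    """Create a DE/rand/1/bin trial vector for population member i.

    pop : (NP, D) array of decision vectors
    F   : mutation factor, CR : crossover control parameter
    rng : a numpy.random.Generator
    """
    NP, D = pop.shape
    # r1, r2, r3: mutually different indices, all different from i
    r1, r2, r3 = rng.choice([r for r in range(NP) if r != i],
                            size=3, replace=False)
    j_rand = rng.integers(D)  # at least this element comes from the mutant
    trial = pop[i].copy()
    for j in range(D):
        if rng.random() < CR or j == j_rand:
            trial[j] = pop[r3, j] + F * (pop[r1, j] - pop[r2, j])
    return trial
```

With CR = 0 only the element at j_rand comes from the mutant, so the search moves mostly along coordinate axes; this illustrates why small CR values are practicable for separable problems.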
The modified selection operation for solving a constrained multi-objective optimization problem

    minimize   {f_1(x), f_2(x), ..., f_M(x)}
    subject to (g_1(x), g_2(x), ..., g_K(x))^T ≤ 0                  (2)

with M objectives f_m and K constraint functions g_k is presented formally as follows [1]:

    x_i,G+1 = u_i,G  if any of the following conditions holds:

        ∃k ∈ {1, ..., K}: g_k(u_i,G) > 0
            ∧ ∀k ∈ {1, ..., K}: g'_k(u_i,G) ≤ g'_k(x_i,G)

        ∀k ∈ {1, ..., K}: g_k(u_i,G) ≤ 0
            ∧ ∃k ∈ {1, ..., K}: g_k(x_i,G) > 0

        ∀k ∈ {1, ..., K}: g_k(u_i,G) ≤ 0 ∧ g_k(x_i,G) ≤ 0
            ∧ ∀m ∈ {1, ..., M}: f_m(u_i,G) ≤ f_m(x_i,G)

    x_i,G+1 = x_i,G  otherwise,                                     (3)

where g'_k(x_i,G) = max(g_k(x_i,G), 0) and g'_k(u_i,G) = max(g_k(u_i,G), 0) represent the constraint violations.
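This selection rule can be sketched in Python as follows. This is an illustrative formulation of our own; `objectives` and `constraints` are assumed to be callables returning NumPy arrays of the M objective values and K constraint values, with feasibility meaning g_k(x) ≤ 0.

```python
import numpy as np

def gde_select(x_old, u_trial, objectives, constraints):
    """Return the vector kept for the next generation under GDE selection."""
    g_x = np.asarray(constraints(x_old))
    g_u = np.asarray(constraints(u_trial))
    x_feasible = np.all(g_x <= 0)
    u_feasible = np.all(g_u <= 0)
    if not u_feasible and not x_feasible:
        # both infeasible: keep the trial if it violates no constraint
        # more than the old vector does
        if np.all(np.maximum(g_u, 0) <= np.maximum(g_x, 0)):
            return u_trial
    elif u_feasible and not x_feasible:
        # trial is feasible, old vector is not
        return u_trial
    elif u_feasible and x_feasible:
        # both feasible: trial must weakly dominate the old vector
        if np.all(np.asarray(objectives(u_trial)) <= np.asarray(objectives(x_old))):
            return u_trial
    return x_old
```

Note that no diversity mechanism appears here, in line with the description in the text: selection is purely a pairwise comparison between the trial vector and the old vector.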
The selection rule given in Eq. (3) selects trial vector u_i,G to replace old vector x_i,G in the next generation in the following cases:

1. Both trial vector u_i,G and old vector x_i,G violate at least one constraint, but the trial vector does not violate any constraint more than the old vector.

2. Old vector x_i,G violates at least one constraint, whereas trial vector u_i,G is feasible.

3. Both vectors are feasible and trial vector u_i,G has at least as good a value for each objective as old vector x_i,G has.

Otherwise old vector x_i,G is preserved. The basic idea of the selection rule in Eq. (3) is that trial vector u_i,G is required to dominate the compared old population member x_i,G in constraint violation space or in objective function space, or at least to provide an equally good solution as x_i,G. After the selected number of generations, the final population represents the solution of the optimization problem. Unique non-dominated vectors can be separated from the final population if desired. There is no sorting of non-dominated vectors during the optimization process, nor any mechanism for maintaining the diversity of the solution.

In the earlier studies [4, 7, 8] GDE was compared with other MOEAs. In these studies it was observed that relatively small values for crossover control parameter CR and mutation factor F should be used with the multi-objective test problems employed. It was also noticed that for some problems suitable values for these control parameters should be drawn from a rather narrow range to obtain a converged solution along the Pareto front. However, performance was not thoroughly tested with commonly adopted metrics for MOEAs by applying different control parameter combinations.

4 Experiments

In this investigation GDE was tested with a set of bi-objective benchmark problems commonly used in the MOEA literature. These problems are known as SCH1, SCH2, FON, KUR, POL, ZDT1, ZDT2, ZDT3, ZDT4, and ZDT6 [19, pp. 338-360], and they all have two objectives to be minimized.
With all the test problems, the size of the population and the number of generations were kept fixed. From the final population, unique non-dominated solutions were extracted to form the final solution, an approximation of the Pareto front. The characteristics of the problems, the size of the population, and the number of generations are collected in Table 1. A relatively small number of generations was used for some problems in order to observe the effect of the control parameters. Different control parameter combinations, with CR in the range [0, 1] and F in the range [0, 2] at a resolution of 0.05, were tested for all the test problems. Tests were repeated multiple times with each control parameter combination, and the results were evaluated using
Problem   D    Separability     Pf, decision space    Pf, objective space   NP   G_max
SCH1      1    N/A              line                  curve, convex
SCH2      1    N/A              lines                 curves
FON       3    separable        line                  curve, non-convex
KUR       3    separable        points & 4 curves     point & curves
POL       2    non-separable    curves                curves
ZDT1      30   separable        line                  curve, convex
ZDT2      30   separable        line                  curve, concave
ZDT3      30   separable        5 lines               5 curves
ZDT4      10   separable        line                  curve, convex
ZDT6      10   separable        line                  curve, concave

Table 1: Characteristics of the bi-objective optimization test problems (Pf = Pareto front) [19, pp. 338-360], [20], the size of the population, and the number of generations.

convergence and diversity metrics. Closeness to the Pareto front was measured with the generational distance (GD) [19, pp. 326-327], which measures the average distance of the solution vectors from the Pareto front. Diversity of the obtained solution was measured using the spacing (S) metric [19, pp. 327-328], which measures the standard deviation of the distances from each vector to the nearest vector in the obtained non-dominated set. Smaller values for both metrics are preferable, and the optimal values are zero. The number of unique non-dominated vectors in the final solution was also recorded.

Results for the different problems are shown in Figs. 1-10 as surfaces in the control parameter space. The results for all the problems show that with a small CR value the final solution has the greatest number of unique non-dominated vectors, and the values for the performance metrics GD and S are also best. For the multidimensional (i.e., D > 1) test problems, the values for the performance metrics are best with low F values, whereas for the one-dimensional test problems F does not seem to have a notable impact on the solution. The value range for F in the tests is larger than that usually used for single-objective optimization.
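The two performance metrics can be sketched as follows, assuming NumPy and sets of objective vectors stored as rows of 2-D arrays. Definitions of GD and S vary slightly in the literature, so these are common variants rather than necessarily the exact formulations used here.

```python
import numpy as np

def generational_distance(solution, front):
    """Mean Euclidean distance from each obtained objective vector
    (row of `solution`) to the nearest point of a discretized Pareto
    front (rows of `front`)."""
    d = np.linalg.norm(solution[:, None, :] - front[None, :, :], axis=2)
    return d.min(axis=1).mean()

def spacing(solution):
    """Standard deviation of the nearest-neighbour distances within
    the obtained non-dominated set."""
    d = np.linalg.norm(solution[:, None, :] - solution[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)  # exclude the distance of a vector to itself
    return d.min(axis=1).std()
```

Both metrics are zero in the ideal case: GD when every obtained vector lies on the (discretized) front, and S when all nearest-neighbour distances are equal.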
Based on the results, a larger F value does not give any extra benefit in the case of multi-objective optimization, and therefore its value can be chosen from the same value range as in the case of single-objective optimization. From the results of the multidimensional test problems, one can observe a non-linear relationship between the CR and F values, i.e., a larger F value can be used with a small CR value than with a large CR value. An explanation for this can be found in the theory of the single-objective DE algorithm. A formula for the relationship between the control parameters of DE and the evolution of the population variance has been derived in [21]. The change of the population variance between
Figure 1: Mean values of the number of unique non-dominated vectors in the final solution, generational distance (note the logarithmic scale), and spacing for SCH1, shown as surfaces in the control parameter space. The strange shape of the spacing surface is due to only a few non-dominated vectors when CR is large.

Figure 2: Mean values of the number of unique non-dominated vectors in the final solution, generational distance, and spacing for SCH2, shown as surfaces in the control parameter space.

successive generations due to the crossover and mutation operations is denoted by c, and its value is calculated as

    c = 2F²CR − 2CR/NP + CR²/NP + 1.

When c < 1, the crossover and mutation operations decrease the population variance. When c = 1, the variance does not change, and when c > 1, the variance increases. Since the selection operation of an EA usually decreases the population variance, c > 1 is recommended to prevent a too high convergence rate with poor search coverage, which typically results in premature convergence. On the other hand, if c is too large, the search process proceeds reliably but too slowly. When the size of the population is relatively large (e.g., NP > 50), the value of c depends mainly on the values of CR and F. Curves c = 1.0 and c = 1.5 for NP = 100 in the control parameter space are shown in Fig. 11. These curves are also shown with
Figure 3: Mean values of the number of unique non-dominated vectors in the final solution, generational distance, and spacing for FON, shown as surfaces in the control parameter space.

Figure 4: Mean values of the number of unique non-dominated vectors in the final solution, generational distance, and spacing for KUR, shown as surfaces in the control parameter space.

the GD and S surfaces in Figs. 1-10. As can be observed, the curves for c in Fig. 11 correspond well with the surfaces for GD and S in most of the test problems, suggesting that the theory for the population variance between successive generations is applicable also in the case of multi-objective optimization with GDE. This is natural, since single-objective optimization can be seen as a special case of multi-objective optimization; a large c means a slow but reliable search, while a small c means the opposite. In general, the results are good with small CR and F values, which also mean a low c value close to one. One reason for the good performance with small control parameter values is that the solution of a multi-objective problem is usually a set of non-dominated vectors instead of a single vector, and the conflicting objectives of the problem reduce the overall selection pressure, maintain the diversity of the population, and prevent premature convergence. Another reason is that all the multidimensional test problems here are separable or almost separable. With strongly non-separable objective functions, a larger CR
Figure 5: Mean values of the number of unique non-dominated vectors in the final solution, generational distance, and spacing for POL, shown as surfaces in the control parameter space.

Figure 6: Mean values of the number of unique non-dominated vectors in the final solution, generational distance, and spacing for ZDT1, shown as surfaces in the control parameter space.

might work better. Also, most of the problems have objective functions that are relatively easy to solve, leading to faster convergence with a small F, while a bigger F might be needed for harder functions with several local optima. The results for most of the ZDT problems show that the value of control parameter CR must be kept low (e.g., 0.0-0.2) to obtain good results. This is due to the fact that in the ZDT problems the first objective depends on one decision variable, while the second objective depends on the rest of the decision variables. This easily leads to a situation where the first objective gets optimized faster than the second one and the solution converges to a single point on the Pareto front, since GDE has no mechanism for maintaining the diversity of the solution. Therefore, if the optimization difficulty of the different objectives varies, the value of CR must be close to zero to prevent the solution from converging to a single point of the Pareto-optimal front. Based on the results, our recommendations for the control parameter values are
Figure 7: Mean values of the number of unique non-dominated vectors in the final solution, generational distance, and spacing for ZDT2, shown as surfaces in the control parameter space.

Figure 8: Mean values of the number of unique non-dominated vectors in the final solution, generational distance, and spacing for ZDT3, shown as surfaces in the control parameter space.

F ∈ [0.5, 0.8] and CR ∈ [0.0, 0.6] as initial settings, and it is safer to use smaller values for CR. However, these recommendations are based on the results for the test functions used. Overall, it is wise to select values for the control parameters CR and F satisfying the condition 1.0 < c < 1.5.

5 Conclusions

An extension of Differential Evolution (DE), Generalized Differential Evolution (GDE), was described. GDE has the same control parameters, the crossover control parameter CR and the mutation factor F, as the basic DE has. Suitable values for these control parameters have been reported earlier to be different from those usually used with the basic single-objective DE, but no extensive study had been performed.
Figure 9: Mean values of the number of unique non-dominated vectors in the final solution, generational distance, and spacing for ZDT4, shown as surfaces in the control parameter space.

Figure 10: Mean values of the number of unique non-dominated vectors in the final solution, generational distance, and spacing for ZDT6, shown as surfaces in the control parameter space.

Different control parameter values for GDE were tested using common bi-objective test problems and metrics measuring the convergence and diversity of the solutions. Based on the empirical results, suitable initial control parameter values are F ∈ [0.5, 0.8] and CR ∈ [0.0, 0.6]. The results also suggest that a larger F value than that usually used in single-objective optimization does not give any extra benefit in multi-objective optimization. It also seems that in some cases it is better to use a smaller CR value to prevent the solution from converging to a single point of the Pareto front. However, these results are based on mostly separable problems with conflicting objectives.

Also in the case of multi-objective optimization, the non-linear relationship between CR and F was observed, in accordance with the theory of the basic single-objective DE concerning the relationship between the control parameters and the evolution of the population variance. It is advisable to select values for CR and F satisfying the condition 1.0 < c < 1.5.
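The condition on c can be checked directly from the control parameter values. The sketch below assumes the population-variance factor c = 2F²CR − 2CR/NP + CR²/NP + 1 from Zaharie's analysis [21] and the recommended bounds above; the function names are ours.

```python
def variance_factor(F, CR, NP):
    """Expected change factor c of the population variance per generation,
    caused by mutation and crossover (before selection): c < 1 shrinks
    the variance, c = 1 preserves it, and c > 1 grows it."""
    return 2.0 * F**2 * CR - 2.0 * CR / NP + CR**2 / NP + 1.0

def in_recommended_region(F, CR, NP, lo=1.0, hi=1.5):
    """Check the recommended condition lo < c < hi on the control parameters."""
    return lo < variance_factor(F, CR, NP) < hi
```

For large NP the NP-dependent terms vanish, so c depends mainly on CR and F, as noted in the text; with CR close to zero, c stays close to one regardless of F.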
Figure 11: Curves c = 1.0 and c = 1.5 for NP = 100 in the control parameter space.

In the future, the correlation between c and the performance metrics should also be studied numerically. Furthermore, further investigation of the generalization of the theory for the evolution of the population variance to multi-objective problems is necessary. Extending the studies to strongly non-separable multi-objective problems and to problems with more than two objectives also remains to be done. Special cases, such as multi-objective problems without conflicting objectives and single-objective problems transformed into multi-objective forms, might be interesting to investigate.

Acknowledgments

S. Kukkonen gratefully acknowledges support from the East Finland Graduate School in Computer Science and Engineering (ECSE), the Academy of Finland, the Technological Foundation of Finland, and the Finnish Cultural Foundation. He also wishes to thank Dr. K. Deb and all the KanGAL people for their help during his visit in spring 2005.

References

[1] Jouni Lampinen. DE's selection rule for multiobjective optimization. Technical report, Lappeenranta University of Technology, Department of Information Technology.

[2] Jouni Lampinen. Multi-constrained nonlinear optimization by the Differential Evolution algorithm. Technical report, Lappeenranta University of Technology, Department of Information Technology.
[3] Jouni Lampinen. A constraint handling approach for the Differential Evolution algorithm. In Proceedings of the 2002 Congress on Evolutionary Computation (CEC 2002), Honolulu, Hawaii, May 2002.

[4] Saku Kukkonen and Jouni Lampinen. Comparison of Generalized Differential Evolution algorithm to other multi-objective evolutionary algorithms. In Proceedings of the 4th European Congress on Computational Methods in Applied Sciences and Engineering (ECCOMAS 2004), page 445, Jyväskylä, Finland, July 2004.

[5] Rainer Storn and Kenneth V. Price. Differential Evolution - a simple and efficient heuristic for global optimization over continuous spaces. Journal of Global Optimization, 11(4):341-359, Dec 1997.

[6] Kenneth V. Price, Rainer Storn, and Jouni Lampinen. Differential Evolution: A Practical Approach to Global Optimization. Springer-Verlag, Berlin, 2005.

[7] Saku Kukkonen and Jouni Lampinen. A Differential Evolution algorithm for constrained multi-objective optimization: Initial assessment. In Proceedings of the IASTED International Conference on Artificial Intelligence and Applications, Innsbruck, Austria, Feb 2004.

[8] Saku Kukkonen and Jouni Lampinen. Mechanical component design for multiple objectives using Generalized Differential Evolution. In Proceedings of the 6th International Conference on Adaptive Computing in Design and Manufacture (ACDM 2004), Bristol, United Kingdom, April 2004.

[9] Kenneth V. Price. New Ideas in Optimization, chapter An Introduction to Differential Evolution. McGraw-Hill, London, 1999.

[10] Rainer Storn. System design by constraint adaptation and Differential Evolution. IEEE Transactions on Evolutionary Computation, 3(1):22-34, April 1999.

[11] Feng-Sheng Wang and Jyh-Woei Sheu. Multiobjective parameter estimation problems of fermentation processes using a high ethanol tolerance yeast. Chemical Engineering Science, 55(18), Sept 2000.

[12] Hussein A. Abbass and Ruhul Sarker. The Pareto Differential Evolution algorithm.
International Journal on Artificial Intelligence Tools, 11(4):531-552, 2002.

[13] Hussein A. Abbass. The self-adaptive Pareto Differential Evolution algorithm. In Proceedings of the 2002 Congress on Evolutionary Computation (CEC 2002), pages 831-836, Honolulu, Hawaii, May 2002.
[14] Nateri K. Madavan. Multiobjective optimization using a Pareto Differential Evolution approach. In Proceedings of the 2002 Congress on Evolutionary Computation (CEC 2002), pages 1145-1150, Honolulu, Hawaii, May 2002.

[15] B. V. Babu and M. Mathew Leenus Jehan. Differential Evolution for multi-objective optimization. In Proceedings of the 2003 Congress on Evolutionary Computation (CEC 2003), pages 2696-2703, Canberra, Australia, Dec 2003.

[16] Feng Xue, Arthur C. Sanderson, and Robert J. Graves. Multi-objective Differential Evolution and its application to enterprise planning. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA 2003), pages 3535-3541, Taipei, Taiwan, 2003.

[17] Efrén Mezura-Montes, Carlos A. Coello Coello, and Edy I. Tun-Morales. Simple feasibility rules and Differential Evolution for constrained optimization. In Proceedings of the 3rd Mexican International Conference on Artificial Intelligence (MICAI 2004), pages 707-716, Mexico City, Mexico, April 2004.

[18] Tea Robič and Bogdan Filipič. DEMO: Differential Evolution for multiobjective optimization. In Proceedings of the 3rd International Conference on Evolutionary Multi-Criterion Optimization (EMO 2005), pages 520-533, Guanajuato, Mexico, March 2005.

[19] Kalyanmoy Deb. Multi-Objective Optimization using Evolutionary Algorithms. John Wiley & Sons, Chichester, England, 2001.

[20] Carlos A. Coello Coello, David A. Van Veldhuizen, and Gary B. Lamont. Evolutionary Algorithms for Solving Multi-Objective Problems. Kluwer Academic, New York, 2002.

[21] Daniela Zaharie. Critical values for the control parameters of Differential Evolution algorithms. In Proceedings of MENDEL 2002, 8th International Conference on Soft Computing, pages 62-67, Brno, Czech Republic, June 2002.
More informationComparison between cut and try approach and automated optimization procedure when modelling the medium-voltage insulator
Proceedings of the 2nd IASME / WSEAS International Conference on Energy & Environment (EE'7), Portoroz, Slovenia, May 15-17, 27 118 Comparison between cut and try approach and automated optimization procedure
More informationAdaptive Differential Evolution and Exponential Crossover
Proceedings of the International Multiconference on Computer Science and Information Technology pp. 927 931 ISBN 978-83-60810-14-9 ISSN 1896-7094 Adaptive Differential Evolution and Exponential Crossover
More informationBi-objective Portfolio Optimization Using a Customized Hybrid NSGA-II Procedure
Bi-objective Portfolio Optimization Using a Customized Hybrid NSGA-II Procedure Kalyanmoy Deb 1, Ralph E. Steuer 2, Rajat Tewari 3, and Rahul Tewari 4 1 Department of Mechanical Engineering, Indian Institute
More informationNew Reference-Neighbourhood Scalarization Problem for Multiobjective Integer Programming
BULGARIAN ACADEMY OF SCIENCES CYBERNETICS AND INFORMATION TECHNOLOGIES Volume 3 No Sofia 3 Print ISSN: 3-97; Online ISSN: 34-48 DOI:.478/cait-3- New Reference-Neighbourhood Scalariation Problem for Multiobjective
More informationInteractive Evolutionary Multi-Objective Optimization and Decision-Making using Reference Direction Method
Interactive Evolutionary Multi-Objective Optimization and Decision-Making using Reference Direction Method ABSTRACT Kalyanmoy Deb Department of Mechanical Engineering Indian Institute of Technology Kanpur
More informationEvolutionary Multiobjective. Optimization Methods for the Shape Design of Industrial Electromagnetic Devices. P. Di Barba, University of Pavia, Italy
Evolutionary Multiobjective Optimization Methods for the Shape Design of Industrial Electromagnetic Devices P. Di Barba, University of Pavia, Italy INTRODUCTION Evolutionary Multiobjective Optimization
More informationLecture 9 Evolutionary Computation: Genetic algorithms
Lecture 9 Evolutionary Computation: Genetic algorithms Introduction, or can evolution be intelligent? Simulation of natural evolution Genetic algorithms Case study: maintenance scheduling with genetic
More informationMultiobjective Optimization
Multiobjective Optimization MTH6418 S Le Digabel, École Polytechnique de Montréal Fall 2015 (v2) MTH6418: Multiobjective 1/36 Plan Introduction Metrics BiMADS Other methods References MTH6418: Multiobjective
More informationEvolutionary Computation and Convergence to a Pareto Front
Evolutionary Computation and Convergence to a Pareto Front David A. Van Veldhuizen Department of Electrical and Computer Engineering Graduate School of Engineering Air Force Institute of Technology Wright-Patterson
More informationOn the Usefulness of Infeasible Solutions in Evolutionary Search: A Theoretical Study
On the Usefulness of Infeasible Solutions in Evolutionary Search: A Theoretical Study Yang Yu, and Zhi-Hua Zhou, Senior Member, IEEE National Key Laboratory for Novel Software Technology Nanjing University,
More informationGaussian EDA and Truncation Selection: Setting Limits for Sustainable Progress
Gaussian EDA and Truncation Selection: Setting Limits for Sustainable Progress Petr Pošík Czech Technical University, Faculty of Electrical Engineering, Department of Cybernetics Technická, 66 7 Prague
More informationTHE objective of global optimization is to find the
Large Scale Global Optimization Using Differential Evolution With Self-adaptation and Cooperative Co-evolution Aleš Zamuda, Student Member, IEEE, Janez Brest, Member, IEEE, Borko Bošković, Student Member,
More informationA COMPARISON OF PARTICLE SWARM OPTIMIZATION AND DIFFERENTIAL EVOLUTION
A COMPARISON OF PARTICLE SWARM OPTIMIZATION AND DIFFERENTIAL EVOLUTION Vu Truong Vu Ho Chi Minh City University of Transport, Faculty of Civil Engineering No.2, D3 Street, Ward 25, Binh Thanh District,
More informationMultiobjective Optimisation An Overview
ITNPD8/CSCU9YO Multiobjective Optimisation An Overview Nadarajen Veerapen (nve@cs.stir.ac.uk) University of Stirling Why? Classic optimisation: 1 objective Example: Minimise cost Reality is often more
More informationExpected Running Time Analysis of a Multiobjective Evolutionary Algorithm on Pseudo-boolean Functions
Expected Running Time Analysis of a Multiobjective Evolutionary Algorithm on Pseudo-boolean Functions Nilanjan Banerjee and Rajeev Kumar Department of Computer Science and Engineering Indian Institute
More informationPrediction-based Population Re-initialization for Evolutionary Dynamic Multi-objective Optimization
Prediction-based Population Re-initialization for Evolutionary Dynamic Multi-objective Optimization Aimin Zhou 1, Yaochu Jin 2, Qingfu Zhang 1, Bernhard Sendhoff 2, and Edward Tsang 1 1 Department of Computer
More informationTIES598 Nonlinear Multiobjective Optimization A priori and a posteriori methods spring 2017
TIES598 Nonlinear Multiobjective Optimization A priori and a posteriori methods spring 2017 Jussi Hakanen jussi.hakanen@jyu.fi Contents A priori methods A posteriori methods Some example methods Learning
More informationDynamic Optimization using Self-Adaptive Differential Evolution
Dynamic Optimization using Self-Adaptive Differential Evolution IEEE Congress on Evolutionary Computation (IEEE CEC 2009), Trondheim, Norway, May 18-21, 2009 J. Brest, A. Zamuda, B. Bošković, M. S. Maučec,
More informationAnalyses of Guide Update Approaches for Vector Evaluated Particle Swarm Optimisation on Dynamic Multi-Objective Optimisation Problems
WCCI 22 IEEE World Congress on Computational Intelligence June, -5, 22 - Brisbane, Australia IEEE CEC Analyses of Guide Update Approaches for Vector Evaluated Particle Swarm Optimisation on Dynamic Multi-Objective
More informationConstrained Multi-objective Optimization Algorithm Based on ε Adaptive Weighted Constraint Violation
Constrained Multi-objective Optimization Algorithm Based on ε Adaptive Weighted Constraint Violation College of Information Science and Engineering, Dalian Polytechnic University Dalian,3,China E-mail:
More informationSafety Envelope for Load Tolerance and Its Application to Fatigue Reliability Design
Safety Envelope for Load Tolerance and Its Application to Fatigue Reliability Design Haoyu Wang * and Nam H. Kim University of Florida, Gainesville, FL 32611 Yoon-Jun Kim Caterpillar Inc., Peoria, IL 61656
More informationBeta Damping Quantum Behaved Particle Swarm Optimization
Beta Damping Quantum Behaved Particle Swarm Optimization Tarek M. Elbarbary, Hesham A. Hefny, Atef abel Moneim Institute of Statistical Studies and Research, Cairo University, Giza, Egypt tareqbarbary@yahoo.com,
More informationDecomposition and Metaoptimization of Mutation Operator in Differential Evolution
Decomposition and Metaoptimization of Mutation Operator in Differential Evolution Karol Opara 1 and Jaros law Arabas 2 1 Systems Research Institute, Polish Academy of Sciences 2 Institute of Electronic
More informationAn artificial chemical reaction optimization algorithm for. multiple-choice; knapsack problem.
An artificial chemical reaction optimization algorithm for multiple-choice knapsack problem Tung Khac Truong 1,2, Kenli Li 1, Yuming Xu 1, Aijia Ouyang 1, and Xiaoyong Tang 1 1 College of Information Science
More informationRobust Pareto Design of GMDH-type Neural Networks for Systems with Probabilistic Uncertainties
. Hybrid GMDH-type algorithms and neural networks Robust Pareto Design of GMDH-type eural etworks for Systems with Probabilistic Uncertainties. ariman-zadeh, F. Kalantary, A. Jamali, F. Ebrahimi Department
More informationPIBEA: Prospect Indicator Based Evolutionary Algorithm for Multiobjective Optimization Problems
PIBEA: Prospect Indicator Based Evolutionary Algorithm for Multiobjective Optimization Problems Pruet Boonma Department of Computer Engineering Chiang Mai University Chiang Mai, 52, Thailand Email: pruet@eng.cmu.ac.th
More informationTECHNISCHE UNIVERSITÄT DORTMUND REIHE COMPUTATIONAL INTELLIGENCE COLLABORATIVE RESEARCH CENTER 531
TECHNISCHE UNIVERSITÄT DORTMUND REIHE COMPUTATIONAL INTELLIGENCE COLLABORATIVE RESEARCH CENTER 531 Design and Management of Complex Technical Processes and Systems by means of Computational Intelligence
More informationLooking Under the EA Hood with Price s Equation
Looking Under the EA Hood with Price s Equation Jeffrey K. Bassett 1, Mitchell A. Potter 2, and Kenneth A. De Jong 1 1 George Mason University, Fairfax, VA 22030 {jbassett, kdejong}@cs.gmu.edu 2 Naval
More informationEffects of the Use of Non-Geometric Binary Crossover on Evolutionary Multiobjective Optimization
Effects of the Use of Non-Geometric Binary Crossover on Evolutionary Multiobjective Optimization Hisao Ishibuchi, Yusuke Nojima, Noritaka Tsukamoto, and Ken Ohara Graduate School of Engineering, Osaka
More informationProgram Evolution by Integrating EDP and GP
Program Evolution by Integrating EDP and GP Kohsuke Yanai and Hitoshi Iba Dept. of Frontier Informatics, Graduate School of Frontier Science, The University of Tokyo. 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8654,
More informationFinding Robust Solutions to Dynamic Optimization Problems
Finding Robust Solutions to Dynamic Optimization Problems Haobo Fu 1, Bernhard Sendhoff, Ke Tang 3, and Xin Yao 1 1 CERCIA, School of Computer Science, University of Birmingham, UK Honda Research Institute
More informationTowards Automatic Design of Adaptive Evolutionary Algorithms. Ayman Srour - Patrick De Causmaecker
Towards Automatic Design of Adaptive Evolutionary Algorithms Ayman Srour - Patrick De Causmaecker Outline Background Parameter tuning Adaptive Parameter control Preliminary investigation The proposed framework
More informationAMULTIOBJECTIVE optimization problem (MOP) can
1 IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION 1 Letters 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 Decomposition-Based Multiobjective Evolutionary Algorithm with an Ensemble of Neighborhood Sizes Shi-Zheng
More informationOn Performance Metrics and Particle Swarm Methods for Dynamic Multiobjective Optimization Problems
On Performance Metrics and Particle Swarm Methods for Dynamic Multiobjective Optimization Problems Xiaodong Li, Jürgen Branke, and Michael Kirley, Member, IEEE Abstract This paper describes two performance
More informationx 2 i 10 cos(2πx i ). i=1
CHAPTER 2 Optimization Written Exercises 2.1 Consider the problem min, where = 4 + 4 x 2 i 1 cos(2πx i ). i=1 Note that is the Rastrigin function see Section C.1.11. a) What are the independent variables
More informationInteger weight training by differential evolution algorithms
Integer weight training by differential evolution algorithms V.P. Plagianakos, D.G. Sotiropoulos, and M.N. Vrahatis University of Patras, Department of Mathematics, GR-265 00, Patras, Greece. e-mail: vpp
More informationSolving High-Dimensional Multi-Objective Optimization Problems with Low Effective Dimensions
Proceedings of the Thirty-First AAAI Conference on Artificial Intelligence (AAAI-17) Solving High-Dimensional Multi-Objective Optimization Problems with Low Effective Dimensions Hong Qian, Yang Yu National
More informationAn Evolution Strategy for the Induction of Fuzzy Finite-state Automata
Journal of Mathematics and Statistics 2 (2): 386-390, 2006 ISSN 1549-3644 Science Publications, 2006 An Evolution Strategy for the Induction of Fuzzy Finite-state Automata 1,2 Mozhiwen and 1 Wanmin 1 College
More informationGenetic Algorithm. Outline
Genetic Algorithm 056: 166 Production Systems Shital Shah SPRING 2004 Outline Genetic Algorithm (GA) Applications Search space Step-by-step GA Mechanism Examples GA performance Other GA examples 1 Genetic
More informationEvolutionary Algorithms
Evolutionary Algorithms a short introduction Giuseppe Narzisi Courant Institute of Mathematical Sciences New York University 31 January 2008 Outline 1 Evolution 2 Evolutionary Computation 3 Evolutionary
More informationA Genetic Algorithm Approach for Doing Misuse Detection in Audit Trail Files
A Genetic Algorithm Approach for Doing Misuse Detection in Audit Trail Files Pedro A. Diaz-Gomez and Dean F. Hougen Robotics, Evolution, Adaptation, and Learning Laboratory (REAL Lab) School of Computer
More informationEnsemble determination using the TOPSIS decision support system in multi-objective evolutionary neural network classifiers
Ensemble determination using the TOPSIS decision support system in multi-obective evolutionary neural network classifiers M. Cruz-Ramírez, J.C. Fernández, J. Sánchez-Monedero, F. Fernández-Navarro, C.
More informationGenetic Algorithms. Seth Bacon. 4/25/2005 Seth Bacon 1
Genetic Algorithms Seth Bacon 4/25/2005 Seth Bacon 1 What are Genetic Algorithms Search algorithm based on selection and genetics Manipulate a population of candidate solutions to find a good solution
More informationA Multiobjective GO based Approach to Protein Complex Detection
Available online at www.sciencedirect.com Procedia Technology 4 (2012 ) 555 560 C3IT-2012 A Multiobjective GO based Approach to Protein Complex Detection Sumanta Ray a, Moumita De b, Anirban Mukhopadhyay
More informationOptimization of Threshold for Energy Based Spectrum Sensing Using Differential Evolution
Wireless Engineering and Technology 011 130-134 doi:10.436/wet.011.3019 Published Online July 011 (http://www.scirp.org/journal/wet) Optimization of Threshold for Energy Based Spectrum Sensing Using Differential
More informationAn Analysis on Recombination in Multi-Objective Evolutionary Optimization
An Analysis on Recombination in Multi-Objective Evolutionary Optimization Chao Qian, Yang Yu, Zhi-Hua Zhou National Key Laboratory for Novel Software Technology Nanjing University, Nanjing 20023, China
More informationPopulation-Based Incremental Learning with Immigrants Schemes in Changing Environments
Population-Based Incremental Learning with Immigrants Schemes in Changing Environments Michalis Mavrovouniotis Centre for Computational Intelligence (CCI) School of Computer Science and Informatics De
More informationA Parallel Evolutionary Approach to Multi-objective Optimization
A Parallel Evolutionary Approach to Multi-objective Optimization Xiang Feng Francis C.M. Lau Department of Computer Science, The University of Hong Kong, Hong Kong Abstract Evolutionary algorithms have
More informationDepartment of Artificial Complex Systems Engineering, Graduate School of Engineering, Hiroshima University, Higashi-Hiroshima , Japan
Advances in Operations Research Volume 2009, Article ID 372548, 17 pages doi:10.1155/2009/372548 Research Article An Interactive Fuzzy Satisficing Method for Multiobjective Nonlinear Integer Programming
More informationIEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION, VOL. XX, NO.X, XXXX 1
This article has been accepted for publication in a future issue of this journal, but has not been fully edited. Content may change prior to final publication. Citation information: DOI.9/TEVC.27.273778,
More informationCSC 4510 Machine Learning
10: Gene(c Algorithms CSC 4510 Machine Learning Dr. Mary Angela Papalaskari Department of CompuBng Sciences Villanova University Course website: www.csc.villanova.edu/~map/4510/ Slides of this presenta(on
More informationA Study on Non-Correspondence in Spread between Objective Space and Design Variable Space in Pareto Solutions
A Study on Non-Correspondence in Spread between Objective Space and Design Variable Space in Pareto Solutions Tomohiro Yoshikawa, Toru Yoshida Dept. of Computational Science and Engineering Nagoya University
More informationUsefulness of infeasible solutions in evolutionary search: an empirical and mathematical study
Edith Cowan University Research Online ECU Publications 13 13 Usefulness of infeasible solutions in evolutionary search: an empirical and mathematical study Lyndon While Philip Hingston Edith Cowan University,
More informationAn Improved Differential Evolution Trained Neural Network Scheme for Nonlinear System Identification
International Journal of Automation and Computing 6(2), May 2009, 137-144 DOI: 10.1007/s11633-009-0137-0 An Improved Differential Evolution Trained Neural Network Scheme for Nonlinear System Identification
More informationDE/BBO: A Hybrid Differential Evolution with Biogeography-Based Optimization for Global Numerical Optimization
1 : A Hybrid Differential Evolution with Biogeography-Based Optimization for Global Numerical Optimization Wenyin Gong, Zhihua Cai, and Charles X. Ling, Senior Member, IEEE Abstract Differential Evolution
More informationProgram Evolution by Integrating EDP and GP
Program Evolution by Integrating EDP and GP Kohsuke Yanai, Hitoshi Iba Dept. of Frontier Informatics, Graduate School of Frontier Science, The University of Tokyo. 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8654,
More informationOptimizing Surface Profiles during Hot Rolling: A Genetic Algorithms Based Multi-objective Optimization
Optimizing Surface Profiles during Hot Rolling: A Genetic Algorithms Based Multi-objective Optimization N. Chakraborti 1, B. Siva Kuamr 1, V. Satish Babu 1, S. Moitra 1, and A. Mukhopadhyay 1 Department
More informationEstimating Traffic Accidents in Turkey Using Differential Evolution Algorithm
SSP - JOURNAL OF CIVIL ENGINEERING Vol. 12, Issue 1, 2017 DOI: 10.1515/sspjce-2017-0008 Estimating Traffic Accidents in Turkey Using Differential Evolution Algorithm Ali Payıdar Akgüngör, Ersin Korkmaz
More informationInter-Relationship Based Selection for Decomposition Multiobjective Optimization
Inter-Relationship Based Selection for Decomposition Multiobjective Optimization Ke Li, Sam Kwong, Qingfu Zhang, and Kalyanmoy Deb Department of Electrical and Computer Engineering Michigan State University,
More informationMultiobjective Optimization of Cement-bonded Sand Mould System with Differential Evolution
DOI: 10.7763/IPEDR. 013. V63. 0 Multiobjective Optimization of Cement-bonded Sand Mould System with Differential Evolution T. Ganesan 1, I. Elamvazuthi, Ku Zilati Ku Shaari 3, and P. Vasant + 1, 3 Department
More informationToward Effective Initialization for Large-Scale Search Spaces
Toward Effective Initialization for Large-Scale Search Spaces Shahryar Rahnamayan University of Ontario Institute of Technology (UOIT) Faculty of Engineering and Applied Science 000 Simcoe Street North
More informationDesign of Higher Order LP and HP Digital IIR Filter Using the Concept of Teaching-Learning Based Optimization
Design of Higher Order LP and HP Digital IIR Filter Using the Concept of Teaching-Learning Based Optimization DAMANPREET SINGH, J.S. DHILLON Department of Computer Science & Engineering Department of Electrical
More informationA Scalability Test for Accelerated DE Using Generalized Opposition-Based Learning
009 Ninth International Conference on Intelligent Systems Design and Applications A Scalability Test for Accelerated DE Using Generalized Opposition-Based Learning Hui Wang, Zhijian Wu, Shahryar Rahnamayan,
More informationCenter-based initialization for large-scale blackbox
See discussions, stats, and author profiles for this publication at: http://www.researchgate.net/publication/903587 Center-based initialization for large-scale blackbox problems ARTICLE FEBRUARY 009 READS
More informationMulti-objective Quadratic Assignment Problem instances generator with a known optimum solution
Multi-objective Quadratic Assignment Problem instances generator with a known optimum solution Mădălina M. Drugan Artificial Intelligence lab, Vrije Universiteit Brussel, Pleinlaan 2, B-1050 Brussels,
More informationA Generalized Quantum-Inspired Evolutionary Algorithm for Combinatorial Optimization Problems
A Generalized Quantum-Inspired Evolutionary Algorithm for Combinatorial Optimization Problems Julio M. Alegría 1 julio.alegria@ucsp.edu.pe Yván J. Túpac 1 ytupac@ucsp.edu.pe 1 School of Computer Science
More information