Generalization of Dominance Relation-Based Replacement Rules for Memetic EMO Algorithms
Tadahiko Murata 1, Shiori Kaige 2, and Hisao Ishibuchi 2

1 Department of Informatics, Kansai University, Ryozenji-cho, Takatsuki, Osaka, Japan
murata@res.kutc.kansai-u.ac.jp

2 Department of Industrial Engineering, Osaka Prefecture University, 1-1 Gakuen-cho, Sakai, Osaka, Japan
{hisaoi, shior}@ie.osakafu-u.ac.jp

Abstract. In this paper, we generalize replacement rules based on the dominance relation in multiobjective optimization. Two replacement rules based on the dominance relation are ordinarily employed in local search (LS) for multiobjective optimization: one replaces the current solution with a solution which dominates it, and the other replaces the current solution with any solution which is not dominated by it. The movable area of the LS with the first rule is very small when the number of objectives is large; with the second rule, on the other hand, the movable area is too large for the search to move efficiently. We generalize these two extreme rules by counting the number of improved objectives in a candidate solution for LS. We propose an LS with the generalized replacement rule for existing EMO algorithms, and demonstrate its effectiveness on knapsack problems with two, three, and four objectives.

1 Introduction

Since Schaffer's study [1], evolutionary algorithms have been applied to various multiobjective optimization problems for finding their Pareto-optimal solutions. Evolutionary algorithms for multiobjective optimization are now often referred to as EMO (evolutionary multiobjective optimization) algorithms. The task of an EMO algorithm is to find as many Pareto-optimal solutions as possible. In recent studies (e.g., [2]-[6]), emphasis was placed on the convergence speed to the Pareto-front as well as the diversity of solutions, and some form of elitism was used as an important ingredient of EMO algorithms.
It was shown that the use of elitism improved the convergence speed to the Pareto-front [5]. One promising approach for further improving the convergence speed to the Pareto-front is the use of local search in EMO algorithms. The hybridization of evolutionary algorithms with local search has already been investigated for single-objective optimization problems in many studies (e.g., [7], [8]). Such a hybrid algorithm is often referred to as a memetic algorithm. See Moscato [9] for an introduction to this field and [10]-[12] for recent developments.

E. Cantú-Paz et al. (Eds.): GECCO 2003, LNCS 2723, Springer-Verlag Berlin Heidelberg 2003

The hybridization with local search for multiobjective
optimization was first implemented in [13], [14] as a multiobjective genetic local search (MOGLS) algorithm, where a scalar fitness function with random weights was used for the selection of parents and for the local search applied to their offspring. Jaszkiewicz [15] improved the performance of the MOGLS by modifying its mechanism for selecting parents. While his MOGLS still used the scalar fitness function with random weights in selection and local search, it did not use roulette wheel selection over the entire population. Instead, a pair of parents was randomly selected from a pre-specified number of the best solutions with respect to the scalar fitness function with the current weights. This selection scheme can be viewed as a kind of mating restriction in EMO algorithms. Knowles & Corne [16] combined their Pareto archived evolution strategy (PAES [2], [4]) with a crossover operation to design a memetic PAES (M-PAES) [17]. In their M-PAES, the Pareto-dominance relation and a grid-type partition of the objective space were used for determining the acceptance (or rejection) of new solutions generated in genetic search and local search. The M-PAES had a special form of elitism inherent in the PAES. In those studies, the M-PAES was compared with the PAES, the MOGLS of Jaszkiewicz [15], and an EMO algorithm.

In the above-mentioned hybrid EMO algorithms (i.e., multiobjective memetic algorithms [13]-[17]), local search was applied to individuals in every generation. In some studies [18], [19], the application of local search in every generation is restricted, either by limiting local search to non-dominated solutions [18] or by introducing a tournament selection with a selection probability for choosing the candidate solutions for local search [19]. Another way of applying local search was proposed in [20], [21], where local search was applied to individuals only in the final generation.
In order to design a local search for multiobjective optimization, a rule for replacing the current solution with another solution should be defined in advance. Murata et al. [22] showed, through experimental results on pattern classification problems, that a scalar fitness function-based replacement rule was better than the dominance relation-based replacement rules. In the dominance relation-based replacement rules, the current solution is replaced with a solution which dominates it, or with a solution which is at least non-dominated with respect to it. Ishibuchi et al. [23] confirmed this observation through experimental results on scheduling problems. As mentioned in [17], [22], and [23], the replacement rule that accepts a non-dominated solution exerts a weak search pressure, since almost all pairs of solutions (a current solution and a candidate solution) are non-dominated with respect to each other, especially in problems with a large number of objectives. On the other hand, the replacement rule that accepts only a dominating solution does not work well either, because it is difficult to find a solution which dominates the current one. We generalize the replacement rules based on the dominance relation by counting the number of better objective values. Details of the dominance relation-based replacement rules are given in the next section. We employ a local search using the proposed replacement rule to improve the performance of existing EMO algorithms such as SPEA [3] and NSGA-II [6]. We apply these algorithms with the local search to multiobjective knapsack problems as benchmark problems [3], and show the effectiveness of our generalization.
2 Dominance Relation-Based Replacement Rules

2.1 Previous Extensions of the Dominance Relation

In order to improve the performance of local search using the dominance relation, several ideas for extending it have already been proposed in [17], [24], and [25]. Knowles and Corne [17] proposed a replacement rule by which the current solution is replaced with a non-dominated solution if that solution dominates other solutions in the set of non-dominated solutions obtained so far. Ikeda et al. [24] proposed α-dominance, where a small detriment in one or several of the objectives is permitted if an attractive improvement in the other objective(s) is achieved. While these two methods try to extend the area of solutions regarded as dominating the current solution, Laumanns et al. [25] proposed ε-dominance, where a solution with only a small improvement in every normalized objective does not dominate the current one. Each of these three methods can be considered an extension of the dominance relation.

Before explaining these extensions, we state the dominance relation as defined in multiobjective optimization. Without loss of generality, we assume the following N-objective maximization problem:

Maximize z = (f_1(x), f_2(x), ..., f_N(x)), (1)
subject to x ∈ X, (2)

where z is the objective vector with N objectives to be maximized, x is the decision vector, and X is the feasible region in the decision space. A solution x ∈ X is said to dominate another solution y ∈ X if the following two conditions are satisfied:

f_i(x) ≥ f_i(y) for all i ∈ {1, 2, ..., N}, (3)
f_i(x) > f_i(y) for at least one i ∈ {1, 2, ..., N}. (4)

If there is no solution in X which dominates x, then x is said to be a Pareto-optimal solution. Fig. 1 shows that there are four areas of candidate solutions for the solution x in the case of two-objective problems.
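Conditions (3) and (4) can be checked directly in code. The following minimal Python sketch (the function name is ours, not from the paper) tests whether one objective vector dominates another under maximization:

```python
def dominates(x_obj, y_obj):
    """True if x_obj dominates y_obj (maximization): x is no worse in
    every objective (condition (3)) and strictly better in at least
    one objective (condition (4))."""
    no_worse = all(xi >= yi for xi, yi in zip(x_obj, y_obj))
    strictly_better = any(xi > yi for xi, yi in zip(x_obj, y_obj))
    return no_worse and strictly_better
```

For example, (5, 4) dominates (5, 3), while (5, 3) and (3, 5) are mutually non-dominated.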
When we employ this dominance relation in local search, the following two replacement rules can be used:

Rule A: Move to dominating solutions. Replace the solution x with a solution which dominates it (Area A in Fig. 1).

Rule B: Move to non-dominated solutions. Replace the solution x with a solution which is not dominated by x (Areas A-C in Fig. 1).

The movable area in the local search with Rule A is very small when the number of objectives is large. With Rule B, on the other hand, it is too large for the search to move efficiently. Therefore some extensions of the dominance relation should be considered. As shown in Fig. 2, Knowles and Corne [17] extended the area of dominating solutions using the non-dominated solutions obtained so far. Fig. 3 shows the area dominating the current solution as defined by the α-dominance relation [24]. While these two methods enlarge the area dominating the current solution, the ε-dominance relation [25] reduces that area, as shown in Fig. 4. Since the aim of ε-dominance is to reduce the number of non-dominated solutions obtained under this dominance relation, an opposite strategy is used. However, we can see that the area of
non-dominated solutions (B and C) is also reduced, by the hatched area which consists of dominated solutions (D in Fig. 4). Therefore this can also be seen as a method for reducing the area of non-dominated solutions.

As we observe from Figs. 2-4, the area of non-dominated solutions is reduced by all three methods. However, each of them needs additional computational effort. The method in [17] needs to compare the candidate solution with the non-dominated solutions obtained so far, and its performance may depend on the quality of that obtained set. As for α-dominance [24], the decision maker (DM) has to define the parameters β and γ for every pair of objectives in advance. The DM also has to define a parameter ε in advance to use ε-dominance [25], and ε should be determined carefully, since a large ε makes the solutions obtained by this method far from the true Pareto-front.

Fig. 1. The area of candidate solutions which replace the current solution x by the dominance relation.

Fig. 2. The area of candidate solutions which replace the current solution x by the method in [17].

Fig. 3. The area of candidate solutions which replace the current solution x by the α-dominance [24].

Fig. 4. The area of candidate solutions which replace the current solution x by the ε-dominance [25].
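As an illustration of how ε-dominance shrinks the dominating area A in Fig. 4, the following sketch uses an additive acceptance test on already-normalized objectives; the parameter eps, the additive form, and the function name are our assumptions for illustration (Laumanns et al. [25] define the relation on normalized objectives and also give a multiplicative variant):

```python
def eps_accepts(cand, curr, eps=0.05):
    """Sketch of an epsilon-shrunk acceptance test (maximization): the
    candidate must exceed the current solution by at least eps in every
    normalized objective, so marginal improvements no longer count as
    dominating, as in the shrunk area A of Fig. 4."""
    return all(c >= u + eps for c, u in zip(cand, curr))
```

With eps = 0.05, a candidate that improves both objectives by 0.1 is accepted, while one whose improvement in some objective is only 0.02 is not.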
2.2 Generalization of the Dominance Relation

In this section, we generalize the two replacement rules (i.e., Rules A and B) of the previous section by counting the number of improved objectives. Fig. 5 shows that there are eight possible spaces around the solution x in the case of three-objective problems. Every solution in Space A dominates the solution x. On the other hand, the solution x dominates the solutions in Space H. Therefore Rule A allows the current solution to move to a solution in only one space, Space A. On the other hand, Rule B enables it to move to neighborhood solutions in all spaces except the dominated space H in Fig. 5. This means that (2^N - 1) spaces out of 2^N are allowed in Rule B. We can see that the number of accepted spaces is extreme in both cases: while the number of accepted spaces is only one for Rule A, it is (2^N - 1) for Rule B.

We generalize these two extreme cases by counting the number of improved objectives. The number of improved objectives relative to the solution x differs in each space. For example, Fig. 5 shows that the number of improved objectives for a solution in Space A is three, while it is zero for a solution in Space H. There are other spaces where the number of improved objectives is one or two: Spaces B, C, and E have two improved objectives, and Spaces D, F, and G have one. In the case of N-objective problems, the number of possible spaces around the current solution is 2^N, and the number of improved objectives varies from zero to N. We generalize the replacement rules A and B by considering the number of improved objectives d. That is, the current solution x is replaced with a solution which has d or more improved objective values. We have the following generalized rule:

Rule d: Move to d-improved solutions. Replace the current solution x with a solution which has d or more improved objectives.

Fig. 5. Eight spaces for the current solution in the case of three-objective problems.
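Rule d amounts to a simple count of strictly improved objectives. A minimal sketch (function names are ours):

```python
def num_improved(cand, curr):
    """Number of objectives in which the candidate is strictly better
    than the current solution (maximization)."""
    return sum(c > u for c, u in zip(cand, curr))

def accept_rule_d(cand, curr, d):
    """Generalized replacement Rule d: accept the candidate if it
    improves d or more objectives.  As stated in the paper, Rule N
    corresponds to Rule A and Rule 1 corresponds to Rule B."""
    return num_improved(cand, curr) >= d
```

For instance, a candidate that improves two of three objectives is accepted under Rule 2 but rejected under Rule 3.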
Fig. 6. Generic form of our local search part and EMO part (initialization produces the initial population; the EMO part and the local search part then alternately produce the new and improved populations).

By varying the value of d, we have the following rules, where N is the number of objectives:

Rule N: Accept a solution which has N better objective values.
Rule N-1: Accept a solution which has N-1 or more better objective values.
...
Rule 2: Accept a solution which has at least two better objective values.
Rule 1: Accept a solution which has at least one better objective value.

Therefore, Rules A and B in Subsection 2.1 are Rule N and Rule 1 of the proposed rule, respectively.

3 Local Search Using the Generalized Replacement Rule

The outline of our local search can be written in the generic form of Fig. 6, which shows the basic structure of simple memetic algorithms. As Fig. 6 suggests, our local search part can be combined with any EMO algorithm. For other types of memetic algorithms, see Krasnogor [26], where a taxonomy of memetic algorithms is given using an index number D; this type of memetic algorithm is a D = 4 memetic algorithm in his taxonomy (for details, see [26]). We design our local search as follows:

[Proposed Local Search] Iterate the following seven steps N_pop times, where N_pop is the number of solutions in the population governed by genetic operations such as crossover and mutation in an EMO algorithm. Then replace the current population with the N_pop solutions obtained by the following steps.

Step 1: Randomly choose two individuals from the current population.
Step 2: Count the number of better objective values between the two solutions. Select the solution which has the larger number of better objective values.
Step 3: Select another solution from the current population, and go back to Step 2 until t solutions from the current population have been compared.
Step 4: Apply local search with the local search probability p_LS. If it is applied, go to Step 5. If not, go to Step 7.
Step 5: Generate a neighborhood solution of the current solution, and calculate the objective values of the generated solution. Count the number of objectives improved by the generated solution.
Step 6: If the number of improved objective values is d or more, replace the current solution with the generated solution, and go back to Step 5 to examine the neighborhood of the generated solution. If not, go back to Step 5 until the number of examined solutions for the current solution becomes k. If there is no better solution among the k neighborhood solutions, go to Step 7.
Step 7: Go back to Step 1 until N_pop solutions have been selected for local search.

When local search is applied to the selected solution in Step 4, the final solution obtained by the local search is included in the next population. If local search is not applied, the selected solution itself is included in the next population. Steps 1-3 can therefore be regarded as a tournament selection of candidate solutions for local search. In this tournament selection, we also employ the idea of the generalized replacement rule: we select a solution with respect to the number of better objective values among t solutions. We use the local search probability p_LS to decrease the number of solutions to which local search is applied, so that local search is not applied to all the selected solutions. If local search were applied to every solution in the population, computation time might be wasted on improving dominated solutions. Moreover, we also employ the number of examined solutions k in Step 6 in order to control the balance between genetic search and local search.
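Steps 1-7 above can be sketched as follows. This is a minimal Python illustration under our own naming; `evaluate` maps a solution to its objective vector and `neighbor` returns a random neighborhood solution, both assumed problem-specific helpers:

```python
import random

def local_search(population, evaluate, neighbor, d, t=6, p_ls=0.1, k=3):
    """Sketch of the proposed local search (Steps 1-7)."""
    def better_count(a, b):
        # Number of objectives in which a beats b (maximization).
        return sum(x > y for x, y in zip(evaluate(a), evaluate(b)))

    next_population = []
    for _ in range(len(population)):                  # Step 7 loop
        winner = random.choice(population)            # Step 1
        for _ in range(t - 1):                        # Steps 2-3: tournament
            rival = random.choice(population)
            if better_count(rival, winner) > better_count(winner, rival):
                winner = rival
        if random.random() < p_ls:                    # Step 4
            improved = True
            while improved:                           # Steps 5-6
                improved = False
                for _ in range(k):                    # up to k neighbors
                    cand = neighbor(winner)
                    if better_count(cand, winner) >= d:   # Rule d
                        winner, improved = cand, True
                        break
        next_population.append(winner)
    return next_population
```

On a toy problem where solutions are integers, both objectives equal the solution value, and the neighborhood move is a bounded increment, the procedure climbs to the bound when p_ls = 1.0.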
Since the proposed local search can be applied to any EMO algorithm, we apply it to SPEA [3] and NSGA-II [6]. In order to show its effectiveness, we employ multiobjective knapsack problems [3]. We show the results of computer simulations in the next section.

4 Computer Simulations on Multiobjective Knapsack Problems

4.1 Multiobjective Knapsack Problems

We applied the EMO algorithms with the proposed local search to the multiobjective knapsack problems of [3]. Those test problems are available from the web site of the first author of [3]. Generally, a 0/1 knapsack problem consists of a set of items, a weight and a profit associated with each item, and an upper bound on the capacity of the knapsack. The task is to find the subset of items which maximizes the total profit while keeping the total weight of the items within the capacity. This single-objective problem was extended to the multiobjective case in [3] by allowing an arbitrary number of knapsacks. In the multiobjective knapsack problem, there are m items and N knapsacks. The profits of items, weights of items, and capacities of knapsacks are denoted as follows:
p_ij: profit of item j according to knapsack i, (5)
w_ij: weight of item j according to knapsack i, (6)
c_i: capacity of knapsack i, (7)
where j = 1, ..., m and i = 1, ..., N. (8)

The decision vector of this problem is x = (x_1, x_2, ..., x_m), where x_j is a binary decision variable: x_j = 0 means that item j is not included in the knapsacks, and x_j = 1 means that it is included. Thus the N-objective problem can be written as follows:

Maximize z = (f_1(x), f_2(x), ..., f_N(x)), (9)
subject to x_j ∈ {0, 1} and Σ_{j=1}^{m} w_ij x_j ≤ c_i for i = 1, ..., N, (10)

where each objective function has the following form:

f_i(x) = Σ_{j=1}^{m} p_ij x_j for i = 1, ..., N. (11)

On the web site of the first author of [3], problems with 100, 250, 500, and 750 items for 2, 3, and 4 knapsacks are available. We employed SPEA [3] and NSGA-II [6] as representative EMO algorithms; both are known as high-performance algorithms for multiobjective problems. We defined the parameters of both algorithms in preliminary experiments as shown in Table 1. We employed one-point crossover with a crossover rate of 0.8 and bit-flip mutation with a mutation rate of 0.01 per bit. In our local search, we specified the tournament size as t = 6, the local search probability as p_LS = 0.1, and the number of examined solutions as k = 3. We used these parameters commonly for SPEA and NSGA-II. We generated 30 different sets of initial solutions and applied each EMO algorithm to them. We then applied the algorithms to the three- and four-objective knapsack problems with 250, 500, and 750 items to show the effectiveness of generalizing the replacement rule based on the dominance relation.

4.2 Experimental Results on a Two-Objective Problem

We show the effectiveness of our local search on a two-objective knapsack problem.
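As a brief aside, the formulation in Eqs. (5)-(11) above translates directly into code. The following sketch (our own helper, not from the paper) evaluates a 0/1 solution and checks the capacity constraints:

```python
def evaluate_knapsack(x, profits, weights, capacities):
    """Objective vector (f_1(x), ..., f_N(x)) as in Eq. (11) for a 0/1
    solution x; returns None if any capacity constraint of Eq. (10) is
    violated.  profits[i][j] and weights[i][j] follow Eqs. (5)-(6)."""
    for w_i, c_i in zip(weights, capacities):
        if sum(w * xj for w, xj in zip(w_i, x)) > c_i:
            return None  # infeasible: knapsack i over capacity
    return tuple(sum(p * xj for p, xj in zip(p_i, x)) for p_i in profits)
```

A feasible solution yields one total profit per knapsack, which is exactly the objective vector fed to the replacement rules above.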
We employed the 750-item problem with two knapsacks to examine the non-dominated solutions obtained by the EMO algorithms with and without our local search, applying each of the four EMO algorithm variants to the problem. In this case, we cannot show the effectiveness of the generalization in terms of the number of improved objective values d, because d can only be one or two for two-objective problems. We can, however, examine the performance of the basic architecture of our local search with d = 2.

First we obtained 30 sets of non-dominated solutions with each algorithm. In order to depict the results clearly, we employed the 50%-attainment surface [27]. An attainment surface is a kind of trade-off surface obtained from a single run of an EMO algorithm, and the 50%-attainment surface is the estimated attainment surface achieved by at least 50% of multiple runs. Figs. 7 and 8 show the 50%-attainment surfaces of the 30 sets of non-dominated solutions obtained by the four EMO algorithm variants. Each axis of the figures shows the total profit of one knapsack; in a two-objective problem, the total profit of each of the two knapsacks is maximized.
From Figs. 7 and 8, we can see that the 50%-attainment surface is improved by introducing the local search in the region of compromise solutions. Considering only a single objective, each of the original EMO algorithms (i.e., without the local search) found better solutions. Since we employed the local search with d = 2, the local search only allows a solution to move to a solution which is better in both objectives, i.e., a dominating solution. Therefore the attainment surface in the region of compromise solutions is improved by the local search. With the generalized replacement rule with d = 1, we could not obtain better attainment surfaces than the original EMO algorithms. As explained in [23], a drawback of the replacement rule which allows a solution to move to non-dominated solutions is that the current solution can be degraded by multiple moves. That may be the reason why the local search with d = 1 was not effective on this problem.

Table 1. Parameter settings for EMO algorithms: the number of evaluations, population size, and secondary population size in SPEA for each problem (objectives, items) among (2, 750), (3, 250), (3, 500), (3, 750), (4, 250), (4, 500), and (4, 750).

Fig. 7. 50%-attainment surface obtained by SPEA with/without local search (d = 2).

Fig. 8. 50%-attainment surface obtained by NSGA-II with/without local search (d = 2).

4.3 Experimental Results on Three- and Four-Objective Problems

In the case of two-objective problems we can depict the distribution of solution sets in two-dimensional graphs; it is difficult, however, to depict the distributions for problems with three or more objectives. In this paper, we therefore use the coverage metric [3] to compare two sets of non-dominated solutions obtained by two EMO algorithms. Let X', X'' ⊆ X be two sets of non-dominated solutions. The coverage metric is defined as follows:
C(X', X'') = |{a'' ∈ X'' | ∃ a' ∈ X' : a' ⪰ a''}| / |X''|, (12)

where a' ⪰ a'' means that a'' is dominated by or equal to a'. The value C(X', X'') = 1 means that all points in X'' are dominated by or equal to points in X'. On the other hand, C(X', X'') = 0 means that no solution in X'' is covered by the set X'. Note that both C(X', X'') and C(X'', X') have to be considered, since C(X', X'') is not necessarily equal to 1 - C(X'', X').

Table 2. SPEA for 3-objective problems (250, 500, and 750 items): coverage among No LS and d = 1, 2, 3.

Table 3. NSGA-II for 3-objective problems (250, 500, and 750 items): coverage among No LS and d = 1, 2, 3.

Table 4. SPEA for 4-objective problems (250, 500, and 750 items): coverage among No LS and d = 1, 2, 3, 4.

We applied the EMO algorithms with and without our local search to the three three-objective problems and the three four-objective problems using the parameters in Table 1. We varied the number of improved objective values d as d = 1, 2, 3 for the three-objective problems and d = 1, 2, 3, 4 for the four-objective problems. We therefore had four variants of each EMO algorithm for the three-objective problems and five for the four-objective problems. We compared each pair of non-dominated solution sets using the coverage metric, and calculated average values over 30 trials. Tables 2-5 summarize the results for each EMO algorithm, with the coverage values averaged over the three numbers of items. The second column of Tables 2 and 3 shows the degree to which the sets obtained by the original algorithm are covered by those obtained by the algorithm with the proposed local search. For example, the cell in the 4th row and second column of Table 2 shows the percentage of solutions obtained by the original EMO algorithm (SPEA with no LS) that are covered by solutions obtained by SPEA with the local search (d = 2).
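Equation (12) is straightforward to compute. A minimal sketch (our naming) for maximization problems:

```python
def coverage(x1, x2):
    """Coverage metric C(X', X'') of Eq. (12): the fraction of
    objective vectors in x2 that are dominated by or equal to at
    least one objective vector in x1 (maximization)."""
    def covers(a, b):
        # a covers b: a is at least as good as b in every objective.
        return all(ai >= bi for ai, bi in zip(a, b))
    return sum(any(covers(a, b) for a in x1) for b in x2) / len(x2)
```

As the text notes, the metric is not symmetric: for example, coverage([(2, 2)], [(1, 1), (3, 1)]) is 0.5, while coverage([(1, 1), (3, 1)], [(2, 2)]) is 0.0.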
From Tables 2-5, we can see that better solution sets were obtained by our local search with d = 2 for the three-objective problems, and with d = 2, 3 for the four-objective problems.
Table 5. NSGA-II for 4-objective problems (250, 500, and 750 items): coverage among No LS and d = 1, 2, 3, 4.

Due to the page limitation, we do not show detailed results for each problem. Further information is available on the first author's web site ( ~murata/research.html). Through the experiments, we found that the proposed local search was more effective on larger problems: for both the three- and four-objective cases, it was more effective on the 500-item problems than on the 250-item problems, and on the 750-item problems than on the 500-item problems.

5 Conclusion and Future Works

In this paper, we generalized the replacement rules based on the dominance relation for local search in multiobjective optimization. Simulation results on knapsack problems with three and four objectives showed the effectiveness of the generalized replacement rule obtained by introducing the number of improved objectives. As shown in the experimental results for the two-objective problem, the proposed local search is weak at improving individual objective values; mitigating this weakness within the generalized rule is a topic for future work. Since we applied the proposed local search only to knapsack problems in this paper, applying it to other types of problems, such as permutation problems and function approximation problems, is also left for future work.

References

[1] J. D. Schaffer, Multiple objective optimization with vector evaluated genetic algorithms, Proc. of 1st Int'l Conf. on Genetic Algorithms and Their Applications, 1985.
[2] J. D. Knowles and D. W. Corne, The Pareto archived evolution strategy: A new baseline algorithm for Pareto multiobjective optimization, Proc. of 1999 Congress on Evolutionary Computation.
[3] E. Zitzler and L. Thiele, Multiobjective evolutionary algorithms: A comparative case study and the strength Pareto approach, IEEE Trans. on Evolutionary Computation, 3 (4), pp. 257-271, 1999.
[4] J. D. Knowles and D. W. Corne, Approximating the nondominated front using the Pareto archived evolution strategy, Evolutionary Computation, 8 (2), pp. 149-172, 2000.
[5] E. Zitzler, K. Deb, and L. Thiele, Comparison of multiobjective evolutionary algorithms: Empirical results, Evolutionary Computation, 8 (2), pp. 173-195, 2000.
[6] K. Deb, A. Pratap, S. Agarwal, and T.
Meyarivan, A fast and elitist multiobjective genetic algorithm: NSGA-II, IEEE Trans. on Evolutionary Computation, 6 (2), pp. 182-197, 2002.
[7] P. Merz and B. Freisleben, Genetic local search for the TSP: New results, Proc. of 4th IEEE Int'l Conf. on Evolutionary Computation, 1997.
[8] N. Krasnogor and J. Smith, A memetic algorithm with self-adaptive local search: TSP as a case study, Proc. of 2000 Genetic and Evolutionary Computation Conf.
[9] P. Moscato, Memetic algorithms: A short introduction, in D. Corne, F. Glover, and M. Dorigo (eds.), New Ideas in Optimization, McGraw-Hill, Maidenhead, 1999.
[10] W. E. Hart, N. Krasnogor, and J. Smith (eds.), First Workshop on Memetic Algorithms (WOMA I), in Proc. of 2000 Genetic and Evolutionary Computation Conf. Workshop Program.
[11] W. E. Hart, N. Krasnogor, and J. Smith (eds.), Second Workshop on Memetic Algorithms (WOMA II), in Proc. of 2001 Genetic and Evolutionary Computation Conf. Workshop Program.
[12] W. E. Hart, N. Krasnogor, and J. Smith (eds.), Proc. of Third Workshop on Memetic Algorithms (WOMA III), 2002.
[13] H. Ishibuchi and T. Murata, Multi-objective genetic local search algorithm, Proc. of 3rd IEEE Int'l Conf. on Evolutionary Computation, 1996.
[14] H. Ishibuchi and T. Murata, A multi-objective genetic local search algorithm and its application to flowshop scheduling, IEEE Trans. on Systems, Man, and Cybernetics - Part C: Applications and Reviews, 28 (3), 1998.
[15] A. Jaszkiewicz, Genetic local search for multi-objective combinatorial optimization, European Journal of Operational Research, 137 (1), pp. 50-71, 2002.
[16] J. D. Knowles and D. W. Corne, M-PAES: A memetic algorithm for multiobjective optimization, Proc. of 2000 Congress on Evolutionary Computation.
[17] J. D. Knowles and D. W. Corne, A comparison of diverse approaches to memetic multiobjective combinatorial optimization, Proc. of 2000 Genetic and Evolutionary Computation Conf. Workshop Program.
[18] T. Murata, H. Nozawa, Y. Tsujimura, M. Gen, and H. Ishibuchi, Effect of local search on the performance of cellular multi-objective genetic algorithms for designing fuzzy rule-based classification systems, Proc. of 2002 Congress on Evolutionary Computation.
[19] H. Ishibuchi, T. Yoshida, and T. Murata, Selection of initial solutions for local search in multi-objective genetic local search, Proc. of 2002 Congress on Evolutionary Computation.
[20] K. Deb and T. Goel, A hybrid multi-objective evolutionary approach to engineering shape design, Proc. of 1st Int'l Conf.
on Evolutionary Multi-Criterion Optimization, 2001.
[21] E. Talbi, M. Rahoual, M. H. Mabed, and C. Dhaenens, A hybrid evolutionary approach for multicriteria optimization problems: Application to the flow shop, Proc. of 1st Int'l Conf. on Evolutionary Multi-Criterion Optimization, 2001.
[22] T. Murata, H. Nozawa, H. Ishibuchi, and M. Gen, Modification of local search directions for non-dominated solutions in cellular multiobjective genetic algorithms for pattern classification problems, Proc. of 2nd Int'l Conf. on Evolutionary Multi-Criterion Optimization, 2003 (to appear).
[23] H. Ishibuchi, T. Yoshida, and T. Murata, Balance between genetic search and local search in memetic algorithms for multiobjective permutation flowshop scheduling, IEEE Trans. on Evolutionary Computation (to appear).
[24] K. Ikeda, H. Kita, and S. Kobayashi, Failure of Pareto-based MOEAs: Does non-dominated really mean near to optimal? Proc. of 2001 Congress on Evolutionary Computation.
[25] M. Laumanns, L. Thiele, K. Deb, and E. Zitzler, Combining convergence and diversity in evolutionary multiobjective optimization, Evolutionary Computation, 10 (3), pp. 263-282, 2002.
[26] N. Krasnogor, Studies on the theory and design space of memetic algorithms, Ph.D. Thesis, University of the West of England, Bristol, 2002.
[27] C. M. Fonseca and P. J. Fleming, On the performance assessment and comparison of stochastic multiobjective optimizers, Proc. of Parallel Problem Solving from Nature IV, 1996.
Robust Multi-Objective Optimization in High Dimensional Spaces André Sülflow, Nicole Drechsler, and Rolf Drechsler Institute of Computer Science University of Bremen 28359 Bremen, Germany {suelflow,nd,drechsle}@informatik.uni-bremen.de
More informationInteractive Evolutionary Multi-Objective Optimization and Decision-Making using Reference Direction Method
Interactive Evolutionary Multi-Objective Optimization and Decision-Making using Reference Direction Method Kalyanmoy Deb Department of Mechanical Engineering Indian Institute of Technology Kanpur Kanpur,
More informationRuntime Analyses for Using Fairness in Evolutionary Multi-Objective Optimization
Runtime Analyses for Using Fairness in Evolutionary Multi-Objective Optimization Tobias Friedrich 1, Christian Horoba 2, and Frank Neumann 1 1 Max-Planck-Institut für Informatik, Saarbrücken, Germany 2
More informationTECHNISCHE UNIVERSITÄT DORTMUND REIHE COMPUTATIONAL INTELLIGENCE COLLABORATIVE RESEARCH CENTER 531
TECHNISCHE UNIVERSITÄT DORTMUND REIHE COMPUTATIONAL INTELLIGENCE COLLABORATIVE RESEARCH CENTER 531 Design and Management of Complex Technical Processes and Systems by means of Computational Intelligence
More informationExpected Running Time Analysis of a Multiobjective Evolutionary Algorithm on Pseudo-boolean Functions
Expected Running Time Analysis of a Multiobjective Evolutionary Algorithm on Pseudo-boolean Functions Nilanjan Banerjee and Rajeev Kumar Department of Computer Science and Engineering Indian Institute
More informationDESIGN OF OPTIMUM CROSS-SECTIONS FOR LOAD-CARRYING MEMBERS USING MULTI-OBJECTIVE EVOLUTIONARY ALGORITHMS
DESIGN OF OPTIMUM CROSS-SECTIONS FOR LOAD-CARRING MEMBERS USING MULTI-OBJECTIVE EVOLUTIONAR ALGORITHMS Dilip Datta Kanpur Genetic Algorithms Laboratory (KanGAL) Deptt. of Mechanical Engg. IIT Kanpur, Kanpur,
More informationPlateaus Can Be Harder in Multi-Objective Optimization
Plateaus Can Be Harder in Multi-Objective Optimization Tobias Friedrich and Nils Hebbinghaus and Frank Neumann Max-Planck-Institut für Informatik, Campus E1 4, 66123 Saarbrücken, Germany Abstract In recent
More informationMultiobjective Optimization
Multiobjective Optimization MTH6418 S Le Digabel, École Polytechnique de Montréal Fall 2015 (v2) MTH6418: Multiobjective 1/36 Plan Introduction Metrics BiMADS Other methods References MTH6418: Multiobjective
More informationRuntime Analysis of Evolutionary Algorithms for the Knapsack Problem with Favorably Correlated Weights
Runtime Analysis of Evolutionary Algorithms for the Knapsack Problem with Favorably Correlated Weights Frank Neumann 1 and Andrew M. Sutton 2 1 Optimisation and Logistics, School of Computer Science, The
More informationThe Pickup and Delivery Problem: a Many-objective Analysis
The Pickup and Delivery Problem: a Many-objective Analysis Abel García-Nájera and Antonio López-Jaimes Universidad Autónoma Metropolitana, Unidad Cuajimalpa, Departamento de Matemáticas Aplicadas y Sistemas,
More informationAn Effective Chromosome Representation for Evolving Flexible Job Shop Schedules
An Effective Chromosome Representation for Evolving Flexible Job Shop Schedules Joc Cing Tay and Djoko Wibowo Intelligent Systems Lab Nanyang Technological University asjctay@ntuedusg Abstract As the Flexible
More informationDeposited on: 1 April 2009
Schütze, O. and Vasile, M. and Coello, C.A. (2008) Approximate solutions in space mission design. Lecture Notes in Computer Science, 5199. pp. 805-814. ISSN 0302-9743 http://eprints.gla.ac.uk/5049/ Deposited
More informationAn Improved Multi-Objective Genetic Algorithm for Solving Multi-objective Problems
Appl. Math. Inf. Sci. 7, No. 5, 1933-1941 (2013) 1933 Applied Mathematics & Information Sciences An International Journal http://dx.doi.org/10.12785/amis/070531 An Improved Multi-Objective Genetic Algorithm
More informationDiversity Based on Entropy: A Novel Evaluation Criterion in Multi-objective Optimization Algorithm
I.J. Intelligent Systems and Applications, 2012, 10, 113-124 Published Online September 2012 in MECS (http://www.mecs-press.org/) DOI: 10.5815/ijisa.2012.10.12 Diversity Based on Entropy: A Novel Evaluation
More informationAMULTIOBJECTIVE optimization problem (MOP) can
1 IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION 1 Letters 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 Decomposition-Based Multiobjective Evolutionary Algorithm with an Ensemble of Neighborhood Sizes Shi-Zheng
More informationCombinatorial Optimization of Stochastic Multi-objective Problems: an Application to the Flow-Shop Scheduling Problem
Author manuscript, published in "Evolutionary Multi-criterion Optimization (EMO 2007), Matsushima : Japan (2007)" DOI : 10.1007/978-3-540-70928-2_36 Combinatorial Optimization of Stochastic Multi-objective
More informationInteractive Evolutionary Multi-Objective Optimization and Decision-Making using Reference Direction Method
Interactive Evolutionary Multi-Objective Optimization and Decision-Making using Reference Direction Method ABSTRACT Kalyanmoy Deb Department of Mechanical Engineering Indian Institute of Technology Kanpur
More informationOn Performance Metrics and Particle Swarm Methods for Dynamic Multiobjective Optimization Problems
On Performance Metrics and Particle Swarm Methods for Dynamic Multiobjective Optimization Problems Xiaodong Li, Jürgen Branke, and Michael Kirley, Member, IEEE Abstract This paper describes two performance
More informationResearch Article A Novel Ranking Method Based on Subjective Probability Theory for Evolutionary Multiobjective Optimization
Mathematical Problems in Engineering Volume 2011, Article ID 695087, 10 pages doi:10.1155/2011/695087 Research Article A Novel Ranking Method Based on Subjective Probability Theory for Evolutionary Multiobjective
More informationTUTORIAL: HYPER-HEURISTICS AND COMPUTATIONAL INTELLIGENCE
TUTORIAL: HYPER-HEURISTICS AND COMPUTATIONAL INTELLIGENCE Nelishia Pillay School of Mathematics, Statistics and Computer Science University of KwaZulu-Natal South Africa TUTORIAL WEBSITE URL: http://titancs.ukzn.ac.za/ssci2015tutorial.aspx
More informationCovariance Matrix Adaptation in Multiobjective Optimization
Covariance Matrix Adaptation in Multiobjective Optimization Dimo Brockhoff INRIA Lille Nord Europe October 30, 2014 PGMO-COPI 2014, Ecole Polytechnique, France Mastertitelformat Scenario: Multiobjective
More informationAn Evolutionary Strategy for. Decremental Multi-Objective Optimization Problems
An Evolutionary Strategy or Decremental Multi-Objective Optimization Problems Sheng-Uei Guan *, Qian Chen and Wenting Mo School o Engineering and Design, Brunel University, UK Department o Electrical and
More informationTECHNISCHE UNIVERSITÄT DORTMUND REIHE COMPUTATIONAL INTELLIGENCE COLLABORATIVE RESEARCH CENTER 531
TECHNISCHE UNIVERSITÄT DORTMUND REIHE COMPUTATIONAL INTELLIGENCE COLLABORATIVE RESEARCH CENTER 531 Design and Management of Complex Technical Processes and Systems by means of Computational Intelligence
More informationA Novel Multiobjective Formulation of the Robust Software Project Scheduling Problem
A Novel Multiobjective Formulation of the Robust Problem Francisco Chicano, Alejandro Cervantes, Francisco Luna, Gustavo Recio 1 / 30 Software projects usually involve many people and many resources that
More informationA Brief Introduction to Multiobjective Optimization Techniques
Università di Catania Dipartimento di Ingegneria Informatica e delle Telecomunicazioni A Brief Introduction to Multiobjective Optimization Techniques Maurizio Palesi Maurizio Palesi [mpalesi@diit.unict.it]
More informationComparison of Multi-Objective Genetic Algorithms in Optimizing Q-Law Low-Thrust Orbit Transfers
Comparison of Multi-Objective Genetic Algorithms in Optimizing Q-Law Low-Thrust Orbit Transfers Seungwon Lee, Paul v on Allmen, Wolfgang Fink, Anastassios E. Petropoulos, and Richard J. Terrile Jet Propulsion
More informationPerformance Measures for Dynamic Multi-Objective Optimization
Performance Measures for Dynamic Multi-Objective Optimization Mario Cámara 1, Julio Ortega 1, and Francisco de Toro 2 1 Dept. of Computer Technology and Architecture 2 Dept. of Signal Theory, Telematics
More informationA Study on Non-Correspondence in Spread between Objective Space and Design Variable Space in Pareto Solutions
A Study on Non-Correspondence in Spread between Objective Space and Design Variable Space in Pareto Solutions Tomohiro Yoshikawa, Toru Yoshida Dept. of Computational Science and Engineering Nagoya University
More informationGenetic Algorithms and Genetic Programming Lecture 17
Genetic Algorithms and Genetic Programming Lecture 17 Gillian Hayes 28th November 2006 Selection Revisited 1 Selection and Selection Pressure The Killer Instinct Memetic Algorithms Selection and Schemas
More informationPrediction-based Population Re-initialization for Evolutionary Dynamic Multi-objective Optimization
Prediction-based Population Re-initialization for Evolutionary Dynamic Multi-objective Optimization Aimin Zhou 1, Yaochu Jin 2, Qingfu Zhang 1, Bernhard Sendhoff 2, and Edward Tsang 1 1 Department of Computer
More informationEffect of Rule Weights in Fuzzy Rule-Based Classification Systems
506 IEEE TRANSACTIONS ON FUZZY SYSTEMS, VOL. 9, NO. 4, AUGUST 2001 Effect of Rule Weights in Fuzzy Rule-Based Classification Systems Hisao Ishibuchi, Member, IEEE, and Tomoharu Nakashima, Member, IEEE
More informationTECHNISCHE UNIVERSITÄT DORTMUND REIHE COMPUTATIONAL INTELLIGENCE COLLABORATIVE RESEARCH CENTER 531
TECHNISCHE UNIVERSITÄT DORTMUND REIHE COMPUTATIONAL INTELLIGENCE COLLABORATIVE RESEARCH CENTER 531 Design and Management of Complex Technical Processes and Systems by means of Computational Intelligence
More informationA GA Mechanism for Optimizing the Design of attribute-double-sampling-plan
A GA Mechanism for Optimizing the Design of attribute-double-sampling-plan Tao-ming Cheng *, Yen-liang Chen Department of Construction Engineering, Chaoyang University of Technology, Taiwan, R.O.C. Abstract
More informationA Hybrid Metaheuristic for Multiobjective Unconstrained Binary Quadratic Programming
A Hybrid Metaheuristic for Multiobjective Unconstrained Binary Quadratic Programming Arnaud Liefooghe, Sébastien Verel, Jin-Kao Hao To cite this version: Arnaud Liefooghe, Sébastien Verel, Jin-Kao Hao.
More informationBi-objective Portfolio Optimization Using a Customized Hybrid NSGA-II Procedure
Bi-objective Portfolio Optimization Using a Customized Hybrid NSGA-II Procedure Kalyanmoy Deb 1, Ralph E. Steuer 2, Rajat Tewari 3, and Rahul Tewari 4 1 Department of Mechanical Engineering, Indian Institute
More informationEvolving Dynamic Multi-objective Optimization Problems. with Objective Replacement. Abstract
Evolving Dynamic Multi-objective Optimization Problems with Objective eplacement Sheng-Uei Guan, Qian Chen and Wenting Mo Department o Electrical and Computer Engineering National University o Singapore
More informationCSC 4510 Machine Learning
10: Gene(c Algorithms CSC 4510 Machine Learning Dr. Mary Angela Papalaskari Department of CompuBng Sciences Villanova University Course website: www.csc.villanova.edu/~map/4510/ Slides of this presenta(on
More informationAMultiobjective optimisation problem (MOP) can be
MOEA/D with Tabu Search for Multiobjective Permutation Flow Shop Scheduling Problems Ahmad Alhindi, Student Member, IEEE, and Qingfu Zhang, Senior Member, IEEE, Abstract Multiobjective Evolutionary Algorithm
More informationEfficient Non-domination Level Update Method for Steady-State Evolutionary Multi-objective. optimization
Efficient Non-domination Level Update Method for Steady-State Evolutionary Multi-objective Optimization Ke Li, Kalyanmoy Deb, Fellow, IEEE, Qingfu Zhang, Senior Member, IEEE, and Qiang Zhang COIN Report
More informationFacility Location Problems with Random Demands in a Competitive Environment
Facility Location Problems with Random Demands in a Competitive Environment Takeshi Uno, Hideki Katagiri, and Kosuke Kato Abstract This paper proposes a new location problem of competitive facilities,
More informationMultiobjective Optimisation An Overview
ITNPD8/CSCU9YO Multiobjective Optimisation An Overview Nadarajen Veerapen (nve@cs.stir.ac.uk) University of Stirling Why? Classic optimisation: 1 objective Example: Minimise cost Reality is often more
More informationA Hybrid Metaheuristic for Multiobjective Unconstrained Binary Quadratic Programming
A Hybrid Metaheuristic for Multiobjective Unconstrained Binary Quadratic Programming Arnaud Liefooghe, Sébastien Verel, Jin-Kao Hao To cite this version: Arnaud Liefooghe, Sébastien Verel, Jin-Kao Hao.
More informationMulti-objective Quadratic Assignment Problem instances generator with a known optimum solution
Multi-objective Quadratic Assignment Problem instances generator with a known optimum solution Mădălina M. Drugan Artificial Intelligence lab, Vrije Universiteit Brussel, Pleinlaan 2, B-1050 Brussels,
More informationUniversiteit Leiden Opleiding Informatica
Universiteit Leiden Opleiding Informatica Portfolio Selection from Large Molecular Databases using Genetic Algorithms Name: Studentnr: Alice de Vries 0980013 Date: July 2, 2014 1st supervisor: 2nd supervisor:
More informationGDE3: The third Evolution Step of Generalized Differential Evolution
GDE3: The third Evolution Step of Generalized Differential Evolution Saku Kukkonen Kanpur Genetic Algorithms Laboratory (KanGAL) Indian Institute of Technology Kanpur Kanpur, PIN 28 16, India saku@iitk.ac.in,
More informationPerformance Assessment of Generalized Differential Evolution 3 with a Given Set of Constrained Multi-Objective Test Problems
Performance Assessment of Generalized Differential Evolution 3 with a Given Set of Constrained Multi-Objective Test Problems Saku Kukkonen, Student Member, IEEE and Jouni Lampinen Abstract This paper presents
More informationMultiobjective Evolutionary Algorithms. Pareto Rankings
Monografías del Semin. Matem. García de Galdeano. 7: 7 3, (3). Multiobjective Evolutionary Algorithms. Pareto Rankings Alberto, I.; Azcarate, C.; Mallor, F. & Mateo, P.M. Abstract In this work we present
More informationSolving Multi-Criteria Optimization Problems with Population-Based ACO
Solving Multi-Criteria Optimization Problems with Population-Based ACO Michael Guntsch 1 and Martin Middendorf 2 1 Institute for Applied Computer Science and Formal Description Methods University of Karlsruhe,
More informationComputational statistics
Computational statistics Combinatorial optimization Thierry Denœux February 2017 Thierry Denœux Computational statistics February 2017 1 / 37 Combinatorial optimization Assume we seek the maximum of f
More informationExploring the Performance of Stochastic Multiobjective Optimisers with the Second-Order Attainment Function
Exploring the Performance of Stochastic Multiobjective Optimisers with the Second-Order Attainment Function Carlos M. Fonseca 1, Viviane Grunert da Fonseca 1,2, and Luís Paquete 3,4 1 CSI Centro de Sistemas
More informationQuadratic Multiple Knapsack Problem with Setups and a Solution Approach
Proceedings of the 2012 International Conference on Industrial Engineering and Operations Management Istanbul, Turkey, July 3 6, 2012 Quadratic Multiple Knapsack Problem with Setups and a Solution Approach
More informationMetaheuristics and Local Search
Metaheuristics and Local Search 8000 Discrete optimization problems Variables x 1,..., x n. Variable domains D 1,..., D n, with D j Z. Constraints C 1,..., C m, with C i D 1 D n. Objective function f :
More informationGeometric Semantic Genetic Programming (GSGP): theory-laden design of semantic mutation operators
Geometric Semantic Genetic Programming (GSGP): theory-laden design of semantic mutation operators Andrea Mambrini 1 University of Birmingham, Birmingham UK 6th June 2013 1 / 33 Andrea Mambrini GSGP: theory-laden
More informationUsefulness of infeasible solutions in evolutionary search: an empirical and mathematical study
Edith Cowan University Research Online ECU Publications 13 13 Usefulness of infeasible solutions in evolutionary search: an empirical and mathematical study Lyndon While Philip Hingston Edith Cowan University,
More informationLecture 9 Evolutionary Computation: Genetic algorithms
Lecture 9 Evolutionary Computation: Genetic algorithms Introduction, or can evolution be intelligent? Simulation of natural evolution Genetic algorithms Case study: maintenance scheduling with genetic
More informationRunning Time Analysis of a Multiobjective Evolutionary Algorithm on Simple and Hard Problems
112 Running Time Analysis of a Multiobjective Evolutionary Algorithm on Simple and Hard Problems Rajeev Kumar and Nilanjan Banerjee Department of Computer Science and Engineering Indian Institute of Technology
More informationEvolutionary Multitasking Across Multi and Single-Objective Formulations for Improved Problem Solving
Evolutionary Multitasking Across Multi and Single-Objective Formulations for Improved Problem Solving Bingshui Da, Abhishek Gupta, Yew-Soon Ong, Liang Feng and Chen Wang School of Computer Engineering,
More informationMonotonicity Analysis, Evolutionary Multi-Objective Optimization, and Discovery of Design Principles
Monotonicity Analysis, Evolutionary Multi-Objective Optimization, and Discovery of Design Principles Kalyanmoy Deb and Aravind Srinivasan Kanpur Genetic Algorithms Laboratory (KanGAL) Indian Institute
More informationOn cone based decompositions of proper Pareto optimality
Noname manuscript No. (will be inserted by the editor) On cone based decompositions of proper Pareto optimality Marlon A. Braun Pradyumn K. Shukla Hartmut Schmeck Received: date / Accepted: date Abstract
More informationMulti-Objective Pareto Optimization of Centrifugal Pump Using Genetic Algorithms
Proceedings of the 11th WSEAS International Conference on COMPUTERS, Agios Niolaos, Crete Island, Greece, July 6-8, 007 135 Multi-Obective Pareto Optimization of Centrifugal Pump Using Genetic Algorithms
More informationComparative Study of Basic Constellation Models for Regional Satellite Constellation Design
2016 Sixth International Conference on Instrumentation & Measurement, Computer, Communication and Control Comparative Study of Basic Constellation Models for Regional Satellite Constellation Design Yu
More informationMULTIOBJECTIVE EVOLUTIONARY ALGORITHM FOR INTEGRATED TIMETABLE OPTIMIZATION WITH VEHICLE SCHEDULING ASPECTS
MULTIOBJECTIVE EVOLUTIONARY ALGORITHM FOR INTEGRATED TIMETABLE OPTIMIZATION WITH VEHICLE SCHEDULING ASPECTS Michal Weiszer 1, Gabriel Fedoro 2, Zdene Čujan 3 Summary:This paper describes the implementation
More informationA Tabu Search Approach based on Strategic Vibration for Competitive Facility Location Problems with Random Demands
A Tabu Search Approach based on Strategic Vibration for Competitive Facility Location Problems with Random Demands Takeshi Uno, Hideki Katagiri, and Kosuke Kato Abstract This paper proposes a new location
More informationGenetic Algorithm for Solving the Economic Load Dispatch
International Journal of Electronic and Electrical Engineering. ISSN 0974-2174, Volume 7, Number 5 (2014), pp. 523-528 International Research Publication House http://www.irphouse.com Genetic Algorithm
More informationEvolving more efficient digital circuits by allowing circuit layout evolution and multi-objective fitness
Evolving more efficient digital circuits by allowing circuit layout evolution and multi-objective fitness Tatiana Kalganova Julian Miller School of Computing School of Computing Napier University Napier
More informationMulti Agent Collaborative Search Based on Tchebycheff Decomposition
Noname manuscript No. (will be inserted by the editor) Multi Agent Collaborative Search Based on Tchebycheff Decomposition Federico Zuiani Massimiliano Vasile Received: date / Accepted: date Abstract This
More informationEvolutionary Functional Link Interval Type-2 Fuzzy Neural System for Exchange Rate Prediction
Evolutionary Functional Link Interval Type-2 Fuzzy Neural System for Exchange Rate Prediction 3. Introduction Currency exchange rate is an important element in international finance. It is one of the chaotic,
More informationA Hybrid Metaheuristic for Multiobjective Unconstrained Binary Quadratic Programming
Author manuscript, published in "Applied Soft Computing (203) (to appear)" A Hybrid Metaheuristic for Multiobjective Unconstrained Binary Quadratic Programming Arnaud Liefooghe,a,b, Sébastien Verel c,
More informationGaussian EDA and Truncation Selection: Setting Limits for Sustainable Progress
Gaussian EDA and Truncation Selection: Setting Limits for Sustainable Progress Petr Pošík Czech Technical University, Faculty of Electrical Engineering, Department of Cybernetics Technická, 66 7 Prague
More informationBounded Approximation Algorithms
Bounded Approximation Algorithms Sometimes we can handle NP problems with polynomial time algorithms which are guaranteed to return a solution within some specific bound of the optimal solution within
More informationResearch Article Study on Parameter Optimization Design of Drum Brake Based on Hybrid Cellular Multiobjective Genetic Algorithm
Mathematical Problems in Engineering Volume 22, Article ID 73493, 23 pages doi:.55/22/73493 Research Article Study on Parameter Optimization Design of Drum Brake Based on Hybrid Cellular Multiobjective
More informationMetaheuristics and Local Search. Discrete optimization problems. Solution approaches
Discrete Mathematics for Bioinformatics WS 07/08, G. W. Klau, 31. Januar 2008, 11:55 1 Metaheuristics and Local Search Discrete optimization problems Variables x 1,...,x n. Variable domains D 1,...,D n,
More informationSolving High-Dimensional Multi-Objective Optimization Problems with Low Effective Dimensions
Proceedings of the Thirty-First AAAI Conference on Artificial Intelligence (AAAI-17) Solving High-Dimensional Multi-Objective Optimization Problems with Low Effective Dimensions Hong Qian, Yang Yu National
More informationAn evolutionary algorithm for multi-objective optimization of combustion processes
Center for Turbulence Research Annual Research Briefs 2 23 An evolutionary algorithm for multi-objective optimization of combustion processes By Dirk Büche, Peter Stoll AND Petros Koumoutsakos. Motivation
More informationA Comparison of Decoding Strategies for the 0/1 Multi-objective Unit Commitment Problem
A Comparison of Decoding Strategies for the 0/1 Multi-objective Unit Commitment Problem Sophie Jacquin, Lucien Mousin, Igor Machado, El-Ghazali Talbi, Laetitia Jourdan To cite this version: Sophie Jacquin,
More informationHypervolume Sharpe-Ratio Indicator: Formalization and First Theoretical Results
Hypervolume Sharpe-Ratio Indicator: Formalization and First Theoretical Results Andreia P. Guerreiro and Carlos M. Fonseca CISUC, Department of Informatics Engineering, University of Coimbra Pólo II, Pinhal
More informationAnt Colony Optimization for Multi-objective Optimization Problems
Ant Colony Optimization for Multi-objective Optimization Problems Inès Alaya 1,2, Christine Solnon 1, and Khaled Ghédira 2 1 LIRIS, UMR 5205 CNRS / University of Lyon 2 SOIE ICTAI 2007 Multi-objective
More informationMulti-objective genetic algorithm
Multi-objective genetic algorithm Robin Devooght 31 March 2010 Abstract Real world problems often present multiple, frequently conflicting, objectives. The research for optimal solutions of multi-objective
More informationSet-based Min-max and Min-min Robustness for Multi-objective Robust Optimization
Proceedings of the 2017 Industrial and Systems Engineering Research Conference K. Coperich, E. Cudney, H. Nembhard, eds. Set-based Min-max and Min-min Robustness for Multi-objective Robust Optimization
More informationUsing Evolutionary Techniques to Hunt for Snakes and Coils
Using Evolutionary Techniques to Hunt for Snakes and Coils Abstract The snake-in-the-box problem is a difficult problem in mathematics and computer science that deals with finding the longest-possible
More informationIEEE TRANSACTIONS ON SYSTEMS, MAN, AND CYBERNETICS PART C: APPLICATIONS AND REVIEWS, VOL. 32, NO. 1, FEBRUARY
IEEE TRANSACTIONS ON SYSTEMS, MAN, AND CYBERNETICS PART C: APPLICATIONS AND REVIEWS, VOL. 32, NO. 1, FEBRUARY 2002 31 Correspondence Statistical Analysis of the Main Parameters Involved in the Design of
More informationTIES598 Nonlinear Multiobjective Optimization A priori and a posteriori methods spring 2017
TIES598 Nonlinear Multiobjective Optimization A priori and a posteriori methods spring 2017 Jussi Hakanen jussi.hakanen@jyu.fi Contents A priori methods A posteriori methods Some example methods Learning
More informationDistance Metrics and Fitness Distance Analysis for the Capacitated Vehicle Routing Problem
MIC2005. The 6th Metaheuristics International Conference 603 Metrics and Analysis for the Capacitated Vehicle Routing Problem Marek Kubiak Institute of Computing Science, Poznan University of Technology
More informationA Reference Vector Guided Evolutionary Algorithm for Many-objective Optimization
IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION, VOL. XX, NO. X, XXXX XXXX A Reference Vector Guided Evolutionary Algorithm for Many-objective Optimization Ran Cheng, Yaochu Jin, Fellow, IEEE, Markus Olhofer
More informationAn Analysis on Recombination in Multi-Objective Evolutionary Optimization
An Analysis on Recombination in Multi-Objective Evolutionary Optimization Chao Qian, Yang Yu, Zhi-Hua Zhou National Key Laboratory for Novel Software Technology Nanjing University, Nanjing 20023, China
More informationCentric Selection: a Way to Tune the Exploration/Exploitation Trade-off
: a Way to Tune the Exploration/Exploitation Trade-off David Simoncini, Sébastien Verel, Philippe Collard, Manuel Clergue Laboratory I3S University of Nice-Sophia Antipolis / CNRS France Montreal, July
More informationProgram Evolution by Integrating EDP and GP
Program Evolution by Integrating EDP and GP Kohsuke Yanai and Hitoshi Iba Dept. of Frontier Informatics, Graduate School of Frontier Science, The University of Tokyo. 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8654,
More informationMulti-Objective Optimization of Two Dynamic Systems
Multi-Objective Optimization of Two Dynamic Systems Joseph M. Mahoney The Pennsylvania State University 204 EES Building University Park, PA 16802 jmm694@psu.edu Abstract This paper details two problems
More informationIEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION, VOL., NO.,
This article has been accepted for publication in a future issue of this journal, but has not been fully edited. Content may change prior to final publication. Citation information: DOI.9/TEVC.6.57, IEEE
More informationZebo Peng Embedded Systems Laboratory IDA, Linköping University
TDTS 01 Lecture 8 Optimization Heuristics for Synthesis Zebo Peng Embedded Systems Laboratory IDA, Linköping University Lecture 8 Optimization problems Heuristic techniques Simulated annealing Genetic
More information