Three Steps toward Tuning the Coordinate Systems in Nature-Inspired Optimization Algorithms


Yong Wang and Zhi-Zhong Liu
School of Information Science and Engineering, Central South University
ywang@csu.edu.cn

Outline
- Nature-Inspired Optimization Algorithms (NIOAs)
- Eigen Coordinate System
- I: Stochastic Tuning + Single Population Distribution Information
- II: Deterministic Tuning + Cumulative Population Distribution Information
- III: Adaptive Tuning + Cumulative Population Distribution Information
- Conclusion


The Main Paradigms of NIOAs
- Genetic algorithm (GA)
- Evolution strategy (ES)
- Evolutionary programming (EP)
- Ant colony optimization (ACO)
- Differential evolution (DE)
- Particle swarm optimization (PSO)
Among these, DE and PSO are the most popular paradigms in current studies.

Differential Evolution (1/4)
Differential evolution (DE) was proposed by Storn and Price in 1995. DE includes three main operators, i.e., mutation, crossover, and selection.
R. Storn and K. Price. Differential evolution - a simple and efficient adaptive scheme for global optimization over continuous spaces. Berkeley, CA, Tech. Rep. TR-95-012, 1995.
K. Price, R. Storn, and J. Lampinen. Differential Evolution: A Practical Approach to Global Optimization. Berlin, Germany: Springer-Verlag, 2005.

Differential Evolution (2/4)
DE has three main operators: mutation, crossover, and selection. Mutation + crossover = trial vector generation strategy.

Differential Evolution (3/4)
A classic DE version: DE/rand/1/bin.
Mutation: $v_i^g = x_{r1}^g + F \cdot (x_{r2}^g - x_{r3}^g)$, where $x_{r1}^g$ is the base vector, $x_{r2}^g - x_{r3}^g$ is the differential vector, and $F$ is the mutation scaling factor.
Crossover (binomial): $u_{i,j}^g = v_{i,j}^g$ if $rand_j \le CR$ or $j = j_{rand}$, and $u_{i,j}^g = x_{i,j}^g$ otherwise, where $CR$ is the crossover control parameter.
Selection: $x_i^{g+1} = u_i^g$ if $f(u_i^g) \le f(x_i^g)$, and $x_i^{g+1} = x_i^g$ otherwise.
Remark: in the crossover, each variable in DE is updated independently.
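The DE/rand/1/bin trial vector generation described above can be sketched in Python (an illustrative NumPy re-implementation; the function name and the defaults F = 0.5 and CR = 0.9 are our own choices, not the authors' code):

```python
import numpy as np

def de_rand_1_bin(pop, i, F=0.5, CR=0.9, rng=None):
    # Generate the trial vector u_i for target vector x_i (row i of pop)
    # using DE/rand/1/bin.  pop is an (NP, D) array, NP >= 4.
    rng = np.random.default_rng() if rng is None else rng
    NP, D = pop.shape
    # choose three mutually distinct indices, all different from i
    r1, r2, r3 = rng.choice([k for k in range(NP) if k != i],
                            size=3, replace=False)
    # mutation: base vector plus F times the differential vector
    v = pop[r1] + F * (pop[r2] - pop[r3])
    # binomial crossover: each variable is treated independently;
    # j_rand guarantees at least one component comes from the mutant
    j_rand = rng.integers(D)
    mask = rng.random(D) < CR
    mask[j_rand] = True
    return np.where(mask, v, pop[i])
```

With CR = 0 and F = 0 the trial vector differs from the target in exactly the one coordinate forced by j_rand, which makes the crossover's per-variable behavior easy to see.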

Differential Evolution (4/4)
Schematic diagram to illustrate DE/rand/1/bin. [Figure: the target vector $x_i^g$ and the mutant vector $v_i^g = x_{r1}^g + F(x_{r2}^g - x_{r3}^g)$, with base vector $x_{r1}^g$ and perturbed vectors $x_{r2}^g$, $x_{r3}^g$; the triangle denotes the trial vector $u_i^g$.]

Particle Swarm Optimization (1/3)
The movement equations of the classic PSO.
Velocity update equation:
$v_{i,j}^{g+1} = v_{i,j}^g + c_1 r_{1,j}^g (pbest_{i,j}^g - x_{i,j}^g) + c_2 r_{2,j}^g (gbest_j^g - x_{i,j}^g)$
or
$v_{i,j}^{g+1} = v_{i,j}^g + c_1 r_{1,j}^g (pbest_{i,j}^g - x_{i,j}^g) + c_2 r_{2,j}^g (lbest_{i,j}^g - x_{i,j}^g)$
Position update equation:
$x_{i,j}^{g+1} = x_{i,j}^g + v_{i,j}^{g+1}$
Here $pbest_i^g$ is the personal historical best experience (the cognition part), $gbest^g$ is the entire swarm's best experience, and $lbest_i^g$ is the neighborhood's best experience (the social part).
Remark: in the velocity update, each variable in PSO is updated independently.
J. Kennedy and R. C. Eberhart. Particle swarm optimization. In Proc. IEEE Int. Conf. Neural Networks, 1995, pp. 1942-1948.
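The movement equations can be sketched as follows (an illustrative NumPy version of the global-version PSO; the function name and the defaults c1 = c2 = 2.0 are our own assumptions):

```python
import numpy as np

def pso_step(x, v, pbest, gbest, c1=2.0, c2=2.0, rng=None):
    # One movement step of the classic global-version PSO for the whole swarm.
    # x, v, pbest: (NP, D) arrays; gbest: (D,) array.
    rng = np.random.default_rng() if rng is None else rng
    NP, D = x.shape
    r1 = rng.random((NP, D))   # fresh uniform numbers per particle and dimension
    r2 = rng.random((NP, D))
    # velocity update: previous velocity + cognition part + social part;
    # each variable is updated independently
    v_new = v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    # position update
    x_new = x + v_new
    return x_new, v_new
```

When every particle already sits on both its personal best and the global best, both attraction terms vanish and the velocity is unchanged.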

Particle Swarm Optimization (2/3)
The principle of the movement equations (the global version). [Figure: the new velocity $v_{i,j}^{g+1}$ combines the previous velocity $v_{i,j}^g$ with attractions toward $pbest_{i,j}^g$ and $gbest_j^g$, yielding the new position $x_{i,j}^{g+1} = x_{i,j}^g + v_{i,j}^{g+1}$.]

Particle Swarm Optimization (3/3)
The framework of PSO. [Figure.]

Eigen Coordinate System

What is the Original Coordinate System? [Figure.]

What is the Eigen Coordinate System?
The Eigen coordinate system is established by the columns of an orthogonal matrix $B$, which comes from the Eigen decomposition of the covariance matrix $C$:
$C = B D^2 B^T$
where $B$ is an orthogonal matrix, $D$ is a diagonal matrix, and $B^T$ is the transpose of $B$. Each column of $B$ is an eigenvector of $C$, and each diagonal element of $D$ is the square root of an eigenvalue of $C$.

How to Establish the Eigen Coordinate System
$pop$ is the $NP \times D$ population matrix whose $i$-th row is individual $(x_{i,1}, x_{i,2}, \ldots, x_{i,D})$, for $NP$ individuals.
Matlab code:
xmean = mean(pop);
C = 1/(NP-1)*(pop - xmean(ones(NP,1), :))'*(pop - xmean(ones(NP,1), :));
C = triu(C) + transpose(triu(C,1)); % enforce symmetry
[B,D] = eig(C);
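For readers without Matlab, a NumPy equivalent of the code above might look like this (a sketch; the function name is our own, and `numpy.linalg.eigh` plays the role of Matlab's `eig` for the symmetric matrix C):

```python
import numpy as np

def eigen_coordinate_system(pop):
    # Establish the Eigen coordinate system from a population.
    # pop: (NP, D) array, one individual per row.
    # Returns (B, D2, C) with C ~= B @ D2 @ B.T; the diagonal of D2 holds
    # the eigenvalues of C, i.e. it corresponds to D^2 in C = B D^2 B^T.
    NP = pop.shape[0]
    xmean = pop.mean(axis=0)
    centered = pop - xmean
    C = centered.T @ centered / (NP - 1)   # sample covariance matrix
    C = np.triu(C) + np.triu(C, 1).T       # enforce symmetry
    eigvals, B = np.linalg.eigh(C)         # eigh: symmetric eigensolver
    D2 = np.diag(eigvals)
    return B, D2, C
```

The columns of B form the Eigen coordinate system: they are orthonormal, and B @ D2 @ B.T reconstructs C.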

The Advantages of the Covariance Matrix (1/6)
Covariance matrices have an appealing geometrical interpretation: they can be uniquely identified with the (hyper-)ellipsoid $\{x : x^T C^{-1} x = 1\}$.
N. Hansen and A. Ostermeier. Completely derandomized self-adaptation in evolution strategies. Evolutionary Computation, vol. 9, no. 2, pp. 159-195, 2001.

The Advantages of the Covariance Matrix (2/6)
The Eigen decomposition is denoted by $C = B D^2 B^T$.
- If $D = \delta I$ ($\delta$ a positive number, $I$ the identity matrix), then $C = \delta^2 I$ and the ellipsoid is isotropic.
- If $B = I$, then $C = D^2$ (a diagonal matrix) and the ellipsoid is axis-parallel oriented.
- In general, the ellipsoid can be adapted to suit the contour lines of the objective function; the new principal axes of the ellipsoid correspond to the columns of $B$.

The Advantages of the Covariance Matrix (3/6)
Consider a convex quadratic function $f(x) = \frac{1}{2} x^T H x$, where $H$ is the Hessian matrix. Setting $C = H^{-1}$ on $f$ is equivalent to optimizing the isotropic function $h(y) = \frac{1}{2} y^T y$, where $x = C^{1/2} y$.
Remark 1: The optimal covariance matrix equals the inverse Hessian matrix.
Remark 2: The objective of covariance matrix adaptation is to approximate the inverse Hessian matrix.
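Remark 1 can be checked numerically. The snippet below (our own worked example, not from the slides) verifies that with C = H^{-1} the quadratic becomes isotropic in the transformed coordinates:

```python
import numpy as np

# For a convex quadratic f(x) = 1/2 x^T H x, choosing the covariance
# C = H^{-1} and searching in the coordinates x = C^{1/2} z makes f
# isotropic: f(C^{1/2} z) = 1/2 ||z||^2.
H = np.array([[4.0, 1.0],
              [1.0, 3.0]])          # a positive-definite Hessian (our choice)
C = np.linalg.inv(H)                # the "optimal" covariance matrix

# symmetric square root of C via its eigendecomposition
w, B = np.linalg.eigh(C)
C_sqrt = B @ np.diag(np.sqrt(w)) @ B.T

z = np.array([0.7, -1.2])           # any point in the new coordinates
x = C_sqrt @ z
f_x = 0.5 * x @ H @ x               # f evaluated at x = C^{1/2} z
isotropic = 0.5 * z @ z             # 1/2 ||z||^2
```

Since C^{1/2} = H^{-1/2}, the product C^{1/2} H C^{1/2} collapses to the identity, which is exactly why f_x equals the isotropic value.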

The Advantages of the Covariance Matrix (4/6)
minimize $f(x_1, x_2) = x_1^2 + x_2^2$, $-5 < x_1 < 5$, $-5 < x_2 < 5$
$\Leftrightarrow$ minimize $x_1^2$ + minimize $x_2^2$, $-5 < x_1 < 5$, $-5 < x_2 < 5$ (an isotropic function).

The Advantages of the Covariance Matrix (5/6)
Covariance matrix adaptation. [Figure: an isotropic function.]

The Advantages of the Covariance Matrix (6/6)
In general, the covariance matrix $C$ is constructed and adapted according to the feedback information resulting from the search process. Therefore, unlike the original coordinate system, the Eigen coordinate system is dynamic throughout the search process, with the aim of suiting the function landscape.

I: Stochastic Tuning + Single Population Distribution Information

Motivation
The commonly used crossover operators of DE are implemented in the original coordinate system. [Figure: the target vector $x_i^g$ and the mutant vector $v_i^g$ in the $x_1$-$x_2$ plane.]
Y. Wang, H.-X. Li, T. Huang, and L. Li. Differential evolution based on covariance matrix learning and bimodal distribution parameter setting. Applied Soft Computing, vol. 18, pp. 232-247, 2014. (CoBiDE)

CoBiDE (1/6)
Covariance matrix learning. [Figure.]

CoBiDE (2/6)
An explanation. [Figure: the target vector $x_i^g$ and the mutant vector $v_i^g$ shown in the $x_1$-$x_2$ plane before and after covariance matrix learning.]

CoBiDE (3/6)
[Figure: the optimal solution.]

CoBiDE (4/6)
The first question: which individuals should be chosen for computing the covariance matrix?
Answer: compute the covariance matrix $C$ via the top $ps \cdot NP$ individuals in the current population, i.e., utilizing single population distribution information.
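This selection step can be sketched as follows (a hypothetical re-implementation; the function name, the default ps, and the minimization convention are our assumptions):

```python
import numpy as np

def cov_from_top(pop, fitness, ps=0.5):
    # Single population distribution information (CoBiDE-style sketch):
    # estimate C from the top ps*NP individuals of the current population.
    # pop: (NP, D) array; fitness: (NP,) array, smaller is better.
    NP = pop.shape[0]
    m = max(2, int(ps * NP))               # need at least two points
    top = pop[np.argsort(fitness)[:m]]     # the m best individuals
    mean = top.mean(axis=0)
    centered = top - mean
    return centered.T @ centered / (m - 1)
```

Because only the best individuals enter the estimate, the resulting ellipsoid reflects where the promising region currently lies.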

CoBiDE (5/6)
[Figure: the $\lambda$ offspring produced from $\mu$ parents, and the center of the $\mu$ best individuals from the $\lambda$ offspring.]
An issue in covariance matrix adaptation: the variance will decrease significantly.
N. Hansen and A. Ostermeier. Completely derandomized self-adaptation in evolution strategies. Evolutionary Computation, vol. 9, no. 2, pp. 159-195, 2001.

CoBiDE (6/6)
The second question: how to determine the probability that the crossover is implemented in the Eigen coordinate system?
Answer: stochastic tuning.

II: Deterministic Tuning + Cumulative Population Distribution Information

Motivation
A single population fails to contain enough information to reliably estimate the covariance matrix. Moreover, some extra parameters have been introduced.
Y. Wang, H.-X. Li, T. Huang, and L. Li. Differential evolution based on covariance matrix learning and bimodal distribution parameter setting. Applied Soft Computing, vol. 18, pp. 232-247, 2014.
S. Guo and C. Yang. Enhancing differential evolution utilizing eigenvector-based crossover operator. IEEE Transactions on Evolutionary Computation, vol. 19, no. 1, pp. 31-49, 2015.

CPI-DE (1/3)
We make use of the cumulative distribution information of the population to establish an appropriate coordinate system for DE's crossover. [Figure: the algorithmic framework.] The coordinate systems are adjusted by deterministic tuning.
Y. Wang, Z.-Z. Liu, J. Li, H.-X. Li, and G. G. Yen. Utilizing cumulative population distribution information in differential evolution. Applied Soft Computing, vol. 48, 2016. (CPI-DE)

CPI-DE (2/3)
Rank-NP-update of the covariance matrix in DE:
$C_{NP}^{g+1} = \sum_{i=1}^{NP} w_i \left(x_{i:2NP}^{g+1} - m^g\right)\left(x_{i:2NP}^{g+1} - m^g\right)^T$
$C^{g+1} = (1 - c_{NP})\, C^g + c_{NP} \left(\sigma^g\right)^{-2} C_{NP}^{g+1}$
where $x_{i:2NP}^{g+1}$ is the $i$-th best of the $2 \cdot NP$ combined target and trial vectors (cumulative population distribution information).
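A sketch of the rank-NP-update, assuming CMA-ES-style log-linear weights w_i (the exact weights and the learning rate c_NP used in CPI-DE may differ from these illustrative choices):

```python
import numpy as np

def rank_np_update(C, combined, fitness, m_prev, sigma, c_np=0.5):
    # Rank-NP-update sketch: from the 2*NP combined target and trial
    # vectors, take the NP best, weight them, and blend the resulting
    # rank-NP matrix into the running covariance C.
    # combined: (2*NP, D); fitness: (2*NP,), smaller is better;
    # m_prev: previous mean m^g; sigma: step size sigma^g.
    two_np, D = combined.shape
    NP = two_np // 2
    order = np.argsort(fitness)[:NP]            # the NP best of the 2*NP
    w = np.log(NP + 0.5) - np.log(np.arange(1, NP + 1))
    w /= w.sum()                                # positive weights, sum to one
    # dividing by sigma here realizes the (sigma^g)^{-2} factor
    diffs = (combined[order] - m_prev) / sigma
    C_np = np.einsum('i,ij,ik->jk', w, diffs, diffs)
    return (1.0 - c_np) * C + c_np * C_np
```

With c_np = 0 the old covariance is kept unchanged; with c_np = 1 it is replaced entirely by the rank-NP estimate.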

CPI-DE (3/3)
The relationship between rank-NP-update in CPI-DE and rank-$\mu$-update in CMA-ES: rank-NP-update in CPI-DE is a natural extension of rank-$\mu$-update in CMA-ES.

III: Adaptive Tuning + Cumulative Population Distribution Information

Motivation
It is necessary to strike a balance between the accuracy of estimation and the convergence performance.
The current methods adjust the original and Eigen coordinate systems in either a random way or a deterministic way. How to exploit the feedback information from the search process to adaptively tune them has not yet been investigated.
It is also an interesting topic to extend the research on coordinate systems to other NIOA paradigms.

A New Point of View (1/4)
How to describe some common nature-inspired operators in the original coordinate system: in such an operator, each variable is updated independently.
Z.-Z. Liu, Y. Wang*, S. Yang, and K. Tang. An adaptive framework to tune the coordinate systems in nature-inspired optimization algorithms. IEEE Transactions on Cybernetics, in press. (ACoS)

A New Point of View (2/4)
A convenient transformation from a nature-inspired operator in the original coordinate system to the corresponding nature-inspired operator in the Eigen coordinate system.

A New Point of View (3/4)
An example: DE. [Equations: the crossover operator in the original coordinate system and in the Eigen coordinate system.]
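The transformation can be sketched as: project both vectors onto the eigenvector basis B, cross over coordinate-wise there, and map the trial vector back (illustrative code; the function name is our own):

```python
import numpy as np

def eigen_crossover(x, v, B, CR=0.9, rng=None):
    # Binomial crossover carried out in the Eigen coordinate system.
    # x: target vector, v: mutant vector, B: orthogonal matrix whose
    # columns span the Eigen coordinate system.
    rng = np.random.default_rng() if rng is None else rng
    D = x.size
    x_e = B.T @ x                      # target in Eigen coordinates
    v_e = B.T @ v                      # mutant in Eigen coordinates
    j_rand = rng.integers(D)
    mask = rng.random(D) < CR
    mask[j_rand] = True
    u_e = np.where(mask, v_e, x_e)     # crossover along the rotated axes
    return B @ u_e                     # trial vector back in original coords
```

Because B is orthogonal, the case CR = 1 reduces to B @ (B.T @ v) = v, so the transformation itself introduces no distortion.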

A New Point of View (4/4)
Another example: PSO. [Equations: the velocity update equation in the original coordinate system and in the Eigen coordinate system.]
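A corresponding sketch for PSO (illustrative; rotate by B^T, apply the classic per-coordinate update, rotate back by B; names and defaults are our assumptions):

```python
import numpy as np

def eigen_velocity_update(x, v, pbest, gbest, B, c1=2.0, c2=2.0, rng=None):
    # Velocity update of one particle performed in the Eigen coordinate
    # system spanned by the columns of the orthogonal matrix B.
    rng = np.random.default_rng() if rng is None else rng
    D = x.size
    x_e, v_e = B.T @ x, B.T @ v        # rotate into Eigen coordinates
    p_e, g_e = B.T @ pbest, B.T @ gbest
    r1, r2 = rng.random(D), rng.random(D)
    # classic per-coordinate update, now along the rotated axes
    v_e_new = v_e + c1 * r1 * (p_e - x_e) + c2 * r2 * (g_e - x_e)
    return B @ v_e_new                 # new velocity in original coordinates
```

When the particle already sits on pbest and gbest, the attraction terms vanish in any basis and the velocity is returned unchanged.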

ACoS (1/9)
[Figure.]

ACoS (2/9)
Each individual selects a coordinate system, either the original coordinate system or the Eigen coordinate system, according to a probability.

ACoS (3/9)
The probability is adaptively updated based on the feedback information collected from the offspring.
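One plausible way to realize such an update is a success-rate rule. This is our hedged sketch, not necessarily the exact ACoS formula; the clamping bounds p_min and p_max are illustrative:

```python
import numpy as np

def update_probability(p, succ_eigen, used_eigen, succ_orig, used_orig,
                       p_min=0.1, p_max=0.9):
    # Set the probability of choosing the Eigen coordinate system from the
    # relative success rates of offspring generated in each system,
    # clamped to [p_min, p_max] so neither system ever disappears.
    rate_e = succ_eigen / max(used_eigen, 1)   # success rate, Eigen system
    rate_o = succ_orig / max(used_orig, 1)     # success rate, original system
    if rate_e + rate_o == 0:
        return p                               # no feedback yet: keep p
    p_new = rate_e / (rate_e + rate_o)
    return float(np.clip(p_new, p_min, p_max))
```

A "success" here would be an offspring that survives selection; the clamping keeps both coordinate systems in play, which matters because the Eigen estimate is unreliable early on.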

ACoS (4/9)
The Eigen coordinate system is coupled with the original coordinate system.

ACoS (5/9)
The Eigen coordinate system is updated by making use of an additional archiving mechanism and the rank-$\mu$-update strategy.
An external archive A is used to store the offspring of both the current generation and the past several generations; A does not undergo any nature-inspired operators. Therefore, it provides sufficient information to estimate the Eigen coordinate system while never affecting the population size or the generation number, striking the balance between the accuracy of estimation and the convergence performance.
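The archiving mechanism can be sketched as a bounded buffer of recent offspring (the class name and the retention depth are our assumptions for illustration):

```python
from collections import deque

class OffspringArchive:
    # External archive A (sketch): keep the offspring of the current and
    # the past few generations to estimate the Eigen coordinate system,
    # without enlarging the population or the generation count.
    def __init__(self, generations=3):
        self.buffer = deque(maxlen=generations)  # one entry per generation

    def add_generation(self, offspring):
        # offspring: the vectors produced this generation; the archive
        # itself never undergoes any nature-inspired operator
        self.buffer.append(list(offspring))

    def all_points(self):
        # all archived individuals, oldest generation first
        return [x for gen in self.buffer for x in gen]
```

The deque's maxlen silently drops the oldest generation once the depth is reached, so the covariance estimate always reflects the recent search history.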

ACoS (6/9)
The Eigen coordinate system is updated by making use of the additional archiving mechanism and the rank-$\mu$-update strategy, i.e., by cumulative population distribution information.

ACoS (7/9)
Apply ACoS to DE. [Figure.]

ACoS (8/9)
Apply ACoS to PSO. [Figure.]

ACoS (9/9)
ACoS can be used to improve the performance of three famous DE variants (JADE, jDE, and SaDE) and of two of the most popular PSO variants (PSO-w and PSO-cf). We also apply ACoS to the bat algorithm (BA) and teaching-learning-based optimization (TLBO).
The structure of ACoS is simple.

Conclusion

Conclusion (1/2)
The Eigen coordinate system should be coupled with the original coordinate system, because in the early stage the Eigen coordinate system is inaccurate due to the random distribution of the population.
The Eigen coordinate system can significantly accelerate the convergence. However, based on our observation and analysis, it is hard for the Eigen coordinate system to change the search pattern of an NIOA: if an NIOA tends to converge to a local optimum of an optimization problem, its capability of jumping out of that local optimum remains limited even when implemented in the Eigen coordinate system.

Conclusion (2/2)
Because the theoretical study of NIOAs is very difficult (in particular, the theoretical analysis of population cooperation is very challenging), tuning the coordinate systems in NIOAs is an important and promising step toward understanding why NIOAs work or do not work, and it deserves in-depth attention in the future.
The source codes of CoBiDE, CPI-DE, and ACoS are publicly available for download.


Decomposition and Metaoptimization of Mutation Operator in Differential Evolution Decomposition and Metaoptimization of Mutation Operator in Differential Evolution Karol Opara 1 and Jaros law Arabas 2 1 Systems Research Institute, Polish Academy of Sciences 2 Institute of Electronic

More information

Fuzzy adaptive catfish particle swarm optimization

Fuzzy adaptive catfish particle swarm optimization ORIGINAL RESEARCH Fuzzy adaptive catfish particle swarm optimization Li-Yeh Chuang, Sheng-Wei Tsai, Cheng-Hong Yang. Institute of Biotechnology and Chemical Engineering, I-Shou University, Kaohsiung, Taiwan

More information

Distributed Particle Swarm Optimization

Distributed Particle Swarm Optimization Distributed Particle Swarm Optimization Salman Kahrobaee CSCE 990 Seminar Main Reference: A Comparative Study of Four Parallel and Distributed PSO Methods Leonardo VANNESCHI, Daniele CODECASA and Giancarlo

More information

Research Article A Hybrid Backtracking Search Optimization Algorithm with Differential Evolution

Research Article A Hybrid Backtracking Search Optimization Algorithm with Differential Evolution Mathematical Problems in Engineering Volume 2015, Article ID 769245, 16 pages http://dx.doi.org/10.1155/2015/769245 Research Article A Hybrid Backtracking Search Optimization Algorithm with Differential

More information

Evolving cognitive and social experience in Particle Swarm Optimization through Differential Evolution

Evolving cognitive and social experience in Particle Swarm Optimization through Differential Evolution Evolving cognitive and social experience in Particle Swarm Optimization through Differential Evolution Michael G. Epitropakis, Member, IEEE, Vassilis P. Plagianakos and Michael N. Vrahatis Abstract In

More information

Introduction to Black-Box Optimization in Continuous Search Spaces. Definitions, Examples, Difficulties

Introduction to Black-Box Optimization in Continuous Search Spaces. Definitions, Examples, Difficulties 1 Introduction to Black-Box Optimization in Continuous Search Spaces Definitions, Examples, Difficulties Tutorial: Evolution Strategies and CMA-ES (Covariance Matrix Adaptation) Anne Auger & Nikolaus Hansen

More information

Covariance Matrix Adaptation in Multiobjective Optimization

Covariance Matrix Adaptation in Multiobjective Optimization Covariance Matrix Adaptation in Multiobjective Optimization Dimo Brockhoff INRIA Lille Nord Europe October 30, 2014 PGMO-COPI 2014, Ecole Polytechnique, France Mastertitelformat Scenario: Multiobjective

More information

Power Electronic Circuits Design: A Particle Swarm Optimization Approach *

Power Electronic Circuits Design: A Particle Swarm Optimization Approach * Power Electronic Circuits Design: A Particle Swarm Optimization Approach * Jun Zhang, Yuan Shi, and Zhi-hui Zhan ** Department of Computer Science, Sun Yat-sen University, China, 510275 junzhang@ieee.org

More information

Application of Teaching Learning Based Optimization for Size and Location Determination of Distributed Generation in Radial Distribution System.

Application of Teaching Learning Based Optimization for Size and Location Determination of Distributed Generation in Radial Distribution System. Application of Teaching Learning Based Optimization for Size and Location Determination of Distributed Generation in Radial Distribution System. Khyati Mistry Electrical Engineering Department. Sardar

More information

The Parameters Selection of PSO Algorithm influencing On performance of Fault Diagnosis

The Parameters Selection of PSO Algorithm influencing On performance of Fault Diagnosis The Parameters Selection of Algorithm influencing On performance of Fault Diagnosis Yan HE,a, Wei Jin MA and Ji Ping ZHANG School of Mechanical Engineering and Power Engineer North University of China,

More information

Applying Particle Swarm Optimization to Adaptive Controller Leandro dos Santos Coelho 1 and Fabio A. Guerra 2

Applying Particle Swarm Optimization to Adaptive Controller Leandro dos Santos Coelho 1 and Fabio A. Guerra 2 Applying Particle Swarm Optimization to Adaptive Controller Leandro dos Santos Coelho 1 and Fabio A. Guerra 2 1 Production and Systems Engineering Graduate Program, PPGEPS Pontifical Catholic University

More information

Completely Derandomized Self-Adaptation in Evolution Strategies

Completely Derandomized Self-Adaptation in Evolution Strategies Completely Derandomized Self-Adaptation in Evolution Strategies Nikolaus Hansen and Andreas Ostermeier In Evolutionary Computation 9(2) pp. 159-195 (2001) Errata Section 3 footnote 9: We use the expectation

More information

PARTICLE SWARM OPTIMISATION (PSO)

PARTICLE SWARM OPTIMISATION (PSO) PARTICLE SWARM OPTIMISATION (PSO) Perry Brown Alexander Mathews Image: http://www.cs264.org/2009/projects/web/ding_yiyang/ding-robb/pso.jpg Introduction Concept first introduced by Kennedy and Eberhart

More information

Advanced Optimization

Advanced Optimization Advanced Optimization Lecture 3: 1: Randomized Algorithms for for Continuous Discrete Problems Problems November 22, 2016 Master AIC Université Paris-Saclay, Orsay, France Anne Auger INRIA Saclay Ile-de-France

More information

Tutorial CMA-ES Evolution Strategies and Covariance Matrix Adaptation

Tutorial CMA-ES Evolution Strategies and Covariance Matrix Adaptation Tutorial CMA-ES Evolution Strategies and Covariance Matrix Adaptation Anne Auger & Nikolaus Hansen INRIA Research Centre Saclay Île-de-France Project team TAO University Paris-Sud, LRI (UMR 8623), Bat.

More information

Tuning Parameters across Mixed Dimensional Instances: A Performance Scalability Study of Sep-G-CMA-ES

Tuning Parameters across Mixed Dimensional Instances: A Performance Scalability Study of Sep-G-CMA-ES Université Libre de Bruxelles Institut de Recherches Interdisciplinaires et de Développements en Intelligence Artificielle Tuning Parameters across Mixed Dimensional Instances: A Performance Scalability

More information

A PSO APPROACH FOR PREVENTIVE MAINTENANCE SCHEDULING OPTIMIZATION

A PSO APPROACH FOR PREVENTIVE MAINTENANCE SCHEDULING OPTIMIZATION 2009 International Nuclear Atlantic Conference - INAC 2009 Rio de Janeiro,RJ, Brazil, September27 to October 2, 2009 ASSOCIAÇÃO BRASILEIRA DE ENERGIA NUCLEAR - ABEN ISBN: 978-85-99141-03-8 A PSO APPROACH

More information

The particle swarm optimization algorithm: convergence analysis and parameter selection

The particle swarm optimization algorithm: convergence analysis and parameter selection Information Processing Letters 85 (2003) 317 325 www.elsevier.com/locate/ipl The particle swarm optimization algorithm: convergence analysis and parameter selection Ioan Cristian Trelea INA P-G, UMR Génie

More information

Crossover and the Different Faces of Differential Evolution Searches

Crossover and the Different Faces of Differential Evolution Searches WCCI 21 IEEE World Congress on Computational Intelligence July, 18-23, 21 - CCIB, Barcelona, Spain CEC IEEE Crossover and the Different Faces of Differential Evolution Searches James Montgomery Abstract

More information

Particle Swarm Optimization. Abhishek Roy Friday Group Meeting Date:

Particle Swarm Optimization. Abhishek Roy Friday Group Meeting Date: Particle Swarm Optimization Abhishek Roy Friday Group Meeting Date: 05.25.2016 Cooperation example Basic Idea PSO is a robust stochastic optimization technique based on the movement and intelligence of

More information

Computational Intelligence Winter Term 2017/18

Computational Intelligence Winter Term 2017/18 Computational Intelligence Winter Term 2017/18 Prof. Dr. Günter Rudolph Lehrstuhl für Algorithm Engineering (LS 11) Fakultät für Informatik TU Dortmund mutation: Y = X + Z Z ~ N(0, C) multinormal distribution

More information

Problem Statement Continuous Domain Search/Optimization. Tutorial Evolution Strategies and Related Estimation of Distribution Algorithms.

Problem Statement Continuous Domain Search/Optimization. Tutorial Evolution Strategies and Related Estimation of Distribution Algorithms. Tutorial Evolution Strategies and Related Estimation of Distribution Algorithms Anne Auger & Nikolaus Hansen INRIA Saclay - Ile-de-France, project team TAO Universite Paris-Sud, LRI, Bat. 49 945 ORSAY

More information

A Method of HVAC Process Object Identification Based on PSO

A Method of HVAC Process Object Identification Based on PSO 2017 3 45 313 doi 10.3969 j.issn.1673-7237.2017.03.004 a a b a. b. 201804 PID PID 2 TU831 A 1673-7237 2017 03-0019-05 A Method of HVAC Process Object Identification Based on PSO HOU Dan - lin a PAN Yi

More information

A Novel Approach for Complete Identification of Dynamic Fractional Order Systems Using Stochastic Optimization Algorithms and Fractional Calculus

A Novel Approach for Complete Identification of Dynamic Fractional Order Systems Using Stochastic Optimization Algorithms and Fractional Calculus 5th International Conference on Electrical and Computer Engineering ICECE 2008, 20-22 December 2008, Dhaka, Bangladesh A Novel Approach for Complete Identification of Dynamic Fractional Order Systems Using

More information

ON THE USE OF RANDOM VARIABLES IN PARTICLE SWARM OPTIMIZATIONS: A COMPARATIVE STUDY OF GAUSSIAN AND UNIFORM DISTRIBUTIONS

ON THE USE OF RANDOM VARIABLES IN PARTICLE SWARM OPTIMIZATIONS: A COMPARATIVE STUDY OF GAUSSIAN AND UNIFORM DISTRIBUTIONS J. of Electromagn. Waves and Appl., Vol. 23, 711 721, 2009 ON THE USE OF RANDOM VARIABLES IN PARTICLE SWARM OPTIMIZATIONS: A COMPARATIVE STUDY OF GAUSSIAN AND UNIFORM DISTRIBUTIONS L. Zhang, F. Yang, and

More information

A0M33EOA: EAs for Real-Parameter Optimization. Differential Evolution. CMA-ES.

A0M33EOA: EAs for Real-Parameter Optimization. Differential Evolution. CMA-ES. A0M33EOA: EAs for Real-Parameter Optimization. Differential Evolution. CMA-ES. Petr Pošík Czech Technical University in Prague Faculty of Electrical Engineering Department of Cybernetics Many parts adapted

More information

THE objective of global optimization is to find the

THE objective of global optimization is to find the Large Scale Global Optimization Using Differential Evolution With Self-adaptation and Cooperative Co-evolution Aleš Zamuda, Student Member, IEEE, Janez Brest, Member, IEEE, Borko Bošković, Student Member,

More information

Optimization of PI Parameters for Speed Controller of a Permanent Magnet Synchronous Motor by using Particle Swarm Optimization Technique

Optimization of PI Parameters for Speed Controller of a Permanent Magnet Synchronous Motor by using Particle Swarm Optimization Technique Optimization of PI Parameters for Speed Controller of a Permanent Magnet Synchronous Motor by using Particle Swarm Optimization Technique Aiffah Mohammed 1, Wan Salha Saidon 1, Muhd Azri Abdul Razak 2,

More information

OPTIMAL DISPATCH OF REAL POWER GENERATION USING PARTICLE SWARM OPTIMIZATION: A CASE STUDY OF EGBIN THERMAL STATION

OPTIMAL DISPATCH OF REAL POWER GENERATION USING PARTICLE SWARM OPTIMIZATION: A CASE STUDY OF EGBIN THERMAL STATION OPTIMAL DISPATCH OF REAL POWER GENERATION USING PARTICLE SWARM OPTIMIZATION: A CASE STUDY OF EGBIN THERMAL STATION Onah C. O. 1, Agber J. U. 2 and Ikule F. T. 3 1, 2, 3 Department of Electrical and Electronics

More information

An Adaptive Population Size Differential Evolution with Novel Mutation Strategy for Constrained Optimization

An Adaptive Population Size Differential Evolution with Novel Mutation Strategy for Constrained Optimization > REPLACE THIS LINE WITH YOUR PAPER IDENTIFICATION NUMBER (DOUBLE-CLICK HERE TO EDIT) < 1 An Adaptive Population Size Differential Evolution with Novel Mutation Strategy for Constrained Optimization Yuan

More information

Introduction to Randomized Black-Box Numerical Optimization and CMA-ES

Introduction to Randomized Black-Box Numerical Optimization and CMA-ES Introduction to Randomized Black-Box Numerical Optimization and CMA-ES July 3, 2017 CEA/EDF/Inria summer school "Numerical Analysis" Université Pierre-et-Marie-Curie, Paris, France Anne Auger, Asma Atamna,

More information

OPTIMIZATION OF MODEL-FREE ADAPTIVE CONTROLLER USING DIFFERENTIAL EVOLUTION METHOD

OPTIMIZATION OF MODEL-FREE ADAPTIVE CONTROLLER USING DIFFERENTIAL EVOLUTION METHOD ABCM Symposium Series in Mechatronics - Vol. 3 - pp.37-45 Copyright c 2008 by ABCM OPTIMIZATION OF MODEL-FREE ADAPTIVE CONTROLLER USING DIFFERENTIAL EVOLUTION METHOD Leandro dos Santos Coelho Industrial

More information

DESIGN OF MULTILAYER MICROWAVE BROADBAND ABSORBERS USING CENTRAL FORCE OPTIMIZATION

DESIGN OF MULTILAYER MICROWAVE BROADBAND ABSORBERS USING CENTRAL FORCE OPTIMIZATION Progress In Electromagnetics Research B, Vol. 26, 101 113, 2010 DESIGN OF MULTILAYER MICROWAVE BROADBAND ABSORBERS USING CENTRAL FORCE OPTIMIZATION M. J. Asi and N. I. Dib Department of Electrical Engineering

More information

Stochastic optimization and a variable metric approach

Stochastic optimization and a variable metric approach The challenges for stochastic optimization and a variable metric approach Microsoft Research INRIA Joint Centre, INRIA Saclay April 6, 2009 Content 1 Introduction 2 The Challenges 3 Stochastic Search 4

More information

Optimisation numérique par algorithmes stochastiques adaptatifs

Optimisation numérique par algorithmes stochastiques adaptatifs Optimisation numérique par algorithmes stochastiques adaptatifs Anne Auger M2R: Apprentissage et Optimisation avancés et Applications anne.auger@inria.fr INRIA Saclay - Ile-de-France, project team TAO

More information

Differential evolution with an individual-dependent mechanism

Differential evolution with an individual-dependent mechanism Loughborough University Institutional Repository Differential evolution with an individualdependent mechanism This item was submitted to Loughborough University's Institutional Repository by the/an author.

More information

ARTIFICIAL NEURAL NETWORKS REGRESSION ON ENSEMBLE STRATEGIES IN DIFFERENTIAL EVOLUTION

ARTIFICIAL NEURAL NETWORKS REGRESSION ON ENSEMBLE STRATEGIES IN DIFFERENTIAL EVOLUTION ARTIFICIAL NEURAL NETWORKS REGRESSION ON ENSEMBLE STRATEGIES IN DIFFERENTIAL EVOLUTION Iztok Fister Jr. 1,Ponnuthurai Nagaratnam Suganthan 2, Damjan Strnad 1, Janez Brest 1,Iztok Fister 1 1 University

More information

Multivariate Statistical Analysis

Multivariate Statistical Analysis Multivariate Statistical Analysis Fall 2011 C. L. Williams, Ph.D. Lecture 4 for Applied Multivariate Analysis Outline 1 Eigen values and eigen vectors Characteristic equation Some properties of eigendecompositions

More information

Dynamic Optimization using Self-Adaptive Differential Evolution

Dynamic Optimization using Self-Adaptive Differential Evolution Dynamic Optimization using Self-Adaptive Differential Evolution IEEE Congress on Evolutionary Computation (IEEE CEC 2009), Trondheim, Norway, May 18-21, 2009 J. Brest, A. Zamuda, B. Bošković, M. S. Maučec,

More information

Vectors and Matrices Statistics with Vectors and Matrices

Vectors and Matrices Statistics with Vectors and Matrices Vectors and Matrices Statistics with Vectors and Matrices Lecture 3 September 7, 005 Analysis Lecture #3-9/7/005 Slide 1 of 55 Today s Lecture Vectors and Matrices (Supplement A - augmented with SAS proc

More information

Differential Evolution: Competitive Setting of Control Parameters

Differential Evolution: Competitive Setting of Control Parameters Proceedings of the International Multiconference on Computer Science and Information Technology pp. 207 213 ISSN 1896-7094 c 2006 PIPS Differential Evolution: Competitive Setting of Control Parameters

More information

EFFECT OF STRATEGY ADAPTATION ON DIFFERENTIAL EVOLUTION IN PRESENCE AND ABSENCE OF PARAMETER ADAPTATION: AN INVESTIGATION

EFFECT OF STRATEGY ADAPTATION ON DIFFERENTIAL EVOLUTION IN PRESENCE AND ABSENCE OF PARAMETER ADAPTATION: AN INVESTIGATION JAISCR, 2018, Vol. 8, No. 3, pp. 211 235 10.1515/jaiscr-2018-0014 EFFECT OF STRATEGY ADAPTATION ON DIFFERENTIAL EVOLUTION IN PRESENCE AND ABSENCE OF PARAMETER ADAPTATION: AN INVESTIGATION Deepak Dawar

More information

Integer weight training by differential evolution algorithms

Integer weight training by differential evolution algorithms Integer weight training by differential evolution algorithms V.P. Plagianakos, D.G. Sotiropoulos, and M.N. Vrahatis University of Patras, Department of Mathematics, GR-265 00, Patras, Greece. e-mail: vpp

More information

Adaptive Differential Evolution and Exponential Crossover

Adaptive Differential Evolution and Exponential Crossover Proceedings of the International Multiconference on Computer Science and Information Technology pp. 927 931 ISBN 978-83-60810-14-9 ISSN 1896-7094 Adaptive Differential Evolution and Exponential Crossover

More information

PARTICLE swarm optimization (PSO) is one powerful and. A Competitive Swarm Optimizer for Large Scale Optimization

PARTICLE swarm optimization (PSO) is one powerful and. A Competitive Swarm Optimizer for Large Scale Optimization IEEE TRANSACTIONS ON CYBERNETICS, VOL. XX, NO. X, XXXX XXXX 1 A Competitive Swarm Optimizer for Large Scale Optimization Ran Cheng and Yaochu Jin, Senior Member, IEEE Abstract In this paper, a novel competitive

More information

Introduction to Optimization

Introduction to Optimization Introduction to Optimization Blackbox Optimization Marc Toussaint U Stuttgart Blackbox Optimization The term is not really well defined I use it to express that only f(x) can be evaluated f(x) or 2 f(x)

More information

The CMA Evolution Strategy: A Tutorial

The CMA Evolution Strategy: A Tutorial The CMA Evolution Strategy: A Tutorial Nikolaus Hansen November 6, 205 Contents Nomenclature 3 0 Preliminaries 4 0. Eigendecomposition of a Positive Definite Matrix... 5 0.2 The Multivariate Normal Distribution...

More information

Regrouping Particle Swarm Optimization: A New Global Optimization Algorithm with Improved Performance Consistency Across Benchmarks

Regrouping Particle Swarm Optimization: A New Global Optimization Algorithm with Improved Performance Consistency Across Benchmarks Regrouping Particle Swarm Optimization: A New Global Optimization Algorithm with Improved Performance Consistency Across Benchmarks George I. Evers Advisor: Dr. Mounir Ben Ghalia Electrical Engineering

More information

Research Article Algorithmic Mechanism Design of Evolutionary Computation

Research Article Algorithmic Mechanism Design of Evolutionary Computation Computational Intelligence and Neuroscience Volume 2015, Article ID 591954, 17 pages http://dx.doi.org/10.1155/2015/591954 Research Article Algorithmic Mechanism Design of Evolutionary Computation Yan

More information

The Essential Particle Swarm. James Kennedy Washington, DC

The Essential Particle Swarm. James Kennedy Washington, DC The Essential Particle Swarm James Kennedy Washington, DC Kennedy.Jim@gmail.com The Social Template Evolutionary algorithms Other useful adaptive processes in nature Social behavior Social psychology Looks

More information

Comparison of Loss Sensitivity Factor & Index Vector methods in Determining Optimal Capacitor Locations in Agricultural Distribution

Comparison of Loss Sensitivity Factor & Index Vector methods in Determining Optimal Capacitor Locations in Agricultural Distribution 6th NATIONAL POWER SYSTEMS CONFERENCE, 5th-7th DECEMBER, 200 26 Comparison of Loss Sensitivity Factor & Index Vector s in Determining Optimal Capacitor Locations in Agricultural Distribution K.V.S. Ramachandra

More information

Quantum-Inspired Differential Evolution with Particle Swarm Optimization for Knapsack Problem

Quantum-Inspired Differential Evolution with Particle Swarm Optimization for Knapsack Problem JOURNAL OF INFORMATION SCIENCE AND ENGINEERING 31, 1757-1773 (2015) Quantum-Inspired Differential Evolution with Particle Swarm Optimization for Knapsack Problem DJAAFAR ZOUACHE 1 AND ABDELOUAHAB MOUSSAOUI

More information

A Restart CMA Evolution Strategy With Increasing Population Size

A Restart CMA Evolution Strategy With Increasing Population Size Anne Auger and Nikolaus Hansen A Restart CMA Evolution Strategy ith Increasing Population Size Proceedings of the IEEE Congress on Evolutionary Computation, CEC 2005 c IEEE A Restart CMA Evolution Strategy

More information

Experimental Comparisons of Derivative Free Optimization Algorithms

Experimental Comparisons of Derivative Free Optimization Algorithms Experimental Comparisons of Derivative Free Optimization Algorithms Anne Auger Nikolaus Hansen J. M. Perez Zerpa Raymond Ros Marc Schoenauer TAO Project-Team, INRIA Saclay Île-de-France, and Microsoft-INRIA

More information

Fuzzy Cognitive Maps Learning through Swarm Intelligence

Fuzzy Cognitive Maps Learning through Swarm Intelligence Fuzzy Cognitive Maps Learning through Swarm Intelligence E.I. Papageorgiou,3, K.E. Parsopoulos 2,3, P.P. Groumpos,3, and M.N. Vrahatis 2,3 Department of Electrical and Computer Engineering, University

More information

CLASSICAL gradient methods and evolutionary algorithms

CLASSICAL gradient methods and evolutionary algorithms IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION, VOL. 2, NO. 2, JULY 1998 45 Evolutionary Algorithms and Gradient Search: Similarities and Differences Ralf Salomon Abstract Classical gradient methods and

More information

Determination of Component Values for Butterworth Type Active Filter by Differential Evolution Algorithm

Determination of Component Values for Butterworth Type Active Filter by Differential Evolution Algorithm Determination of Component Values for Butterworth Type Active Filter by Differential Evolution Algorithm Bahadır Hiçdurmaz Department of Electrical & Electronics Engineering Dumlupınar University Kütahya

More information

Multiple Similarities Based Kernel Subspace Learning for Image Classification

Multiple Similarities Based Kernel Subspace Learning for Image Classification Multiple Similarities Based Kernel Subspace Learning for Image Classification Wang Yan, Qingshan Liu, Hanqing Lu, and Songde Ma National Laboratory of Pattern Recognition, Institute of Automation, Chinese

More information

Gaussian EDA and Truncation Selection: Setting Limits for Sustainable Progress

Gaussian EDA and Truncation Selection: Setting Limits for Sustainable Progress Gaussian EDA and Truncation Selection: Setting Limits for Sustainable Progress Petr Pošík Czech Technical University, Faculty of Electrical Engineering, Department of Cybernetics Technická, 66 7 Prague

More information

Evolution Strategies and Covariance Matrix Adaptation

Evolution Strategies and Covariance Matrix Adaptation Evolution Strategies and Covariance Matrix Adaptation Cours Contrôle Avancé - Ecole Centrale Paris Anne Auger January 2014 INRIA Research Centre Saclay Île-de-France University Paris-Sud, LRI (UMR 8623),

More information

Problems of cryptography as discrete optimization tasks

Problems of cryptography as discrete optimization tasks Nonlinear Analysis 63 (5) e831 e837 www.elsevier.com/locate/na Problems of cryptography as discrete optimization tasks E.C. Laskari a,b, G.C. Meletiou c,, M.N. Vrahatis a,b a Computational Intelligence

More information

Gaussian bare-bones artificial bee colony algorithm

Gaussian bare-bones artificial bee colony algorithm Soft Comput DOI 10.1007/s00500-014-1549-5 METHODOLOGIES AND APPLICATION Gaussian bare-bones artificial bee colony algorithm Xinyu Zhou Zhijian Wu Hui Wang Shahryar Rahnamayan Springer-Verlag Berlin Heidelberg

More information

Background Mathematics (2/2) 1. David Barber

Background Mathematics (2/2) 1. David Barber Background Mathematics (2/2) 1 David Barber University College London Modified by Samson Cheung (sccheung@ieee.org) 1 These slides accompany the book Bayesian Reasoning and Machine Learning. The book and

More information

Dimensionality Reduction

Dimensionality Reduction Dimensionality Reduction Le Song Machine Learning I CSE 674, Fall 23 Unsupervised learning Learning from raw (unlabeled, unannotated, etc) data, as opposed to supervised data where a classification of

More information

Evolution Strategies. Nikolaus Hansen, Dirk V. Arnold and Anne Auger. February 11, 2015

Evolution Strategies. Nikolaus Hansen, Dirk V. Arnold and Anne Auger. February 11, 2015 Evolution Strategies Nikolaus Hansen, Dirk V. Arnold and Anne Auger February 11, 2015 1 Contents 1 Overview 3 2 Main Principles 4 2.1 (µ/ρ +, λ) Notation for Selection and Recombination.......................

More information

Self-Adaptive Ant Colony System for the Traveling Salesman Problem

Self-Adaptive Ant Colony System for the Traveling Salesman Problem Proceedings of the 29 IEEE International Conference on Systems, Man, and Cybernetics San Antonio, TX, USA - October 29 Self-Adaptive Ant Colony System for the Traveling Salesman Problem Wei-jie Yu, Xiao-min

More information

Simulated Annealing with Parameter Tuning for Wind Turbine Placement Optimization

Simulated Annealing with Parameter Tuning for Wind Turbine Placement Optimization Simulated Annealing with Parameter Tuning for Wind Turbine Placement Optimization Daniel Lückehe 1, Oliver Kramer 2, and Manfred Weisensee 3 1 Department of Geoinformation, Jade University of Applied Sciences,

More information

DE/BBO: A Hybrid Differential Evolution with Biogeography-Based Optimization for Global Numerical Optimization

DE/BBO: A Hybrid Differential Evolution with Biogeography-Based Optimization for Global Numerical Optimization 1 : A Hybrid Differential Evolution with Biogeography-Based Optimization for Global Numerical Optimization Wenyin Gong, Zhihua Cai, and Charles X. Ling, Senior Member, IEEE Abstract Differential Evolution

More information

An Evolutionary Programming Based Algorithm for HMM training

An Evolutionary Programming Based Algorithm for HMM training An Evolutionary Programming Based Algorithm for HMM training Ewa Figielska,Wlodzimierz Kasprzak Institute of Control and Computation Engineering, Warsaw University of Technology ul. Nowowiejska 15/19,

More information

Machine Learning. Principal Components Analysis. Le Song. CSE6740/CS7641/ISYE6740, Fall 2012

Machine Learning. Principal Components Analysis. Le Song. CSE6740/CS7641/ISYE6740, Fall 2012 Machine Learning CSE6740/CS7641/ISYE6740, Fall 2012 Principal Components Analysis Le Song Lecture 22, Nov 13, 2012 Based on slides from Eric Xing, CMU Reading: Chap 12.1, CB book 1 2 Factor or Component

More information

A COMPARISON OF PARTICLE SWARM OPTIMIZATION AND DIFFERENTIAL EVOLUTION

A COMPARISON OF PARTICLE SWARM OPTIMIZATION AND DIFFERENTIAL EVOLUTION A COMPARISON OF PARTICLE SWARM OPTIMIZATION AND DIFFERENTIAL EVOLUTION Vu Truong Vu Ho Chi Minh City University of Transport, Faculty of Civil Engineering No.2, D3 Street, Ward 25, Binh Thanh District,

More information

Modified Differential Evolution for Nonlinear Optimization Problems with Simple Bounds

Modified Differential Evolution for Nonlinear Optimization Problems with Simple Bounds Modified Differential Evolution for Nonlinear Optimization Problems with Simple Bounds Md. Abul Kalam Azad a,, Edite M.G.P. Fernandes b a Assistant Researcher, b Professor Md. Abul Kalam Azad Algoritmi

More information

Nature inspired optimization technique for the design of band stop FIR digital filter

Nature inspired optimization technique for the design of band stop FIR digital filter Nature inspired optimization technique for the design of band stop FIR digital filter Dilpreet Kaur, 2 Balraj Singh M.Tech (Scholar), 2 Associate Professor (ECE), 2 (Department of Electronics and Communication

More information

A self-guided Particle Swarm Optimization with Independent Dynamic Inertia Weights Setting on Each Particle

A self-guided Particle Swarm Optimization with Independent Dynamic Inertia Weights Setting on Each Particle Appl. Math. Inf. Sci. 7, No. 2, 545-552 (2013) 545 Applied Mathematics & Information Sciences An International Journal A self-guided Particle Swarm Optimization with Independent Dynamic Inertia Weights

More information

AMULTIOBJECTIVE optimization problem (MOP) can

AMULTIOBJECTIVE optimization problem (MOP) can 1 IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION 1 Letters 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 Decomposition-Based Multiobjective Evolutionary Algorithm with an Ensemble of Neighborhood Sizes Shi-Zheng

More information

Performance Comparison of PSO Based State Feedback Gain (K) Controller with LQR-PI and Integral Controller for Automatic Frequency Regulation

Performance Comparison of PSO Based State Feedback Gain (K) Controller with LQR-PI and Integral Controller for Automatic Frequency Regulation Performance Comparison of PSO Based Feedback Gain Controller with LQR-PI and Controller for Automatic Frequency Regulation NARESH KUMARI 1, A. N. JHA 2, NITIN MALIK 3 1,3 School of Engineering and Technology,

More information

Regular paper. Particle Swarm Optimization Applied to the Economic Dispatch Problem

Regular paper. Particle Swarm Optimization Applied to the Economic Dispatch Problem Rafik Labdani Linda Slimani Tarek Bouktir Electrical Engineering Department, Oum El Bouaghi University, 04000 Algeria. rlabdani@yahoo.fr J. Electrical Systems 2-2 (2006): 95-102 Regular paper Particle

More information

Secondary Frequency Control of Microgrids In Islanded Operation Mode and Its Optimum Regulation Based on the Particle Swarm Optimization Algorithm

Secondary Frequency Control of Microgrids In Islanded Operation Mode and Its Optimum Regulation Based on the Particle Swarm Optimization Algorithm International Academic Institute for Science and Technology International Academic Journal of Science and Engineering Vol. 3, No. 1, 2016, pp. 159-169. ISSN 2454-3896 International Academic Journal of

More information

Genetic Algorithm: introduction

Genetic Algorithm: introduction 1 Genetic Algorithm: introduction 2 The Metaphor EVOLUTION Individual Fitness Environment PROBLEM SOLVING Candidate Solution Quality Problem 3 The Ingredients t reproduction t + 1 selection mutation recombination

More information

Optimal Placement and Sizing of Distributed Generation for Power Loss Reduction using Particle Swarm Optimization

Optimal Placement and Sizing of Distributed Generation for Power Loss Reduction using Particle Swarm Optimization Available online at www.sciencedirect.com Energy Procedia 34 (2013 ) 307 317 10th Eco-Energy and Materials Science and Engineering (EMSES2012) Optimal Placement and Sizing of Distributed Generation for

More information

Limiting the Velocity in the Particle Swarm Optimization Algorithm

Limiting the Velocity in the Particle Swarm Optimization Algorithm Limiting the Velocity in the Particle Swarm Optimization Algorithm Julio Barrera 1, Osiris Álvarez-Bajo 2, Juan J. Flores 3, Carlos A. Coello Coello 4 1 Universidad Michoacana de San Nicolás de Hidalgo,

More information

2 Differential Evolution and its Control Parameters

2 Differential Evolution and its Control Parameters COMPETITIVE DIFFERENTIAL EVOLUTION AND GENETIC ALGORITHM IN GA-DS TOOLBOX J. Tvrdík University of Ostrava 1 Introduction The global optimization problem with box constrains is formed as follows: for a

More information

A PARAMETER CONTROL SCHEME FOR DE INSPIRED BY ACO

A PARAMETER CONTROL SCHEME FOR DE INSPIRED BY ACO A PARAMETER CONTROL SCHEME FOR DE INSPIRED BY ACO Dražen Bajer, Goran Martinović Faculty of Electrical Engineering, Josip Juraj Strossmayer University of Osijek, Croatia drazen.bajer@etfos.hr, goran.martinovic@etfos.hr

More information

Choosing Variables with a Genetic Algorithm for Econometric models based on Neural Networks learning and adaptation.

Choosing Variables with a Genetic Algorithm for Econometric models based on Neural Networks learning and adaptation. Choosing Variables with a Genetic Algorithm for Econometric models based on Neural Networks learning and adaptation. Daniel Ramírez A., Israel Truijillo E. LINDA LAB, Computer Department, UNAM Facultad

More information