Runtime Analysis of Evolutionary Algorithms: Basic Introduction


1 Runtime Analysis of Evolutionary Algorithms: Basic Introduction

Per Kristian Lehre, University of Nottingham, Nottingham NG8 1BB, UK, PerKristian.Lehre@nottingham.ac.uk
Pietro S. Oliveto, University of Sheffield, Sheffield S1 4DP, UK, P.Oliveto@sheffield.ac.uk

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage, and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the owner/author(s). Copyright is held by the author/owner(s). GECCO '15 Companion, July 11-15, 2015, Madrid, Spain. ACM /15/

For the latest version of these slides, see pkl/gecco2015.

2 Bio-sketch - Dr Per Kristian Lehre

Assistant Professor in the School of Computer Science at the University of Nottingham. MSc and PhD in Computer Science from the Norwegian University of Science and Technology (NTNU). Research on theoretical aspects of evolutionary algorithms and other randomised search heuristics. Editorial board member of Evolutionary Computation. Guest editor for special issues of IEEE Transactions on Evolutionary Computation and Theoretical Computer Science. Best paper awards at GECCO 2006, 2009, 2010, 2013, ICSTW 2008 and ISAAC 2014; nominations at CEC 2009 and GECCO. Coordinator of the 2M euro SAGE EU project unifying population genetics and EC theory. Organiser of the Dagstuhl seminar on Evolution and Computation. 2 / 63

3 Bio-sketch - Dr Pietro S. Oliveto

Vice-Chancellor's Fellow and EPSRC Early Career Fellow in the Department of Computer Science at the University of Sheffield. Laurea Degree in Computer Science from the University of Catania, Italy (2005). PhD in Computer Science ( ), EPSRC PhD+ Research Fellow ( ) and EPSRC Postdoctoral Fellow in Theoretical Computer Science at the University of Birmingham, UK. Research on theoretical aspects of evolutionary algorithms and other randomised search heuristics. Guest editor for special issues of Evolutionary Computation (MIT Press, 2015) and Computer Science and Technology (Springer, 2012). Best paper awards at GECCO 2008, ICARIS 2011 and GECCO 2014; best paper nominations at CEC 2009 and ECTA. Chair of the IEEE CIS Task Force on Theoretical Foundations of Bio-inspired Computation. 3 / 63

4 Aims and Goals of this Tutorial

This tutorial will provide an overview of:
- the goals of time complexity analysis of Evolutionary Algorithms (EAs);
- the most common and effective techniques.

You should attend if you wish to:
- theoretically understand the behaviour and performance of the search algorithms you design;
- familiarise yourself with the techniques used in the time complexity analysis of EAs;
- pursue research in the area.

It should enable you to (or enhance your ability to):
1 understand theoretically the behaviour of EAs on different problems;
2 perform time complexity analysis of simple EAs on common toy problems;
3 read and understand research papers on the computational complexity of EAs;
4 acquire the basic skills to start independent research in the area;
5 follow the other theory tutorials later on today. 4 / 63

5 Introduction to the theory of EAs - Evolutionary Algorithms and Computer Science

Goals of design and analysis of algorithms:
1 correctness: does the algorithm always output the correct solution?
2 computational complexity: how many computational resources are required?

For Evolutionary Algorithms (general purpose):
1 convergence: does the EA find the solution in finite time?
2 time complexity: how long does it take to find the optimum? (time = number of fitness function evaluations) 5 / 63

6 Introduction to the theory of EAs - Brief history

- Theoretical studies of Evolutionary Algorithms (EAs), albeit few, have existed since the seventies [Goldberg, 1989];
- Early studies were concerned with explaining the behaviour of EAs rather than analysing their performance;
- Schema Theory was considered fundamental:
  - first proposed to understand the behaviour of the simple GA [Holland, 1992];
  - it cannot explain the performance or limit behaviour of EAs;
  - the Building Block Hypothesis was controversial [Reeves and Rowe, 2002];
- No Free Lunch [Wolpert and Macready, 1997]: over all functions...
- Convergence results appeared in the nineties [Rudolph, 1998]; these concern the limit behaviour of EAs. 6 / 63

9 Convergence analysis of EAs

Convergence Definition
Ideally the EA should find the solution in a finite number of steps with probability 1 (i.e. visit the global optimum in finite time). If the solution is then held forever after, the algorithm converges to the optimum!

Conditions for Convergence ([Rudolph, 1998])
1 There is a positive probability to reach any point in the search space from any other point.
2 The best found solution is never removed from the population (elitism).

Canonical GAs using mutation, crossover and proportional selection do NOT converge!
Elitist variants DO converge!

In practice, is it interesting that an algorithm converges to the optimum? Most EAs visit the global optimum in finite time (RLS does not!). How much time? 7 / 63

11 Computational complexity of EAs - Computational Complexity of EAs

Generally means predicting the resources the algorithm requires:
- usually the computational time: the number of primitive steps;
- usually grows with the size of the input;
- usually expressed in asymptotic notation.

Exponential runtime: inefficient algorithm.
Polynomial runtime: efficient algorithm. 8 / 63

12 Computational complexity of EAs - Computational Complexity of EAs

However (EAs):
1 In practice the time for a fitness function evaluation is much higher than that of the remaining operations;
2 EAs are randomised algorithms:
  - they do not perform the same operations even if the input is the same;
  - they do not output the same result if run twice!

Hence, the runtime of an EA on a function f is a random variable T_f. We are interested in:
1 estimating E(T_f), the expected runtime of the EA for f;
2 estimating P(T_f ≤ t), the success probability of the EA in t steps for f. 8 / 63

13 Computational complexity of EAs - Asymptotic notation

f(n) ∈ O(g(n)) ⟺ there exist constants c, n₀ > 0 s.t. 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀
f(n) ∈ Ω(g(n)) ⟺ there exist constants c, n₀ > 0 s.t. 0 ≤ c·g(n) ≤ f(n) for all n ≥ n₀
f(n) ∈ Θ(g(n)) ⟺ f(n) ∈ O(g(n)) and f(n) ∈ Ω(g(n))
f(n) ∈ o(g(n)) ⟺ lim_{n→∞} f(n)/g(n) = 0 9 / 63

14 Computational complexity of EAs - Goals

Understand how the runtime depends on:
- characteristics of the problem;
- parameters of the algorithm.

In order to:
- explain the success or the failure of these methods in practical applications;
- understand which problems are optimised (or approximated) efficiently by a given algorithm and which are not;
- guide the choice of the best algorithm for the problem at hand;
- determine the optimal parameter settings;
- aid the algorithm design. 10 / 63

18 General EAs - Evolutionary Algorithms

(µ+λ) EA
Initialise P_0 with µ individuals chosen uniformly at random from {0,1}^n.
for t = 0, 1, 2, ... until stopping condition met do
  Create λ new individuals, each by:
    choosing x ∈ P_t uniformly at random;
    flipping each bit in x with probability p.
  Create the new population P_{t+1} by choosing the best µ of the µ + λ individuals.
end for

- If µ = λ = 1, then we get the (1+1) EA;
- p = 1/n is generally considered a good parameter setting [Bäck, 1993, Droste et al., 1998];
- By introducing stochastic selection and crossover we obtain a Genetic Algorithm (GA). 11 / 63

23 (1+1) EA and RLS - (1+1) Evolutionary Algorithm

(1+1) EA
Initialise x uniformly at random from {0,1}^n.
repeat
  Create x' by flipping each bit in x with probability p = 1/n.
  if f(x') ≥ f(x) then x ← x'.
until stopping condition met.

If only one bit is flipped per iteration: Random Local Search (RLS).

How does it work? Given x, how many bits will flip in expectation? Let X_i indicate whether bit i flips, so E[X_i] = 1·(1/n) + 0·(1 − 1/n) = 1/n. Then
E[X] = E[X_1 + X_2 + ... + X_n] = E[X_1] + E[X_2] + ... + E[X_n] = Σ_{i=1}^n 1/n = n/n = 1
(in general, for mutation rate p, E[X] = np). 12 / 63
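The (1+1) EA above is straightforward to simulate. Below is a minimal Python sketch (not from the slides); the function name, the OneMax fitness and the n = 30 instance are illustrative choices.

```python
import random

def one_plus_one_ea(n, max_iters, rng):
    """(1+1) EA on OneMax: flip each bit independently with probability
    p = 1/n and accept the offspring if its fitness is at least as good."""
    x = [rng.randint(0, 1) for _ in range(n)]
    for t in range(1, max_iters + 1):
        y = [b ^ (rng.random() < 1.0 / n) for b in x]
        if sum(y) >= sum(x):   # OneMax fitness, elitist acceptance
            x = y
        if sum(x) == n:        # global optimum reached
            return t
    return None

rng = random.Random(42)
iterations = one_plus_one_ea(n=30, max_iters=200_000, rng=rng)
```

A typical run finishes within a few hundred iterations, in line with the Θ(n log n) expected runtime derived later in the tutorial.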

29 General properties - (1+1) EA

The number X of flipped bits is binomially distributed:
Pr(X = j) = C(n, j) · p^j · (1 − p)^{n−j}

What is the probability of flipping exactly one bit?
Pr(X = 1) = n · (1/n) · (1 − 1/n)^{n−1} = (1 − 1/n)^{n−1} ≈ 1/e

Is flipping two bits more likely than flipping none?
Pr(X = 2) = C(n, 2) · (1/n)² · (1 − 1/n)^{n−2} = (n(n−1)/2) · (1/n)² · (1 − 1/n)^{n−2} = (1/2)(1 − 1/n)^{n−1} ≈ 1/(2e)
while
Pr(X = 0) = C(n, 0) · (1/n)⁰ · (1 − 1/n)^n ≈ 1/e.
No: flipping no bit is (roughly twice) more likely. 13 / 63
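These probabilities are easy to check numerically; a small sketch (the n = 1000 instance is an example value, not from the slides):

```python
from math import comb, exp

def pr_flips(n, j):
    """Pr(X = j): exactly j of n bits flip under bit-wise mutation rate p = 1/n."""
    p = 1.0 / n
    return comb(n, j) * p**j * (1 - p)**(n - j)

n = 1000
p0, p1, p2 = pr_flips(n, 0), pr_flips(n, 1), pr_flips(n, 2)
# p0 and p1 are both close to 1/e, p2 is close to 1/(2e):
# flipping no bit is indeed about twice as likely as flipping two.
```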

30 General properties - Running Example: Functions of Unitation

g(x) = f(Σ_{i=1}^n x_i), where f : ℝ → ℝ 14 / 63

32 General properties - Running Example: Functions of Unitation

The toy problems are built from r sub-functions (blocks):
f(x) = Σ_{i=1}^r f_i(x) 14 / 63

33 General properties - Linear Unitation Block

f(|x|) = a·|x| + b, if k < n − |x| ≤ k + m; 0, otherwise. 15 / 63

34 General properties - Toy Problem Framework: Gap

f(|x|) = a, if n − |x| = k + m; 0, otherwise. 16 / 63

35 General properties - Toy Problem Framework: Plateau

f(|x|) = a, if k < n − |x| ≤ k + m; 0, otherwise. 17 / 63

36 General properties - Upper bound on the total runtime

f(x) = Σ_{i=1}^r f_i(x)

Assumptions:
- r sub-functions f_1, f_2, ..., f_r;
- T_i: time to optimise sub-function f_i;
- the evolutionary algorithm is elitist.

By linearity of expectation, an upper bound on the expected runtime is
E[T] ≤ E[Σ_{i=1}^r T_i] = Σ_{i=1}^r E[T_i]. 18 / 63

38 The gap sub-problem - Gap block: upper and lower bounds

f(|x|) = a, if n − |x| = k + m; 0, otherwise.

The probability p of optimising a gap block of length m at position k is
C(m+k, m) · (1/n)^m · (1 − 1/n)^{n−m}, hence
(1/e) · ((m+k)/(mn))^m ≤ p ≤ ((m+k)e/(mn))^m,
using (n/k)^k ≤ C(n, k) ≤ (en/k)^k for 1 ≤ k ≤ n.

The expected time to optimise the gap block is E[T] = 1/p, so
(mn/(e(m+k)))^m ≤ E[T] ≤ e · (mn/(m+k))^m. 19 / 63

39 Tail Inequalities

The expectation E[X] can often be estimated easily. We would also like to know the probability of deviating far from the expectation, i.e. the tails of the distribution. Tail inequalities give bounds on the tails given the expectation. 20 / 63

44 Markov's inequality

A fundamental inequality from which many others are derived [Motwani and Raghavan, 1995].

Theorem (Markov's Inequality)
Let X be a random variable assuming only non-negative values. Then for all t ∈ ℝ⁺,
Pr(X ≥ t) ≤ E[X]/t.

Number of bits flipped in a mutation step: if E[X] = 1, then Pr(X ≥ 2) ≤ E[X]/2 = 1/2.
Number of one-bits after initialisation: if E[X] = n/2, then Pr(X ≥ (2/3)n) ≤ E[X]/((2/3)n) = (n/2)/((2/3)n) = 3/4. 21 / 63

48 Chernoff bounds

Let X_1, X_2, ..., X_n be independent Poisson trials, each with success probability p_i. For X = Σ_{i=1}^n X_i the expectation is E(X) = Σ_{i=1}^n p_i.

Theorem (Chernoff Bounds)
- Pr(X ≤ (1 − δ)E[X]) ≤ exp(−E[X]δ²/2) for 0 ≤ δ ≤ 1.
- Pr(X > (1 + δ)E[X]) ≤ (e^δ/(1+δ)^{1+δ})^{E[X]} for δ > 0.

What is the probability that we have more than (2/3)n one-bits at initialisation?
p_i = 1/2 and E[X] = n/2; we fix δ = 1/3, so (1 + δ)E[X] = (2/3)n. Then:
Pr(X > (2/3)n) ≤ (e^{1/3}/(4/3)^{4/3})^{n/2} = c^{n/2} for a constant c < 1, i.e. exponentially small. 22 / 63

51 Chernoff bounds - Chernoff Bound Simple Application

Bitstring of length n = 100 with Pr(X_i = 1) = 1/2, so E(X) = np = 100/2 = 50.
What is the probability of having at least 75 one-bits?
- Markov: Pr(X ≥ 75) ≤ 50/75 = 2/3.
- Chernoff: Pr(X ≥ (1 + 1/2)·50) ≤ (e^{1/2}/(3/2)^{3/2})^{50} < 0.005.
- Truth: Pr(X ≥ 75) = Σ_{i=75}^{100} C(100, i) · 2^{−100} < 10^{−6}. 23 / 63
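The three estimates on this slide can be reproduced directly; a quick sketch (the variable names are mine):

```python
from math import comb, exp

n, k = 100, 75
mean = n / 2                       # E[X] = 50

# Markov: Pr(X >= 75) <= E[X]/75
markov = mean / k

# Chernoff: Pr(X >= (1 + delta) E[X]) <= (e^delta / (1+delta)^(1+delta))^E[X]
delta = k / mean - 1               # delta = 1/2
chernoff = (exp(delta) / (1 + delta) ** (1 + delta)) ** mean

# Exact binomial tail
exact = sum(comb(n, i) for i in range(k, n + 1)) / 2**n
```

The exact tail is several orders of magnitude below the Chernoff bound, which in turn is far below the Markov bound.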

52 AFL method for upper bounds - Onemax

Onemax(x) := x_1 + x_2 + ... + x_n = Σ_{i=1}^n x_i
(The fitness grows linearly with the number of one-bits.) 24 / 63

53 AFL method for upper bounds - Fitness-based Partitions

Definition
A tuple (A_1, A_2, ..., A_m) is an f-based partition of f : X → ℝ if
1 A_1 ∪ A_2 ∪ ... ∪ A_m = X
2 A_i ∩ A_j = ∅ for i ≠ j
3 f(A_1) < f(A_2) < ... < f(A_m)
4 f(A_m) = max_x f(x)

Example: partition of Onemax into n + 1 levels, A_j := {x ∈ {0,1}^n | Onemax(x) = j}. 25 / 63

54 AFL method for upper bounds - Artificial Fitness Levels: Upper bounds

s_i: probability of starting in A_i;
u_i: probability of jumping from A_i to any A_j, j > i;
T_i: time to jump from A_i to any A_j, j > i.

Expected runtime:
E[T] ≤ Σ_{i=1}^{m−1} s_i · E[Σ_{j=i}^{m−1} T_j]
     = Σ_{i=1}^{m−1} s_i · Σ_{j=i}^{m−1} E[T_j]
     = Σ_{i=1}^{m−1} s_i · Σ_{j=i}^{m−1} 1/u_j ≤ Σ_{j=1}^{m−1} 1/u_j. 26 / 63

58 AFL method for upper bounds - (1+1) EA on Onemax

Theorem: The expected runtime of the (1+1) EA on Onemax is O(n ln n).

Proof
The current solution is in level A_j if it has j ones (hence n − j zeroes). To reach a higher fitness level it is sufficient to flip a zero into a one and leave the other bits unchanged, which occurs with probability
u_j ≥ (n − j) · (1/n) · (1 − 1/n)^{n−1} ≥ (n − j)/(en).
Then by Artificial Fitness Levels
E[T] ≤ Σ_{j=0}^{n−1} 1/u_j ≤ Σ_{j=0}^{n−1} en/(n − j) = en · Σ_{i=1}^n 1/i ≤ en(ln n + 1) = O(n ln n). 27 / 63
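The level sum in this proof can be evaluated numerically; a small sketch confirming that it stays below the closed form en(ln n + 1) (the sample values of n are mine):

```python
from math import e, log

def afl_level_sum(n):
    """Sum over the OneMax fitness levels of 1/u_j, with u_j >= (n - j)/(e*n)."""
    return sum(e * n / (n - j) for j in range(n))

# pairs of (level sum, closed-form bound en(ln n + 1)) for several n
bounds = {n: (afl_level_sum(n), e * n * (log(n) + 1)) for n in (10, 100, 1000)}
```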

62 AFL method for upper bounds - Linear Unitation Block: Upper bound

Theorem: The expected runtime of the (1+1) EA for a linear block is O(n ln((m + k)/k)).

Proof
Let i := n − j be the number of 0-bits in level A_j. The probability of improving is
u_i ≥ i · (1/n) · (1 − 1/n)^{n−1} ≥ i/(en), hence 1/u_i ≤ en/i.
Then (Artificial Fitness Levels):
E(T) ≤ Σ_{i=k+1}^{k+m} 1/u_i ≤ Σ_{i=k+1}^{k+m} en/i = en · Σ_{i=k+1}^{k+m} 1/i ≤ en ln((m + k)/k). 28 / 63

63 AFL for lower bounds - Artificial Fitness Levels: Lower bounds (2)

Theorem ([Sudholt, 2010])
Let s_i be the probability of starting in A_i, u_i the probability of leaving A_i, and p_ij the probability of jumping from A_i to A_j.
If there exists a χ ∈ [0, 1) such that for all i < j ≤ m − 1,
p_ij ≤ χ · Σ_{k=j}^{m−1} p_ik,
then
E[T] ≥ χ · Σ_{i=1}^{m−1} s_i · Σ_{j=i}^{m−1} 1/u_j.

(2) A different version of the theorem is presented here. 29 / 63

64 AFL for lower bounds - (1+1) EA lower bound for Onemax

Fitness level A_i := {x ∈ {0,1}^n | Onemax(x) = i}, i.e. x has i one-bits and n − i zero-bits.

Probability p_ij of jumping from level i to level j > i:
p_ij ≤ C(n−i, j−i) · (1/n)^{j−i} · (1 − 1/n)^{n−(j−i)}
and
Σ_{k=j}^{n−1} p_ik ≥ C(n−i, j−i) · (1/n)^{j−i}.
Hence, for χ = 1/e,
p_ij ≤ (1 − 1/n)^{n−(j−i)} · Σ_{k=j}^{n−1} p_ik ≤ χ · Σ_{k=j}^{n−1} p_ik. 30 / 63

65 AFL for lower bounds - (1+1) EA lower bound for Onemax

Theorem: The expected runtime of the (1+1) EA for Onemax is Ω(n ln n).

Probability u_i of any improvement: u_i ≤ (n − i)/n, hence 1/u_i ≥ n/(n − i).
We have already seen (Markov) that Σ_{i=(2/3)n}^{n} s_i ≤ 3/4, hence

E[T] ≥ (1/e) · Σ_{i=0}^{n−1} s_i · Σ_{j=i}^{n−1} 1/u_j
     ≥ (1/e) · (Σ_{i=0}^{(2/3)n−1} s_i) · (Σ_{j=(2/3)n}^{n−1} 1/u_j)
     ≥ (1/e) · (1 − 3/4) · Σ_{j=(2/3)n}^{n−1} n/(n − j)
     = (n/(4e)) · Σ_{j=1}^{n/3} 1/j = Ω(n ln n). 31 / 63

66 AFL for lower bounds - Linear Block: Lower Bound

Theorem: The expected time to finish a linear block of length m starting at k + m 0-bits is Ω(n ln((m + k)/k)).

For 0 ≤ i ≤ m, define A_i := {x : n − |x| = k + m − i}. Note that
p_ij ≤ C(k+m−i, j−i) · (1/n)^{j−i} · (1 − 1/n)^{n−(j−i)}
and
Σ_{l=j}^{m−1} p_il ≥ C(k+m−i, j−i) · (1/n)^{j−i}.
Therefore,
p_ij ≤ (1 − 1/n)^{n−(j−i)} · Σ_{l=j}^{m−1} p_il ≤ (1/e) · Σ_{l=j}^{m−1} p_il,
and assuming s_0 = 1 we get (with u_i ≤ (k + m − i)/n)
E[T] ≥ (1/e) · Σ_{i=0}^{m−1} 1/u_i ≥ (1/e) · Σ_{i=0}^{m−1} n/(k + m − i) = (n/e) · Σ_{i=k+1}^{m+k} 1/i ≥ (n/e) ln((m + k)/k). 32 / 63

67 AFL for non-elitist EAs - Advanced: Fitness levels for non-elitist populations

for t = 0, 1, 2, ... until termination condition do
  for i = 1 to λ do
    Sample the i-th parent x according to p_sel(P_t, f)
    Sample the i-th offspring P_{t+1}(i) according to p_var(x)
  end for
end for

A general algorithmic scheme for non-elitist EAs:
- f : X → ℝ, fitness function over an arbitrary finite search space X;
- p_sel, selection mechanism (e.g. (µ,λ)-selection);
- p_var, variation operator (e.g. mutation). 33 / 63
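The scheme above can be sketched in a few lines of Python. Here (µ,λ)-selection and the OneMax test function are my illustrative choices for p_sel, p_var and f; nothing else is assumed from the slides.

```python
import random

def non_elitist_ea(n, mu, lam, chi, max_gens, rng):
    """Non-elitist scheme: p_sel draws parents uniformly from the mu fittest
    of the previous generation ((mu,lambda)-selection); p_var is bit-wise
    mutation with rate chi/n; f is OneMax."""
    f = sum                                                     # OneMax
    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(lam)]
    best = max(map(f, pop))
    for _ in range(max_gens):
        parents = sorted(pop, key=f, reverse=True)[:mu]         # p_sel pool
        pop = [[b ^ (rng.random() < chi / n) for b in rng.choice(parents)]
               for _ in range(lam)]                             # p_var
        best = max(best, max(map(f, pop)))
        if best == n:
            break
    return best

rng = random.Random(1)
best = non_elitist_ea(n=15, mu=5, lam=20, chi=1.0, max_gens=3000, rng=rng)
```

Note that no individual survives between generations: progress relies entirely on the selective pressure λ/µ = 4 > e, matching the condition used on the following slides.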

69 AFL for non-elitist EAs - Advanced: Fitness Levels for non-elitist Populations

Theorem ([Dang and Lehre, 2014])
If there exist δ, γ₀, s_1, ..., s_m, s⋆, p_0 ∈ (0, 1) such that
(C1) p_var(y ∈ A⁺_j | x ∈ A_j) ≥ s_j ≥ s⋆ (upgrade probability s_j)
(C2) p_var(y ∈ A_j ∪ A⁺_j | x ∈ A_j) ≥ p_0 (resting probability p_0)
(C3) β(γ) ≥ γ(1 + δ)/p_0 for all γ ≤ γ₀ (high selective pressure)
(C4) λ ≥ c ln(m/s⋆) for some constant c (large population size)
then for a constant c' > 0,
E[T] ≤ c' · (mλ ln λ + Σ_{j=1}^{m−1} 1/s_j). 34 / 63

70 AFL for non-elitist EAs - Example: (µ,λ) EA on LeadingOnes

x = [leading 1-bits][first 0-bit][random bitstring]

LeadingOnes(x) = Σ_{i=1}^n Π_{j=1}^i x_j

Theorem: If λ/µ > e and λ > c ln n, then the expected runtime of the (µ,λ) EA on LeadingOnes is O(nλ ln λ + n²). 35 / 63

71 AFL for non-elitist EAs - Measuring Selective Pressure

Definition
Let x^(1), x^(2), ..., x^(λ) be the individuals in a population P ∈ X^λ, sorted according to a fitness function f : X → ℝ, i.e. f(x^(1)) ≥ f(x^(2)) ≥ ... ≥ f(x^(λ)). For any γ ∈ (0, 1), the cumulative selection probability of p_sel is
β(γ) := Pr(f(y) ≥ f(x^(γλ)) | y is sampled from p_sel(P, f)). 36 / 63

73 AFL for non-elitist EAs - Cumulative Selection Probability: Example

(µ,λ)-selection: parents are chosen uniformly at random among the µ fittest of the λ individuals. Then
β(γ) = Pr(f(y) ≥ f(x^(γλ)) | y is sampled from p_sel(P, f)) = γλ/µ, if γλ ≤ µ (and 1 otherwise). 37 / 63

75 AFL for non-elitist EAs - Example Application (3)

(µ,λ) EA with bit-wise mutation rate χ/n on LeadingOnes. Partition the fitness function into m := n + 1 levels:
A_j := {x ∈ {0,1}^n | x_1 = x_2 = ... = x_{j−1} = 1 and x_j = 0}

If λ/µ > e^χ and λ > c ln(n), then
(C1) p_var(y ∈ A⁺_j | x ∈ A_j) = Ω(1/n) =: s_j =: s⋆
(C2) p_var(y ∈ A_j ∪ A⁺_j | x ∈ A_j) ≥ e^{−χ} =: p_0
(C3) β(γ) ≥ γλ/µ > γe^χ = γ/p_0
(C4) λ > c ln(n) ≥ c' ln(m/s⋆)
then E[T] = O(mλ ln λ + Σ_{j=1}^m 1/s_j) = O(nλ ln λ + n²).

(3) Calculations on this slide are approximate. See [Dang and Lehre, 2014] for exact calculations. 38 / 63

76 AFL for non-elitist EAs - Artificial Fitness Levels: Conclusions

- It is a powerful general method to obtain (often) tight upper bounds on the runtime of simple EAs;
- For offspring populations, tight bounds can often be achieved with the general method;
- There exist variants (4) of artificial fitness levels for populations [Dang and Lehre, 2014], including genetic algorithms using crossover [Corus et al., 2014].

(4) See another tutorial at pkl/populations/ 39 / 63

78 What is Drift (5) Analysis?

Prediction of the long-term behaviour of a process X (hitting time, stability, occupancy time, etc.) from properties of its one-step changes (the drift).

(5) NB! (Stochastic) drift is a different concept than genetic drift in population genetics. 40 / 63

80 Additive Drift Theorem

Consider a distance function Y_k = d(X_k) taking values in [0, b], and the conditions
(C1+) for all k: E[Y_k − Y_{k+1} | Y_k > 0] ≥ ε
(C1−) for all k: E[Y_k − Y_{k+1} | Y_k > 0] ≤ ε

Theorem ([He and Yao, 2001, Jägersküpper, 2007, Jägersküpper, 2008])
Given a stochastic process Y_1, Y_2, ... over an interval [0, b] ⊆ ℝ, define T := min{k ≥ 0 | Y_k = 0} and assume E[T] < ∞.
- If (C1+) holds for an ε > 0, then E[T | Y_0] ≤ b/ε.
- If (C1−) holds for an ε > 0, then E[T | Y_0] ≥ Y_0/ε. 41 / 63
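The additive drift theorem is easy to see in action on a biased random walk; a small sketch with illustrative parameters (b = 100 and ε = 1/2 are my choices):

```python
import random

def hitting_time(b, eps, rng):
    """Walk on {0, ..., b} starting at Y_0 = b: one step towards 0 with
    probability (1 + eps)/2, otherwise one step away (capped at b).
    The drift towards 0 is at least eps everywhere, so condition (C1+)
    holds. Returns the hitting time of 0."""
    y, t = b, 0
    while y > 0:
        y = y - 1 if rng.random() < (1 + eps) / 2 else min(y + 1, b)
        t += 1
    return t

rng = random.Random(0)
b, eps = 100, 0.5
mean_T = sum(hitting_time(b, eps, rng) for _ in range(200)) / 200
# the theorem predicts E[T | Y_0] <= b/eps = 200
```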

81 Additive Drift Theorem - Plateau Block: Upper Bound

Let k > n/2 + ɛn.
PlateauBlock(|x|) = a, if k ≤ n − |x| ≤ k + m; 0, otherwise.

Theorem: The expected time for the (1+1) EA to optimise the Plateau function is O(m).

Proof
Let X_t be the number of 0-bits at time t. Then the drift is
E[Δ(t)] ≥ X_t/n − (n − X_t)/n = (2X_t − n)/n ≥ (2k − n)/n = 2k/n − 1.
Hence, by drift analysis,
E[T] ≤ m/(2k/n − 1) = mn/(2k − n) = O(m),
where the last equality holds as long as k > n/2 + ɛn. 42 / 63

82 Additive Drift Theorem - Plateau Block: Lower Bound

Let k > n/2 + ɛn.
PlateauBlock(|x|) = a, if k ≤ n − |x| ≤ k + m; 0, otherwise.

Theorem: The expected time for the (1+1) EA to optimise the Plateau function is Θ(m).

Proof
Let X_t be the number of 0-bits at time t. Then the drift is
E[Δ(t)] ≤ X_t/n − (n − X_t)/n = (2X_t − n)/n ≤ (2(m + k) − n)/n.
Hence, by drift analysis,
E[T] ≥ m/(2(m + k)/n − 1) = mn/(2(m + k) − n) = Ω(m),
where the last equality holds as long as k > n/2 + ɛn. 43 / 63

85 Multiplicative Drift Theorem - Drift Analysis for Onemax

Let's calculate the runtime of the (1+1) EA using the additive drift theorem.
1 Let d(X_t) = i, where i is the number of zeroes in the bitstring;
2 Note that d(X_t) − d(X_{t+1}) ≥ 0 for all t;
3 The distance decreases by 1 whenever a 0-bit is flipped and the ones remain unchanged:
E(Δ(t)) = E[d(X_t) − d(X_{t+1}) | X_t] ≥ i · (1/n) · (1 − 1/n)^{n−1} ≥ i/(en) ≥ 1/(en) =: δ
4 The expected initial distance is E(d(X_0)) = n/2. The expected runtime is
E(T | d(X_0) > 0) ≤ E[d(X_0)]/δ ≤ (n/2)/(1/(en)) = (e/2) · n² = O(n²).
We need a different distance function! 44 / 63

88 Multiplicative Drift Theorem - Drift Analysis for Onemax

1 Let d(X_t) = ln(i + 1), where i is the number of zeroes in the bitstring;
2 For x ≥ 1, it holds that ln(1 + 1/x) ≥ 1/x − 1/(2x²) ≥ 1/(2x);
3 The distance decreases whenever a 0-bit is flipped and the ones remain unchanged:
E[Δ(t)] = E[d(X_t) − d(X_{t+1}) | d(X_t) = ln(i + 1), i ≥ 1]
        ≥ (i/(en)) · (ln(i + 1) − ln(i)) = (i/(en)) · ln(1 + 1/i) ≥ (i/(en)) · (1/(2i)) = 1/(2en) =: δ.
4 The initial distance is d(X_0) ≤ ln(n + 1). The expected runtime is
E(T | d(X_0) > 0) ≤ d(X_0)/δ ≤ ln(n + 1)/(1/(2en)) = 2en ln(n + 1) = O(n ln n).
If the amount of progress is proportional to the distance from the optimum, we can use a logarithmic distance! 45 / 63

89 Multiplicative Drift Theorem Multiplicative Drift Theorem
Theorem (Multiplicative Drift, [Doerr et al., 2010a])
Let {X_t}_{t ∈ ℕ₀} be random variables describing a Markov process over a finite state space S ⊆ ℝ. Let T be the random variable that denotes the earliest point in time t ∈ ℕ₀ such that X_t = 0. If there exist δ, c_min, c_max > 0 such that
1. E[X_t - X_{t+1} | X_t] >= δX_t and
2. c_min <= X_t <= c_max,
for all t < T, then
E[T] <= (2/δ) · ln(1 + c_max/c_min) 46 / 63


91 Multiplicative Drift Theorem (1+1)-EA Analysis for Onemax
Theorem
The expected time for the (1+1)-EA to optimise Onemax is O(n ln n).
Proof
Distance: let X_t be the number of zeroes in step t;
E[X_{t+1} | X_t] <= X_t - X_t/(en) = X_t · (1 - 1/(en))
Hence,
E[X_t - X_{t+1} | X_t] >= X_t - X_t · (1 - 1/(en)) = X_t/(en)   (δ = 1/(en))
1 = c_min <= X_t <= c_max = n
E[T] <= (2/δ) · ln(1 + c_max/c_min) = 2en · ln(1 + n) = O(n ln n) 47 / 63
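The multiplicative drift condition E[X_t - X_{t+1} | X_t] >= X_t/(en) can also be estimated by sampling mutations from a fixed parent. The sketch below (sample size and the chosen state i are arbitrary illustrative values) measures the empirical drift of the number of zero-bits:

```python
import math
import random

def empirical_drift(n, i, samples=50_000):
    """Estimate E[X_t - X_{t+1} | X_t = i] for the (1+1)-EA on Onemax,
    where X_t counts zero-bits, from a fixed parent with i zeros."""
    x = [0] * i + [1] * (n - i)
    total = 0.0
    for _ in range(samples):
        y = [b ^ 1 if random.random() < 1.0 / n else b for b in x]
        if sum(y) >= sum(x):          # offspring accepted iff not worse on Onemax
            total += i - y.count(0)   # change in the number of zeros
    return total / samples

n, i = 100, 20
drift = empirical_drift(n, i)
print(f"empirical drift: {drift:.4f}, lower bound i/(en): {i / (math.e * n):.4f}")
```

The measured value exceeds i/(en) because mutations flipping several zeros at once add further positive drift beyond the single-bit-flip event counted in the proof.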


93 Multiplicative Drift Theorem Linear Unitation Block: Upper Bound
Theorem
The expected time for the (1+1)-EA to optimise the Linear Unitation Block is O(n ln((m + k)/k)).
Proof
Distance: let X_t be the number of zeroes;
E[X_{t+1} | X_t] <= X_t - X_t/(en) = X_t · (1 - 1/(en))
Hence,
E[X_t - X_{t+1} | X_t] >= X_t - X_t · (1 - 1/(en)) = X_t/(en)   (δ := 1/(en))
k = c_min <= X_t <= c_max = m + k
E[T] <= (2/δ) · ln(1 + c_max/c_min) = 2en · ln(1 + (m + k)/k) = O(n ln((m + k)/k)) 48 / 63

94 Simplified Negative Drift Theorem Simplified Drift Theorem
[Figure: interval (a, b) between the start and the target; inside the interval the drift points away from the target, and there are no large jumps towards the target.]
Theorem (Simplified Negative-Drift Theorem, [Oliveto and Witt, 2011])
Suppose there exist three constants δ, ɛ, r such that for all t >= 0:
1. E[Δ_t(i)] >= ɛ for a < i < b,
2. Prob(Δ_t(i) = -j) <= 1/(1 + δ)^{j-r} for i > a and j >= 1.
Then there is a constant c > 0 such that
Prob(T <= 2^{c(b-a)}) = 2^{-Ω(b-a)} 49 / 63


96 Simplified Negative Drift Theorem Needle in a Haystack
Theorem (Oliveto, Witt, Algorithmica 2011)
Let η > 0 be constant. Then there is a constant c > 0 such that with probability 1 - 2^{-Ω(n)} the (1+1)-EA on Needle creates only search points with at most n/2 + ηn ones in 2^{cn} steps.
Proof Idea
By Chernoff bounds, the probability that the initial bit string has less than n/2 - γn zeroes is e^{-Ω(n)}; we set b := n/2 - γn and a := n/2 - 2γn, where γ := η/2.
Proof of Condition 1:
E[Δ(i)] = (n - i)/n - i/n = (n - 2i)/n >= 2γ =: ɛ
Proof of Condition 2:
Pr(Δ(i) = -j) <= C(n, j) · (1/n)^j <= (n^j/j!) · (1/n)^j = 1/j! <= (1/2)^{j-1}
This proves Condition 2 by setting δ = r = 1. 50 / 63
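Because every offspring ties on Needle until the needle itself is hit, the (1+1)-EA accepts every mutation and performs an unbiased random walk over {0,1}^n. A short simulation (n, η and the step count below are illustrative choices) shows the number of ones staying well below n/2 + ηn, as the theorem predicts:

```python
import random

def needle_walk_max_ones(n, steps):
    """(1+1)-EA on Needle before the needle is found: all offspring have equal
    fitness, so every mutation is accepted (a lazy random walk on {0,1}^n).
    Returns the maximum number of ones observed along the walk."""
    x = [random.randint(0, 1) for _ in range(n)]
    max_ones = sum(x)
    for _ in range(steps):
        x = [b ^ 1 if random.random() < 1.0 / n else b for b in x]
        max_ones = max(max_ones, sum(x))
    return max_ones

n, eta = 60, 0.4  # any constant eta > 0 works; 0.4 gives a clear margin
peak = needle_walk_max_ones(n, 20_000)
print(f"max ones seen: {peak}, threshold n/2 + eta*n = {n / 2 + eta * n:.0f}")
```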

97 Simplified Negative Drift Theorem Plateau Block Function: Lower Bound
Let k + m < (1/2 - ɛ)n and
PlateauBlock_r(x) = a if k <= n - |x| <= k + m, and 0 otherwise.
Theorem
The time for the (1+1)-EA to optimise PlateauBlock_r is at least 2^{Ω(m)} with probability at least 1 - 2^{-Ω(m)}.
Proof
Let X_t be the number of 0-bits at time t. Then
E[Δ(t)] = (n - X_t)/n - X_t/n = (n - 2X_t)/n >= (n - 2(k + m))/n
If 2(k + m) < n(1 - ɛ), by the simplified drift theorem P(T < 2^{cm}) = 2^{-Ω(m)}. 51 / 63

98 Simplified Negative Drift Theorem Plateau Block Function: Upper Bound
Theorem
The expected time for the (1+1)-EA to optimise PlateauBlock_r is at most e^{O(m)}.
Proof
We calculate the probability p of m consecutive steps across the plateau:
p >= ∏_{i=1}^{m} (k + i) · (1/(en)) = ((k + m)!/k!) · (1/(en))^m
where
(k + m)!/k! = ((k + m)!/(m! k!)) · m! = C(k + m, m) · m! >= ((k + m)/m)^m · (m/e)^m = ((k + m)/e)^m
Hence,
p >= ((k + m)/e)^m · (1/(en))^m = ((k + m)/(e² n))^m
E[T] <= m · 1/p <= m · (e² n/(k + m))^m 52 / 63
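The chain of estimates above can be verified numerically for concrete values (the choices of n, k, m below are illustrative, not from the slides):

```python
import math

n, k, m = 100, 30, 10  # illustrative values with k + m < n/2

# Product form of the lower bound on p: each of the m plateau steps
# succeeds with probability at least (k + i)/(en).
p_product = 1.0
for i in range(1, m + 1):
    p_product *= (k + i) / (math.e * n)

# The closed form (k + m)!/k! * (1/(en))^m agrees with the product ...
p_closed = math.factorial(k + m) / math.factorial(k) * (math.e * n) ** (-m)
assert math.isclose(p_product, p_closed, rel_tol=1e-9)

# ... and dominates the weaker bound ((k + m)/(e^2 n))^m used in the final step.
p_weak = ((k + m) / (math.e ** 2 * n)) ** m
assert p_product >= p_weak
print(f"p >= {p_weak:.3e}; hence E[T] <= m/p <= {m / p_weak:.3e}")
```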

99 Simplified Negative Drift Theorem Some history
Origins:
Stability of equilibria in ODEs (Lyapunov, 1892)
Stability of Markov Chains (see e.g. [Meyn and Tweedie, 1993])
1982 paper by Hajek [Hajek, 1982]
Simulated annealing (1988) [Sasaki and Hajek, 1988]
Drift Analysis of Evolutionary Algorithms:
Introduced to EC in 2001 by He and Yao [He and Yao, 2001, He and Yao, 2004] (additive drift)
(1+1) EA on linear functions: O(n ln n) [He and Yao, 2001]
(1+1) EA on maximum matching by Giel and Wegener [Giel and Wegener, 2003]
Simplified drift in 2008 by Oliveto and Witt [Oliveto and Witt, 2011]
Multiplicative drift by Doerr et al. [Doerr et al., 2010b]
(1+1) EA on linear functions: en ln(n) + O(n) [Witt, 2012]
Variable drift by Johannsen [Johannsen, 2010] and Mitavskiy et al. [Mitavskiy et al., 2009]
Population drift by Lehre [Lehre, 2011]
More on drift in the GECCO 2012 tutorial by Lehre: pkl/drift 53 / 63

100 Summary of Results Summary - (1+1) EA on Functions of Unitation
Linear blocks: Θ(n ln((m + k)/k))
Gap blocks: O((nm/(m + k))^m), Ω((nm/(e(m + k)))^m)
Plateau blocks: e^{O(m)} if k < n(1/2 - ε); Θ(m) if k > n(1/2 + ε)
54 / 63

101 Overview Final Overview
Overview:
Tail Inequalities
Artificial Fitness Levels
Drift Analysis
Other Techniques (Not covered):
Family Trees [Witt, 2006]
Gambler's Ruin & Martingales [Jansen and Wegener, 2001]
Probability Generating Functions [Doerr et al., 2011]
Branching Processes [Lehre and Yao, 2012]
55 / 63

102 Further reading Further Reading 56 / 63

103 Further reading Thank you! 57 / 63

104 Further reading Acknowledgements This project has received funding from the European Union s Seventh Framework Programme for research, technological development and demonstration under grant agreement no (SAGE). This work was supported in part by the Engineering and Physical Sciences Research Council under grant no EP/M004252/1. 58 / 63

105 Further reading References I
Bäck, T. (1993). Optimal mutation rates in genetic search. In Proceedings of the Fifth International Conference on Genetic Algorithms (ICGA), pages 2–8.
Corus, D., Dang, D., Eremeev, A. V., and Lehre, P. K. (2014). Level-based analysis of genetic algorithms and other search processes. In Parallel Problem Solving from Nature - PPSN XIII - 13th International Conference, Ljubljana, Slovenia, September 13-17, 2014, Proceedings.
Dang, D.-C. and Lehre, P. K. (2014). Refined upper bounds on the expected runtime of non-elitist populations from fitness-levels. In Proceedings of the 16th Annual Conference on Genetic and Evolutionary Computation (GECCO 2014).
Doerr, B., Fouz, M., and Witt, C. (2011). Sharp bounds by probability-generating functions and variable drift. In Proceedings of the 13th Annual Conference on Genetic and Evolutionary Computation, GECCO '11, New York, NY, USA. ACM.
Doerr, B., Johannsen, D., and Winzen, C. (2010a). Multiplicative drift analysis. In Proceedings of the 12th Annual Conference on Genetic and Evolutionary Computation, GECCO '10. ACM.
Doerr, B., Johannsen, D., and Winzen, C. (2010b). Multiplicative drift analysis. In GECCO '10: Proceedings of the 12th Annual Conference on Genetic and Evolutionary Computation, New York, NY, USA. ACM.
59 / 63

106 Further reading References II
Droste, S., Jansen, T., and Wegener, I. (1998). A rigorous complexity analysis of the (1 + 1) evolutionary algorithm for separable functions with Boolean inputs. Evolutionary Computation, 6(2).
Giel, O. and Wegener, I. (2003). Evolutionary algorithms and the maximum matching problem. In Proceedings of the 20th Annual Symposium on Theoretical Aspects of Computer Science (STACS 2003).
Goldberg, D. E. (1989). Genetic Algorithms in Search, Optimization, and Machine Learning. Addison-Wesley.
Hajek, B. (1982). Hitting-time and occupation-time bounds implied by drift analysis with applications. Advances in Applied Probability, 13(3).
He, J. and Yao, X. (2001). Drift analysis and average time complexity of evolutionary algorithms. Artificial Intelligence, 127(1).
He, J. and Yao, X. (2004). A study of drift analysis for estimating computation time of evolutionary algorithms. Natural Computing, 3(1).
60 / 63

107 Further reading References III
Holland, J. H. (1992). Adaptation in Natural and Artificial Systems: An Introductory Analysis with Applications to Biology, Control, and Artificial Intelligence. The MIT Press.
Jägersküpper, J. (2007). Algorithmic analysis of a basic evolutionary algorithm for continuous optimization. Theoretical Computer Science, 379(3).
Jägersküpper, J. (2008). A blend of Markov-chain and drift analysis. In PPSN.
Jansen, T. and Wegener, I. (2001). Evolutionary algorithms - how to cope with plateaus of constant fitness and when to reject strings of the same fitness. IEEE Transactions on Evolutionary Computation, 5(6).
Johannsen, D. (2010). Random combinatorial structures and randomized search heuristics. PhD thesis, Universität des Saarlandes.
Lehre, P. K. (2011). Negative drift in populations. In Proceedings of Parallel Problem Solving from Nature (PPSN XI), volume 6238 of LNCS. Springer Berlin/Heidelberg.
61 / 63

108 Further reading References IV
Lehre, P. K. and Yao, X. (2012). On the impact of mutation-selection balance on the runtime of evolutionary algorithms. IEEE Transactions on Evolutionary Computation, 16(2).
Meyn, S. P. and Tweedie, R. L. (1993). Markov Chains and Stochastic Stability. Springer-Verlag.
Mitavskiy, B., Rowe, J. E., and Cannings, C. (2009). Theoretical analysis of local search strategies to optimize network communication subject to preserving the total number of links. International Journal of Intelligent Computing and Cybernetics, 2(2).
Motwani, R. and Raghavan, P. (1995). Randomized Algorithms. Cambridge University Press.
Oliveto, P. S. and Witt, C. (2011). Simplified drift analysis for proving lower bounds in evolutionary computation. Algorithmica, 59(3).
Reeves, C. R. and Rowe, J. E. (2002). Genetic Algorithms: Principles and Perspectives: A Guide to GA Theory. Kluwer Academic Publishers, Norwell, MA, USA.
Rudolph, G. (1998). Finite Markov chain results in evolutionary computation: A tour d'horizon. Fundamenta Informaticae, 35(1–4).
62 / 63

109 Further reading References V
Sasaki, G. H. and Hajek, B. (1988). The time complexity of maximum matching by simulated annealing. Journal of the Association for Computing Machinery, 35(2).
Sudholt, D. (2010). General lower bounds for the running time of evolutionary algorithms. In PPSN (1).
Witt, C. (2006). Runtime analysis of the (µ+1) EA on simple pseudo-Boolean functions. In GECCO '06: Proceedings of the 8th Annual Conference on Genetic and Evolutionary Computation, New York, NY, USA. ACM Press.
Witt, C. (2012). Optimizing linear functions with randomized search heuristics - the robustness of mutation. In Dürr, C. and Wilke, T., editors, 29th International Symposium on Theoretical Aspects of Computer Science (STACS 2012), volume 14 of Leibniz International Proceedings in Informatics (LIPIcs), Dagstuhl, Germany. Schloss Dagstuhl - Leibniz-Zentrum fuer Informatik.
Wolpert, D. and Macready, W. G. (1997). No free lunch theorems for optimization. IEEE Transactions on Evolutionary Computation, 1(1).
63 / 63


TECHNISCHE UNIVERSITÄT DORTMUND REIHE COMPUTATIONAL INTELLIGENCE COLLABORATIVE RESEARCH CENTER 531 TECHNISCHE UNIVERSITÄT DORTMUND REIHE COMPUTATIONAL INTELLIGENCE COLLABORATIVE RESEARCH CENTER 531 Design and Management of Complex Technical Processes and Systems by means of Computational Intelligence

More information

Introduction. Genetic Algorithm Theory. Overview of tutorial. The Simple Genetic Algorithm. Jonathan E. Rowe

Introduction. Genetic Algorithm Theory. Overview of tutorial. The Simple Genetic Algorithm. Jonathan E. Rowe Introduction Genetic Algorithm Theory Jonathan E. Rowe University of Birmingham, UK GECCO 2012 The theory of genetic algorithms is beginning to come together into a coherent framework. However, there are

More information

Runtime Analysis of the (1 + (λ, λ)) Genetic Algorithm on Random Satisfiable 3-CNF Formulas

Runtime Analysis of the (1 + (λ, λ)) Genetic Algorithm on Random Satisfiable 3-CNF Formulas Runtime Analysis of the (1 + (λ, λ)) Genetic Algorithm on Random Satisfiable 3-CNF Formulas arxiv:1704.04366v1 [cs.ne] 14 Apr 2017 Maxim Buzdalov April 17, 2017 Abstract Benjamin Doerr The (1+(λ, λ)) genetic

More information

UNIVERSITY OF DORTMUND

UNIVERSITY OF DORTMUND UNIVERSITY OF DORTMUND REIHE COMPUTATIONAL INTELLIGENCE COLLABORATIVE RESEARCH CENTER 531 Design and Management of Complex Technical Processes and Systems by means of Computational Intelligence Methods

More information

Rigorous Analyses for the Combination of Ant Colony Optimization and Local Search

Rigorous Analyses for the Combination of Ant Colony Optimization and Local Search Rigorous Analyses for the Combination of Ant Colony Optimization and Local Search Frank Neumann 1, Dirk Sudholt 2, and Carsten Witt 2 1 Max-Planck-Institut für Informatik, 66123 Saarbrücken, Germany, fne@mpi-inf.mpg.de

More information

Runtime Analysis of a Binary Particle Swarm Optimizer

Runtime Analysis of a Binary Particle Swarm Optimizer Runtime Analysis of a Binary Particle Swarm Optimizer Dirk Sudholt Fakultät für Informatik, LS 2 Technische Universität Dortmund Dortmund, Germany Carsten Witt Fakultät für Informatik, LS 2 Technische

More information

On the Approximation Ability of Evolutionary Optimization with Application to Minimum Set Cover: Extended Abstract

On the Approximation Ability of Evolutionary Optimization with Application to Minimum Set Cover: Extended Abstract Proceedings of the Twenty-Third International Joint Conference on Artificial Intelligence On the Approximation Ability of Evolutionary Optimization with Application to Minimum Set Cover: Extended Abstract

More information

Evolutionary Computation

Evolutionary Computation Evolutionary Computation Lecture Algorithm Configura4on and Theore4cal Analysis Outline Algorithm Configuration Theoretical Analysis 2 Algorithm Configuration Question: If an EA toolbox is available (which

More information

CSC 4510 Machine Learning

CSC 4510 Machine Learning 10: Gene(c Algorithms CSC 4510 Machine Learning Dr. Mary Angela Papalaskari Department of CompuBng Sciences Villanova University Course website: www.csc.villanova.edu/~map/4510/ Slides of this presenta(on

More information

REIHE COMPUTATIONAL INTELLIGENCE COLLABORATIVE RESEARCH CENTER 531

REIHE COMPUTATIONAL INTELLIGENCE COLLABORATIVE RESEARCH CENTER 531 U N I V E R S I T Y OF D O R T M U N D REIHE COMPUTATIONAL INTELLIGENCE COLLABORATIVE RESEARCH CENTER 531 Design and Management of Complex Technical Processes and Systems by means of Computational Intelligence

More information

A Computational View on Natural Evolution

A Computational View on Natural Evolution A Computational View on Natural Evolution On the Rigorous Analysis of the Speed of Adaptation Jorge Pérez Heredia Department of Computer Science University of Sheffield This dissertation is submitted for

More information

The (1+) evolutionary algorithm with self-adjusting mutation rate

The (1+) evolutionary algorithm with self-adjusting mutation rate Downloaded from orbit.dtu.dk on: Dec 15, 2017 The 1+) evolutionary algorithm with self-adjusting mutation rate Doerr, Benjamin; Witt, Carsten; Gießen, Christian; Yang, Jing Published in: Proceedings of

More information

PROBABILISTIC ANALYSIS OF THE (1 + 1)-EVOLUTIONARY ALGORITHM

PROBABILISTIC ANALYSIS OF THE (1 + 1)-EVOLUTIONARY ALGORITHM /59 PROBABILISTIC ANALYSIS OF THE (1 + 1)-EVOLUTIONARY ALGORITHM Hsien-Kuei Hwang (joint with Alois Panholzer, Nicolas Rolin, Tsung-Hsi Tsai, Wei-Mei Chen) October 13, 2017 /59 MIT: EVOLUTONARY COMPUTATION

More information

Landscapes and Other Art Forms.

Landscapes and Other Art Forms. Landscapes and Other Art Forms. Darrell Whitley Computer Science, Colorado State University Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted

More information

Evolutionary Computation

Evolutionary Computation Evolutionary Computation - Computational procedures patterned after biological evolution. - Search procedure that probabilistically applies search operators to set of points in the search space. - Lamarck

More information

REIHE COMPUTATIONAL INTELLIGENCE COLLABORATIVE RESEARCH CENTER 531

REIHE COMPUTATIONAL INTELLIGENCE COLLABORATIVE RESEARCH CENTER 531 U N I V E R S I T Y OF D O R T M U N D REIHE COMPUTATIONAL INTELLIGENCE COLLABORATIVE RESEARCH CENTER 531 Design and Management of Complex Technical Processes and Systems by means of Computational Intelligence

More information

Theory of Evolutionary Computation: A True Beginners Tutorial

Theory of Evolutionary Computation: A True Beginners Tutorial Theory of Evolutionary Computation: A True Beginners Tutorial Benjamin Doerr École Polytechnique, Paris-Saclay, France CEC 2017, Donostia San Sebastián 1 Link to the Latest Version You can always find

More information

Introduction to Black-Box Optimization in Continuous Search Spaces. Definitions, Examples, Difficulties

Introduction to Black-Box Optimization in Continuous Search Spaces. Definitions, Examples, Difficulties 1 Introduction to Black-Box Optimization in Continuous Search Spaces Definitions, Examples, Difficulties Tutorial: Evolution Strategies and CMA-ES (Covariance Matrix Adaptation) Anne Auger & Nikolaus Hansen

More information

Running Time Analysis of Multi-objective Evolutionary Algorithms on a Simple Discrete Optimization Problem

Running Time Analysis of Multi-objective Evolutionary Algorithms on a Simple Discrete Optimization Problem Running Time Analysis of Multi-objective Evolutionary Algorithms on a Simple Discrete Optimization Problem Marco Laumanns 1, Lothar Thiele 1, Eckart Zitzler 1,EmoWelzl 2,and Kalyanmoy Deb 3 1 ETH Zürich,

More information

CIS 121 Data Structures and Algorithms with Java Spring Big-Oh Notation Monday, January 22/Tuesday, January 23

CIS 121 Data Structures and Algorithms with Java Spring Big-Oh Notation Monday, January 22/Tuesday, January 23 CIS 11 Data Structures and Algorithms with Java Spring 018 Big-Oh Notation Monday, January /Tuesday, January 3 Learning Goals Review Big-Oh and learn big/small omega/theta notations Discuss running time

More information

On the Usefulness of Infeasible Solutions in Evolutionary Search: A Theoretical Study

On the Usefulness of Infeasible Solutions in Evolutionary Search: A Theoretical Study On the Usefulness of Infeasible Solutions in Evolutionary Search: A Theoretical Study Yang Yu, and Zhi-Hua Zhou, Senior Member, IEEE National Key Laboratory for Novel Software Technology Nanjing University,

More information

On Constrained Boolean Pareto Optimization

On Constrained Boolean Pareto Optimization On Constrained Boolean Pareto Optimization Chao Qian and Yang Yu and Zhi-Hua Zhou National Key Laboratory for Novel Software Technology, Nanjing University Collaborative Innovation Center of Novel Software

More information

Unit 1A: Computational Complexity

Unit 1A: Computational Complexity Unit 1A: Computational Complexity Course contents: Computational complexity NP-completeness Algorithmic Paradigms Readings Chapters 3, 4, and 5 Unit 1A 1 O: Upper Bounding Function Def: f(n)= O(g(n)) if

More information

Lecture 9 Evolutionary Computation: Genetic algorithms

Lecture 9 Evolutionary Computation: Genetic algorithms Lecture 9 Evolutionary Computation: Genetic algorithms Introduction, or can evolution be intelligent? Simulation of natural evolution Genetic algorithms Case study: maintenance scheduling with genetic

More information

Computational Complexity

Computational Complexity Computational Complexity S. V. N. Vishwanathan, Pinar Yanardag January 8, 016 1 Computational Complexity: What, Why, and How? Intuitively an algorithm is a well defined computational procedure that takes

More information

U N I V E R S I T Y OF D O R T M U N D

U N I V E R S I T Y OF D O R T M U N D U N I V E R S I T Y OF D O R T M U N D REIHE COMPUTATIONAL INTELLIGENCE COLLABORATIVE RESEARCH CENTER 531 Design and Management of Complex Technical Processes and Systems by means of Computational Intelligence

More information

Natural Computing. Lecture 11. Michael Herrmann phone: Informatics Forum /10/2011 ACO II

Natural Computing. Lecture 11. Michael Herrmann phone: Informatics Forum /10/2011 ACO II Natural Computing Lecture 11 Michael Herrmann mherrman@inf.ed.ac.uk phone: 0131 6 517177 Informatics Forum 1.42 25/10/2011 ACO II ACO (in brief) ACO Represent solution space Set parameters, initialize

More information

A Database Framework for Classifier Engineering

A Database Framework for Classifier Engineering A Database Framework for Classifier Engineering Benny Kimelfeld 1 and Christopher Ré 2 1 LogicBlox, Inc. and Technion, Israel 2 Stanford University 1 Introduction In the design of machine-learning solutions,

More information

UNIVERSITY OF DORTMUND

UNIVERSITY OF DORTMUND UNIVERSITY OF DORTMUND REIHE COMPUTATIONAL INTELLIGENCE COLLABORATIVE RESEARCH CENTER 531 Design and Management of Complex Technical Processes and Systems by means of Computational Intelligence Methods

More information

Pure Strategy or Mixed Strategy?

Pure Strategy or Mixed Strategy? Pure Strategy or Mixed Strategy? Jun He, Feidun He, Hongbin Dong arxiv:257v4 [csne] 4 Apr 204 Abstract Mixed strategy evolutionary algorithms EAs) aim at integrating several mutation operators into a single

More information

Trace Reconstruction Revisited

Trace Reconstruction Revisited Trace Reconstruction Revisited Andrew McGregor 1, Eric Price 2, and Sofya Vorotnikova 1 1 University of Massachusetts Amherst {mcgregor,svorotni}@cs.umass.edu 2 IBM Almaden Research Center ecprice@mit.edu

More information

Restarting a Genetic Algorithm for Set Cover Problem Using Schnabel Census

Restarting a Genetic Algorithm for Set Cover Problem Using Schnabel Census Restarting a Genetic Algorithm for Set Cover Problem Using Schnabel Census Anton V. Eremeev 1,2 1 Dostoevsky Omsk State University, Omsk, Russia 2 The Institute of Scientific Information for Social Sciences

More information

Essentials on the Analysis of Randomized Algorithms

Essentials on the Analysis of Randomized Algorithms Essentials on the Analysis of Randomized Algorithms Dimitris Diochnos Feb 0, 2009 Abstract These notes were written with Monte Carlo algorithms primarily in mind. Topics covered are basic (discrete) random

More information

Multi-objective Quadratic Assignment Problem instances generator with a known optimum solution

Multi-objective Quadratic Assignment Problem instances generator with a known optimum solution Multi-objective Quadratic Assignment Problem instances generator with a known optimum solution Mădălina M. Drugan Artificial Intelligence lab, Vrije Universiteit Brussel, Pleinlaan 2, B-1050 Brussels,

More information

Truncation Selection and Gaussian EDA: Bounds for Sustainable Progress in High-Dimensional Spaces

Truncation Selection and Gaussian EDA: Bounds for Sustainable Progress in High-Dimensional Spaces Truncation Selection and Gaussian EDA: Bounds for Sustainable Progress in High-Dimensional Spaces Petr Pošík Czech Technical University in Prague Faculty of Electrical Engineering, Department of Cybernetics

More information

Evolutionary Algorithms and Other Search Heuristics. Bioinspired Computation in Combinatorial Optimization

Evolutionary Algorithms and Other Search Heuristics. Bioinspired Computation in Combinatorial Optimization Bioinspired Computation in Combinatorial Optimization Algorithms and Their Computational Complexity Evolutionary Algorithms and Other Search Heuristics Most famous search heuristic: Evolutionary Algorithms

More information

Theory of Evolutionary Computation

Theory of Evolutionary Computation Theory of Evolutionary Computation Benjamin Doerr École Polytechnique, Paris-Saclay, France PPSN 2016, Edinburgh 1 Link to the Latest Version You can always find the latest version online at http://people.mpi-inf.mpg.de/~doerr/ppsn16_tutorial_theory.pdf

More information

GENETIC ALGORITHM FOR CELL DESIGN UNDER SINGLE AND MULTIPLE PERIODS

GENETIC ALGORITHM FOR CELL DESIGN UNDER SINGLE AND MULTIPLE PERIODS GENETIC ALGORITHM FOR CELL DESIGN UNDER SINGLE AND MULTIPLE PERIODS A genetic algorithm is a random search technique for global optimisation in a complex search space. It was originally inspired by an

More information

State of the art in genetic algorithms' research

State of the art in genetic algorithms' research State of the art in genetic algorithms' Genetic Algorithms research Prabhas Chongstitvatana Department of computer engineering Chulalongkorn university A class of probabilistic search, inspired by natural

More information