A) Introduction to the Simple Genetic Algorithm. B) Applications I: Applying Genetic Algorithms to Tackle NP-hard Problems.


1 Genetic Algorithms
Sami Khuri
Department of Computer Science
San José State University
One Washington Square
San José, CA, U.S.A.
khuri

2 Outline
A) Introduction to the Simple Genetic Algorithm.
B) Applications I: Applying Genetic Algorithms to Tackle NP-hard Problems.
C) Walsh and Haar Functions.
D) Applications II: Other Applications of Genetic Algorithms: Bioinformatics; Tracking a Criminal Suspect; Genetic Programming.
Aalto University: Summer

3 Introduction to GA [Gol89] Introduction to Genetic Algorithms
GAs were developed by John Holland.
GAs are search procedures based on the mechanics of natural selection and natural genetics.
GAs make few assumptions about the problem domain and can thus be applied to a broad range of problems.
GAs are randomized, but not directionless, search procedures that maneuver through complex spaces in search of optimal solutions.

4 Introduction to GA [Gol89] Comparison with other optimization procedures
Genetic Algorithm:
- works with a coding of the parameter set
- searches from a population of points
- uses payoff information: the objective function
- uses probabilistic transition rules
Traditional Algorithm:
- works with the parameters themselves
- searches from a single point
- uses auxiliary knowledge: derivatives, gradients, etc.
- is deterministic

5 Simple Genetic Algorithm [KB90]
1) Randomly generate m strings of length n each to form the initial population Pop(1). Let i ← 1.
2) Compute the fitness and relative fitness of each string in Pop(i).
3) Form the mating pool from Pop(i) by spinning the weighted roulette wheel m times.
4) Form Pop(i+1) by:
   a) randomly pairing the strings in the mating pool
   b) choosing a crossing site at random for each pair
   c) exchanging the rightmost substrings of each pair
   d) performing bit-wise mutation with a very small probability
5) Set i ← i + 1. If not done, return to step 2; otherwise the solution is the string in Pop(i) with the best fitness value.
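The steps above can be sketched as a minimal toy implementation in Python. The population size, generation count, crossover and mutation rates below are illustrative defaults, not values from the slides:

```python
import random

def simple_ga(fitness, n, m=50, generations=100, pc=0.6, pm=0.01):
    # 1) Randomly generate m strings of length n: the initial population.
    pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(m)]
    for _ in range(generations):
        # 2) Compute the fitness of each string.
        fits = [fitness(s) for s in pop]
        # 3) Mating pool: spin the weighted roulette wheel m times
        #    (a tiny epsilon guards against an all-zero-fitness population).
        pool = random.choices(pop, weights=[f + 1e-12 for f in fits], k=m)
        random.shuffle(pool)  # 4a) random pairing
        nxt = []
        for a, b in zip(pool[::2], pool[1::2]):
            if random.random() < pc:
                site = random.randint(1, n - 1)                  # 4b) crossing site
                a, b = a[:site] + b[site:], b[:site] + a[site:]  # 4c) exchange tails
            # 4d) bit-wise mutation with a very small probability
            nxt += [[1 - g if random.random() < pm else g for g in s]
                    for s in (a, b)]
        pop = nxt
    return max(pop, key=fitness)

# "One-max" toy problem: the fitness of a string is its number of ones.
best = simple_ga(fitness=sum, n=20)
```

On the one-max problem the all-ones string is optimal, and this loop reliably drives the best string close to it.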

6 SGA Operators Example: Strings Fitness Relative Fitness Roulette Wheel: 15% 25% 20% 40% Mating Pool: Aalto University: Summer

7 Crossover and Mutation
Crossover: choose 2 strings from the mating pool and choose an X-site.
Children: exchange the substrings to the right of the X-site.
Mutation: flip a randomly chosen bit.
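A minimal sketch of the two operators (the bit strings in the usage line are hypothetical examples, not the ones from the slide):

```python
import random

def crossover(p1, p2, site):
    # Exchange the rightmost substrings of the two parents at the X-site.
    return p1[:site] + p2[site:], p2[:site] + p1[site:]

def mutate(s, pm=0.01):
    # Flip each bit independently with a very small probability pm.
    return ''.join(b if random.random() >= pm else '10'[int(b)] for b in s)

c1, c2 = crossover('110101', '001010', site=3)  # -> '110010', '001101'
```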

8 The 0/1 Knapsack Problem
Objects: 1, ..., i, ..., n
Weights: w_1, ..., w_i, ..., w_n
Profits: p_1, ..., p_i, ..., p_n
Knapsack capacity: M
Maximize Σ_{i=1}^n x_i p_i subject to Σ_{i=1}^n x_i w_i ≤ M, where x_i ∈ {0,1} for 1 ≤ i ≤ n.
Each string of the population represents a possible solution. If the j-th position of the string is 1, i.e., x_j = 1, then the j-th object is in the knapsack; otherwise it is not.
A string might represent an infeasible solution: the total sum of the weights of the chosen objects exceeds the capacity of the knapsack.

9 The 0/1 Knapsack Problem
Problem Instance: Objects: Weights: Profits: Knapsack capacity: 80
String is of weight 46 and has a profit of 96.
Two alternatives for the fitness function are:
f(x) = Σ_{i=1}^n x_i p_i, where x = (x_1, x_2, ..., x_n) is a feasible solution; or
f(x) = Σ_{i=1}^n x_i p_i − penalty, for any x, where infeasible strings are penalized.
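The second (penalty) alternative could look like this in Python. The weights, profits, and the linear penalty scaling are illustrative assumptions, since the slide leaves the exact penalty term open:

```python
def knapsack_fitness(x, weights, profits, capacity):
    # f(x) = sum of chosen profits, minus a penalty when the chosen
    # weights exceed the knapsack capacity (infeasible strings).
    profit = sum(p for xi, p in zip(x, profits) if xi)
    weight = sum(w for xi, w in zip(x, weights) if xi)
    if weight <= capacity:
        return profit  # feasible: plain profit sum
    # Graded penalty (an assumed scaling): proportional to the
    # distance from feasibility, scaled by the largest profit.
    return profit - max(profits) * (weight - capacity)

# Hypothetical instance: 3 objects, capacity 10.
print(knapsack_fitness([1, 1, 0], [6, 3, 5], [10, 7, 4], 10))  # -> 17
```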

10 Sample Runs Aalto University: Summer

11 Sample Runs Aalto University: Summer

12 Algorithm GA
Algorithm GA is
  t := 0;
  initialize P(t);
  evaluate P(t);
  while not terminate P(t) do
    t := t + 1;
    P(t) := select P(t − 1);
    recombine P(t);
    mutate P(t);
    evaluate P(t);
  od
end GA.

13 Rank-Ordered Genetic Algorithm
Encode the problem instance.
Form Pop(1) by:
- randomly generating m strings of length n
- computing the fitness value of each string
- sorting the strings from highest to lowest fitness value
Set i ← 1. Then, until done (incrementing i each iteration):
- Randomly pick a pair of strings among the top 15% of Pop(i).
- Form Pop(i+1) by:
  - choosing a crossing site at random for the chosen pair
  - performing crossover
  - computing the fitness values of the new strings
  - placing the new strings in the sorted population
  - deleting 2 randomly picked strings among the bottom 15%
  - performing mutation
Solution: the string with maximum fitness in Pop(i+1).

14 Other Representations and Stochastic Operators Representations, Operators and Techniques: Integer, real and Gray code representations. Randomly generated and seeded initial populations. Inversion and mutation swap. 2-point, k-point, uniform, edge recombination, and partially mapped crossovers. Elitism. Aalto University: Summer

15 Partially Mapped Crossover (PMX)
PMX is used with permutations (e.g., the TSP).
Parents: choose 2 X-sites.
Children: swap the middle substrings, then fill in the rest using the mapping defined by the swapped substrings.
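A sketch of PMX in Python. The permutations and cut sites in the usage line are made-up examples; conflicts outside the swapped segment are resolved with the usual partial-mapping rule:

```python
def pmx(p1, p2, a, b):
    # Partially Mapped Crossover: the child takes p2's middle segment;
    # positions outside the segment come from p1, with conflicts resolved
    # by following the mapping p2[i] -> p1[i] defined on the segment.
    n = len(p1)
    child = [None] * n
    child[a:b] = p2[a:b]
    mapping = {p2[i]: p1[i] for i in range(a, b)}
    for i in list(range(a)) + list(range(b, n)):
        v = p1[i]
        while v in mapping:  # follow the chain until no conflict remains
            v = mapping[v]
        child[i] = v
    return child

child = pmx([1, 2, 3, 4, 5, 6], [4, 5, 2, 1, 6, 3], a=2, b=5)  # -> [4, 3, 2, 1, 6, 5]
```

The mapping chains guarantee the result is again a permutation.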

16 The Schema Theorem
Why does the GA work?
H is a schema, hyperplane partition, or similarity template describing a set of strings.
There are 3^l − 1 schemata in the search space, where l is the length of the encoded string.
Every string in the population belongs to 2^l − 1 different schemata.

17 The Schema Theorem
The 3- and 4-dimensional hypercubes [Whi94]

18 The Schema Theorem Implicit Parallelism When sampling a population, we are sampling more than just the strings of the population. Many competing schemata are being sampled in an implicitly parallel fashion when a single string is being processed. The new distribution of points in each schema should change according to the average fitness of the strings in the population that are in the schema. Aalto University: Summer

19 The Schema Theorem: Schema Properties
Order o(H): the number of fixed positions in the schema H. Example: o( ) = 4.
Defining length δ(H): the distance between the leftmost and rightmost fixed positions in H. Example: δ( ) = 5.

20 The Schema Theorem: The Building Block Hypothesis
When f(H) > f̄, the number of instances of H in generation t + 1 is expected to increase, provided δ(H) and o(H) are small.

21 The Schema Theorem [Hol75]
m(H, t+1) ≥ m(H, t) · (f(H, t) / f̄(t)) · [1 − p_c · δ(H)/(l − 1) − p_m · o(H)]
where
H: schema
t: generation number
f(H, t): average fitness of H
f̄(t): average fitness of the strings in the population
m(H, t): expected number of instances of H
δ(H): defining length of H
o(H): order of H
p_c: crossover probability
p_m: mutation probability
l: length of the strings

22 The Schema Theorem [Hol75]: Limitations of the Schema Theorem
Computing average fitnesses to evaluate f(H) at time t is often misleading when trying to predict f(H) after several generations.
The Schema Theorem cannot predict how a particular schema behaves over time.
Other models, of a more dynamic nature, have been proposed.

23 Application Problems Applications I Maximum Cut Problem. Minimum Tardy Task Problem. Minimum Set Covering Problem. Terminal Assignment Problem. Edge Coloring Problem. Aalto University: Summer

24 Applications: The Principles
Different approaches: knowledge-based restrictions on the search space and on the stochastic operators.
In the coming examples, only the fitness function is problem-dependent.
Strategy for Fitness Functions:
- The fitness function uses a graded penalty term.
- The penalty is a function of the distance from feasibility.
- Infeasible strings are weaker than feasible strings.
Note: an infeasible string's lifespan is quite short.

25 Maximum Cut Problem [KBH94]
Problem Instance: A weighted graph G = (V, E). V = {1, ..., n} is the set of vertices and E the set of edges; w_ij represents the weight of edge (i, j).
Feasible Solution: A set C of edges, the cut-set, containing all the edges that have one endpoint in V_0 and the other in V_1, where V_0 ∪ V_1 = V and V_0 ∩ V_1 = ∅.
Objective Function: The cut-set weight W = Σ_{(i,j)∈C} w_ij, which is the sum of the weights of the edges in C.
Optimal Solution: A cut-set that gives the maximum cut-set weight.

26 Maximum Cut Problem [KBH94]
GA encoding for the Maximum Cut Problem:
Encode the problem by using binary strings (x_1, x_2, ..., x_n), where each digit corresponds to a vertex.
The fitness function to be maximized is:
f(x) = Σ_{i=1}^{n−1} Σ_{j=i+1}^{n} w_ij [x_i(1 − x_j) + x_j(1 − x_i)].
In the absence of test problems of significantly large sizes, use scalable test problems.
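The fitness function translates directly into code; the unit-weight triangle in the usage lines is a made-up sanity check, not one of the paper's test graphs:

```python
def maxcut_fitness(x, w):
    # f(x) = sum over i < j of w[i][j] * [x_i(1-x_j) + x_j(1-x_i)]:
    # an edge contributes its weight exactly when its endpoints lie on
    # opposite sides of the partition (V_0 vs V_1).
    n = len(x)
    return sum(w[i][j] * (x[i] * (1 - x[j]) + x[j] * (1 - x[i]))
               for i in range(n - 1) for j in range(i + 1, n))

# Unit-weight triangle: any cut separating one vertex has weight 2.
w = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]
print(maxcut_fitness([1, 0, 0], w))  # -> 2
```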

27 Maximum Cut Problem [KBH94]
Test Problems for the Maximum Cut Problem
Consider randomly generated graphs of small sizes; the global optimum can be computed a priori by complete enumeration.
Construct two graphs of n = 20 vertices each, varying the probability of placing an edge between arbitrary pairs of vertices. Weights of edges are uniformly chosen in the range [0, 1].
a) Sparse graph: the probability of an edge is 0.1.
b) Dense graph: the probability of an edge is 0.9.

28 Scalable Graph Aalto University: Summer

29 Maximum Cut Problem [KBH94] The Scalable Graph For n = 10, the string (or its complement) yields the optimum cut-set with value: f = 87. Graph can be scaled up, for any even n. The optimal partition is described by the (n 2)/4-fold repetition of the bit pattern 01, followed by a 0, then another copy of the (n 2)/4-fold repetition of the bit pattern 01, followed by a 0 if n = 4k + 2 and the n/2-fold repetition of 01 if n is a multiple of 4. The objective function value is: f = (n 4) for n 4. cut100 is of size n = 100. Aalto University: Summer

30 Maximum Cut Problem [KBH94] Runs for the Maximum Cut Problem: Recall: cut is a sparse graph, and cut is a dense graph. Perform a total of 100 runs with population size: 50. Use 200 generations for the small graphs, and 1000 generations for the large one. cut cut cut100 f( x) N f( x) N f( x) N Aalto University: Summer

31 Maximum Cut Problem [KBH94]
Conclusions: Maximum Cut Problem
1. With the dense and the sparse graphs, the global optimum is found by more than two thirds of the runs.
2. For the dense and sparse graphs, the genetic algorithm performs 10^4 function evaluations, while the search space is of size 2^20. Thus, the genetic algorithm searches less than one percent of the search space.
3. Similarly for cut100. The optimum is found in six of the runs; the average value over the remaining runs is about 5% from the global optimum. Well under 1% of the search space is explored.

32 The Minimum Tardy Task Problem
Problem instance:
Tasks: 1, 2, ..., n
Lengths: l_1, l_2, ..., l_n, l_i > 0
Deadlines: d_1, d_2, ..., d_n, d_i > 0
Weights: w_1, w_2, ..., w_n, w_i > 0
Feasible solution: A one-to-one scheduling function g defined on S ⊆ T, g : S → Z⁺ ∪ {0}, that satisfies the following conditions for all i, j ∈ S:
1. If g(i) < g(j) then g(i) + l_i ≤ g(j).
2. g(i) + l_i ≤ d_i.
Objective function: The tardy task weight W = Σ_{i ∈ T−S} w_i, which is the sum of the weights of the unscheduled tasks.
Optimal solution: The schedule S with the minimum tardy task weight W.

33 The Minimum Tardy Task Problem Fact S is feasible if and only if the tasks in S can be scheduled in increasing order by deadline without violating any deadline. Example Tasks: Lengths: Deadlines: Weights: Aalto University: Summer
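The fact gives a direct feasibility test: sort S by deadline and simulate. The three-task instance below is hypothetical (the slide's own numbers did not survive extraction):

```python
def is_feasible(S, lengths, deadlines):
    # S is feasible iff its tasks, scheduled back to back in increasing
    # order of deadline, all finish by their deadlines.
    t = 0
    for i in sorted(S, key=lambda i: deadlines[i]):
        t += lengths[i]
        if t > deadlines[i]:
            return False
    return True

# Hypothetical instance: lengths (2, 4, 3), deadlines (3, 5, 10).
print(is_feasible({0, 2}, [2, 4, 3], [3, 5, 10]))  # -> True
print(is_feasible({0, 1}, [2, 4, 3], [3, 5, 10]))  # -> False (2 + 4 > 5)
```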

34 The Minimum Tardy Task Problem Example Tasks: Lengths: Deadlines: Weights: S = {1,3,5,6} S is feasible with W = 74 S = {2,3,4,6,8} g(4) + l 4 = and d 4 = 8 i.e. g(4) + l 4 > d 4 Therefore S is infeasible. Aalto University: Summer

35 The Minimum Tardy Task Problem Example Tasks: Lengths: Deadlines: Weights: Fitness Function String Term 1 Term 2 Term w 1 + w 5 + w 6 8 i=1 w i w 4 +w 7 + w w 1 + w 5 + w 7 8 i=1 w i w w 1 + w 2 + w w 6 + w 7 + w w 1 + w 5 + w 6 + w 7 8 i=1 w i w w 1 + w 5 + w 7 + w 8 8 i=1 w i w 4 Aalto University: Summer

36 GA Specifications: The Minimum Tardy Task Problem
S can be represented by a vector x = (x_1, x_2, ..., x_n) where x_i ∈ {0,1}. The presence of task i in S means that x_i = 1.
The fitness function to be minimized is the sum of three terms:
f(x) = Σ_{i=1}^n w_i (1 − x_i) + (1 − s) Σ_{i=1}^n w_i + Σ_{i=1}^n w_i x_i · 1_{R⁺}(l_i + Σ_{j=1}^{i−1} l_j x_j − d_i).
In the third term, x_j is for schedulable jobs only. The whole term keeps checking the string to see whether a task could have been scheduled. It makes use of the indicator function:
1_A(t) = 1 if t ∈ A, and 0 otherwise.
s = 1 when x is feasible, and s = 0 when x is infeasible.

37 Scalable Problem Instance: The Minimum Tardy Task Problem
Construct a problem instance of size n = 5 that is scalable.
Tasks: Lengths: Deadlines: Weights:
To construct an MTTP instance of size n = 5t from the above model:
The first five tasks of the large problem are identical to the 5-task model instance.
The length l_j, deadline d_j, and weight w_j of the j-th task, for j = 1, 2, ..., n, are given by:
l_j = l_i, d_j = d_i + 24m, and
w_j = w_i if j ≡ 3 (mod 5) or j ≡ 4 (mod 5), and w_j = (m + 1) w_i otherwise,
where j ≡ i (mod 5) for i = 1, 2, 3, 4, 5 and m = ⌊j/5⌋.
The tardy task weight of the globally optimal solution of this problem is 2n.

38 The Minimum Tardy Task Problem Runs for the Minimum Tardy Task Problem mttp20 Population Size: 50 Number of Generations: 200 Crossover: Uniform Probability of Crossover: 0.6 Mutation: Swap Probability of Mutation: 0.05 A total of 100 runs is performed. f 1 ( x) d( x, opt) N Aalto University: Summer

39 mttp100 The Minimum Tardy Task Problem Scalable problem of size n = 100. Population Size: 200 Number of Generations: 100 Crossover: Uniform Probability of Crossover: 0.6 Mutation: Swap Probability of Mutation: 0.05 Note: About % of the search space is investigated. f 2 ( x) replaces infeasible strings by feasible ones. f 1 ( x) d( x, opt) N f 2 ( x) d( x, opt) N Aalto University: Summer

40 The Minimum Set-Covering Problem
Problem instance: A set of objects E = {e_1, ..., e_m} and a family of subsets of E, F = {S_1, ..., S_n}, such that ∪_{i=1}^n S_i = E. Each S_i costs c_i, where c_i > 0 for i = 1, ..., n.
Or, equivalently, a 0/1 m × n matrix A = (a_ij), with 1 ≤ i ≤ m and 1 ≤ j ≤ n, such that Σ_{j=1}^n a_ij ≥ 1 for i = 1, 2, ..., m.

41 The Minimum Set-Covering Problem
Feasible solution: A vector x = (x_1, x_2, ..., x_n), where x_i ∈ {0,1}, such that Σ_{j=1}^n a_ij x_j ≥ 1 for i = 1, 2, ..., m.
Objective function: A cost function P(x) = Σ_{j=1}^n c_j x_j, where x = (x_1, x_2, ..., x_n) is a feasible solution.
Optimal solution: A feasible vector that gives the minimum cost, i.e., a vector that minimizes the objective function.

42 The Minimum Set-Covering Problem: Fitness Function
B = Σ_{i=1}^n c_i.
U: the set of elements of E that the subsets represented by x fail to cover.
The function considers the minimum cost required to transform an infeasible string into a feasible one:
f(x) = Σ_{i=1}^n c_i x_i + s(B + Σ_{i=1}^m c̄_i)
where
s = 0 if x is feasible, and s = 1 if x is infeasible,
and
c̄_i = min{c_j | e_i ∈ S_j} if e_i ∈ U, and c̄_i = 0 if e_i ∉ U.

43 The Minimum Set-Covering Problem
Repair heuristic: Transform an infeasible string into a feasible one by changing x_j = 0 to x_j = 1 when c̄_i = min{c_j | e_i ∈ S_j} is computed.
The repaired string is always feasible; the fitness function in this case is simply f′(x) = Σ_{i=1}^n c_i x_i.
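A sketch of the minimum-cost repair in Python. The instance in the usage lines is made up, and elements are visited in a fixed (sorted) order, which the slide does not specify:

```python
def repair(x, subsets, costs, universe):
    # For each element the string fails to cover, switch on the cheapest
    # subset that contains it, so the repaired string is always feasible.
    x = list(x)
    covered = set().union(*[s for s, xi in zip(subsets, x) if xi])
    for e in sorted(universe):
        if e not in covered:
            j = min((j for j, s in enumerate(subsets) if e in s),
                    key=lambda j: costs[j])
            x[j] = 1
            covered |= subsets[j]
    return x

subsets = [{1, 2}, {2, 3}, {3, 4}, {1, 4}]
print(repair([0, 0, 1, 0], subsets, costs=[1, 3, 2, 2], universe={1, 2, 3, 4}))
# -> [1, 0, 1, 0]
```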

44 The Minimum Set-Covering Problem: GA Stochastic Operators and Test Problems
The GA is used with the one-point, two-point, uniform, and fusion crossover operators.
Fusion crossover: If a = (a_1, a_2, ..., a_n) and b = (b_1, b_2, ..., b_n) are the parent strings with fitness f_a and f_b, then the offspring c = (c_1, c_2, ..., c_n) is determined as follows:
for all i ∈ {1, ..., n}:
  if a_i = b_i then c_i := a_i;
  else c_i := a_i with probability p = f_b / (f_a + f_b), and c_i := b_i with probability 1 − p.
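In Python this is a one-line comprehension (a sketch; the parent strings and fitness values in the usage line are hypothetical). Since set covering is a minimization problem, taking a's gene with probability f_b/(f_a + f_b) favors the lower-cost parent:

```python
import random

def fusion_crossover(a, b, fa, fb):
    # Where the parents agree the child copies the shared gene; where they
    # differ it takes a_i with probability p = fb / (fa + fb), else b_i.
    p = fb / (fa + fb)
    return [ai if ai == bi else (ai if random.random() < p else bi)
            for ai, bi in zip(a, b)]

child = fusion_crossover([1, 0, 1, 1], [1, 1, 0, 1], fa=12.0, fb=20.0)
```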

45 The Minimum Set-Covering Problem GA Stochastic Operators and Test Problems The crossover rate is 0.6 Mutation rate p m = 1/n. Population size λ = 100. Each experiment was run for 5,000 generations. Problem m n Density (%) a.1 a b.1 b c.1 c d.1 d e.1 e Aalto University: Summer

46 The Minimum Set-Covering Problem Experimental Results Prob 1-pt 2-pt unif fusion greedy bt kn a a a a a Aalto University: Summer

47 The Minimum Set-Covering Problem Experimental Results - Continued Prob 1-pt 2-pt unif fusion greedy bt kn b b b b b c c c c c d d d d d e e e e e Aalto University: Summer

48 Conclusions on the MSCP The Minimum Set-Covering Problem Repairing infeasible strings according to the minimum cost principle yields better results - the final values are close to the best known solutions. The repair procedure competes well with the greedy heuristic and yields better solutions in most cases. The choice of the crossover operator has no significant impact on the quality of the results. Our simple greedy heuristic always outperforms the genetic algorithm without repairing. (Table not shown here) Aalto University: Summer

49 Terminal Assignment Problem [KC97]
Problem Instance:
Terminals: l_1, l_2, ..., l_T
Weights: w_1, w_2, ..., w_T
Concentrators: r_1, r_2, ..., r_C
Capacities: p_1, p_2, ..., p_C
Feasible Solution: A vector x = x_1 x_2 ... x_T, where x_i = j means the i-th terminal is assigned to concentrator j. Assign all terminals to concentrators without exceeding the concentrators' capacities.
Objective Function: Z(x) = Σ_{i=1}^T cost_{i,x_i}, where cost_ij is the distance between terminal i and concentrator j.
Optimal Solution: A feasible vector x that yields the smallest possible Z(x).

50 Terminal Assignment Problem [KC97]
Encoding for TA: use non-binary strings x_1 x_2 ... x_T. The value of x_i is the concentrator to which the i-th terminal is assigned.
Example: (figure: ten terminals l_1, ..., l_10 placed on a grid with three concentrators r_1, r_2, r_3)

51 Terminal Assignment Problem [KC97]: Fitness Function
The fitness function is the sum of two terms:
1. the objective function, which calculates the total cost of all connections;
2. a penalty function used to penalize infeasible strings, itself the sum of two terms:
- an offset term: the product of the number of terminals and the maximum distance on the grid;
- a graded term: the product of the sum of the excessive loads of the concentrators and the number of concentrators that are in fact overloaded.

52 Experimental Runs Experimental Runs Ten problem instances of 100 terminals each. Each concentrator has capacity between 15 and 25. The number of concentrators is between 27 and 33. Number of generations: 20,000 Population size: 500 Crossover rate: 0.6 Mutation rates: Between and 0.1 for LibGA and for GeneSyS. Aalto University: Summer

53 Experimental Runs The results for the Genetic Algorithm are the best obtained after 20,000 generations on each problem instance. Probably for 100 1, 100 2, and 100 3, the solutions obtained by LibGA (and Procedure Greedy) are the global optima. Problem Greedy Genetic Algorithms Instances Algorithm GENEsYs LibGA Aalto University: Summer

54 The Edge Coloring Problem [KWS00]
Problem Instance: A graph G = (V, E); V = {v_1, ..., v_n} is the set of vertices, E = {e_1, ..., e_m} the set of edges.
Feasible Solution: A vector y = y_1 y_2 ... y_m, where y_i = j means that the i-th edge e_i is given color j, 1 ≤ j ≤ m. The components of y are essentially the colors assigned to the m edges of G, where:
- two adjacent edges have different colors;
- the colors are denoted by 1, 2, ..., q, where 1 ≤ q ≤ m.

55 The Edge Coloring Problem
Objective function: For a feasible vector y, let P(y) be the number of colors used in the coloring of the edges represented by y. More precisely, P(y) = max{y_i : y_i is a component of y}.
Optimal solution: A feasible vector y that yields the smallest P(y).

56 The Edge Coloring Problem: Applications
Practical applications include scheduling problems, partitioning problems, timetabling problems, and wavelength routing in optical networks.
NP-Hardness: Holyer proved that the edge coloring problem is NP-hard.
Definition: Δ is the maximum vertex degree of G; χ′(G), the chromatic index of G, is the minimum number of colors required to color the edges of G.
Vizing's Theorem: If G is a nonbipartite, simple graph, then χ′(G) = Δ or χ′(G) = Δ + 1.

57 Bipartite Graphs: Edge-Coloring Bipartite Graphs
Journal of Algorithms, February 2000: article by A. Kapoor and R. Rizzi.
SIAM Journal on Computing, 1982: article by Cole and Hopcroft.

58 The Genetic Algorithm The Genetic Algorithm Encoding: Use non-binary strings of length m (number of edges): y 1 y 2... y m. The value of y j represents the color assigned to the j th edge. For example, if a graph has 12 edges then the following string is a possible coloring, where edge e 1 has color number 5, edge e 2 has color number 3, and so on Aalto University: Summer

59 The Genetic Algorithm The GA Operators generational genetic algorithms roulette wheel selection uniform crossover uniform mutation Each string in the population occupies a slot on the roulette wheel of size inversely proportional (since the edge coloring problem is a minimization problem) to the fitness value. Aalto University: Summer

60 The Genetic Algorithm: Fitness Function
The fitness function of y = y_1 y_2 ... y_m is given by:
f(y) = P(y) + s[(Δ + 1) + T(y)]
It is the sum of two terms:
1. the objective function P(y), which calculates the total number of colors used to color the edges of the graph;
2. a penalty term used to penalize infeasible strings, itself the sum of two parts:
- the first part is the maximum vertex degree plus one, i.e., Δ + 1; it is an offset term;
- the second part, T(y), is the number of violated pairs.

61 Grouping Genetic Algorithm [Fal93]: Grouping Problems
Many problems consist of partitioning a given set U of items into a collection of pairwise disjoint subsets U_j under some constraint. In essence, we are asked to distribute the elements of U into small groups U_j, where some distributions (i.e., groupings) are forbidden.
Some examples: the bin packing problem, terminal assignment, graph coloring.
The objective is to minimize a cost function over the set of all possible legal groupings.

62 Grouping Genetic Algorithm The Grouping Genetic Algorithm Encoding To capture the grouping entity of the problem, the simple (standard) chromosome is augmented with a group part. For example, the chromosome of our hypothetical 12-edge graph: A B 7 C D 9 10 E The colors are labeled with letters, the edges with numbers. The first half is the object part (the entire chromosome of the previous section), the second half is the group part. Aalto University: Summer

63 Grouping Genetic Algorithm Fitness Function The fitness function of a string considers the object part of the string only and computes the fitness value as explained before. Aalto University: Summer

64 Grouping Genetic Algorithm Crossover Operator Two parents are randomly chosen from the mating pool and two crossing sites are chosen for each parent: A B 7 C D 9 10 E A B 8 11 C 2 9 D E 4 5 Inject the entire group part between the two crossing sites of the second parent into the first parent at the beginning of its crossing site: A B 8 11 C 2 9 D B 7 C D 9 10 E Aalto University: Summer

65 Grouping Genetic Algorithm Dealing with Redundancy Resolve redundancy by eliminating groups B, C, and D from their old membership (i.e., from the first parent string): A B 8 11 C 2 9 D E Remove duplicate edges (8 and 11) from their old membership (from the first parent string): A 5 B 8 11 C 2 9 D E Use the first fit heuristic to reassign lost edges (6 and 7): assign the edge to the first group that is able to service it, or to a randomly selected group if none is available. Reversing the roles of the two parents allows the construction of a second offspring. Aalto University: Summer

66 Grouping Genetic Algorithms The Mutation Operator Probabilistically remove some group from the chromosome and reassign its edges. Each edge is assigned to the first group that is able to take it. After crossover and mutation are performed on the group part, the object part is rearranged accordingly to reflect the changes that the group part has undergone. Aalto University: Summer

67 Experimental Runs Benchmarks Twenty eight simple graphs with the number of edges between 20 and 2596: Five book graphs from the Stanford GraphBase. Two miles graphs from the Stanford GraphBase (denoted by mls). Nine queen graphs from the Stanford GraphBase (q). Four problem instances based on the Mycielski transformation from the DIMACS collection (mcl). Seven complete graphs (not shown in tables). Fifteen multigraphs generated by repeatedly assigning edges to randomly chosen pairs of vertices until each vertex has a fixed degree. We denote the multigraphs we generated by rexmg10 x, rexmg20 x, and rexmg30 x, where 10, 20 and 30 are the number of vertices and x is the degree and has the values of 10, 20, 30, 40 and 50. Aalto University: Summer

68 Test results for the simple graphs Test results for the GraphBase and DIMACS problem instances Problem Instances Greedy Algorithm LibGA GGA Name Vertices Edges Density Best Freq. Mean Best Freq. Mean anna david homer huck jean gs mls mls q q q q q q q q q mcl mcl mcl mcl Aalto University: Summer

69 Test results for the multigraphs Test results for the multigraph problem instances Problem Instances Greedy Algorithm GGA Name Vertices Edges Density Best Freq. Mean Best Freq. Mean rexmg rexmg rexmg rexmg rexmg rexmg rexmg rexmg rexmg rexmg rexmg rexmg rexmg rexmg rexmg Aalto University: Summer

70 Walsh and Haar Functions Walsh and Haar Functions Walsh Functions are used: as basis for the expansion of fitness functions to compute the average fitness values of schemata to design deceptive functions for the genetic algorithm. Haar Functions. Comparison between Walsh and Haar Functions. Fast Transforms. Aalto University: Summer

71 Walsh Functions
Walsh Transforms:
- Orthogonal, rectangular functions.
- Practical as bases for fitness functions of genetic algorithms.
- Work with ±1 rather than 0 and 1.
Associate with x = x_l x_{l−1} ... x_2 x_1 an auxiliary string y = y_l y_{l−1} ... y_2 y_1 = aux(x_l) aux(x_{l−1}) ... aux(x_2) aux(x_1), where
y_i = aux(x_i) = −1 if x_i = 1, and +1 if x_i = 0.
The Walsh functions over auxiliary string variables form a set of 2^l monomials defined for y = aux(x):
Ψ_j(y) = Π_{i=1}^l y_i^{j_i}, where j = j_l j_{l−1} ... j_1 and j = Σ_{i=1}^l j_i 2^{i−1}.

72 Walsh Functions: Basis
Example: In Ψ_3(y) with l = 3 and j = 3, we have j_3 = 0, j_2 = 1, j_1 = 1, so
Ψ_3(y) = y_3^0 y_2^1 y_1^1 = y_1 y_2.
If x = 101 then:
Ψ_3(aux(101)) = Ψ_3(aux(1) aux(0) aux(1)) = Ψ_3(−1, 1, −1) = (−1)(1) = −1.
{Ψ_j(x) : j = 0, 1, 2, ..., 2^l − 1} forms a basis for the fitness functions defined on [0, 2^l). Consequently, any fitness function defined on binary strings of length l can be represented by a linear combination of discrete Walsh functions.

73 Walsh Functions: Basis (continued)
f(x) = Σ_{j=0}^{2^l−1} w_j Ψ_j(x)
w_j: Walsh coefficients; Ψ_j(x): Walsh monomials.
The Walsh coefficients are given by:
w_j = (1/2^l) Σ_{x=0}^{2^l−1} f(x) Ψ_j(x).
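Both formulas are easy to check numerically. With aux(1) = −1, the monomial Ψ_j(x) is simply −1 raised to the number of bit positions where both j and x have a 1. A sketch with a toy fitness function:

```python
def psi(j, x):
    # Psi_j(x) = product of aux(x_i) over the positions i where j_i = 1,
    # i.e. (-1)^popcount(j & x), using aux(1) = -1 and aux(0) = +1.
    return -1 if bin(j & x).count('1') % 2 else 1

def walsh_coefficients(f, l):
    # w_j = 2^-l * sum_{x=0}^{2^l - 1} f(x) * Psi_j(x)
    N = 2 ** l
    return [sum(f(x) * psi(j, x) for x in range(N)) / N for j in range(N)]

# Toy fitness on 3-bit strings; the expansion reproduces f exactly.
f = lambda x: x * x
w = walsh_coefficients(f, 3)
recon = [sum(w[j] * psi(j, x) for j in range(8)) for x in range(8)]
```

Note that psi(3, 5) evaluates to −1, matching the slide's hand computation of Ψ_3(aux(101)).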

74 Walsh Functions on [0, 8)
index j:     0    1    2     3       4    5       6       7
j_3 j_2 j_1: 000  001  010   011     100  101     110     111
monomial:    1    y_1  y_2   y_1y_2  y_3  y_1y_3  y_2y_3  y_1y_2y_3
Each Ψ_j(x) takes the value ±1 at x = 0, 1, ..., 7.

75 Hadamard Matrices
Lowest-order Hadamard matrix:
H_2 = [ 1  1
        1 −1 ]
Recursive relation (⊗ denotes the Kronecker product, also known as the direct product):
H_N = H_{N/2} ⊗ H_2 = [ H_{N/2}  H_{N/2}
                        H_{N/2} −H_{N/2} ]
H_8 = H_4 ⊗ H_2.
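The recursion can be sketched directly (the Sylvester construction, equivalent to repeatedly taking the Kronecker product with H_2):

```python
def hadamard(n):
    # H_2N = [[H_N, H_N], [H_N, -H_N]], starting from H_1 = [[1]];
    # n must be a power of two.
    H = [[1]]
    while len(H) < n:
        H = [row + row for row in H] + [row + [-v for v in row] for row in H]
    return H

H8 = hadamard(8)  # rows are the 8 Walsh functions on [0, 8), up to ordering
```

The rows of the result are mutually orthogonal, which is exactly what makes the Walsh coefficient formula invertible.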

76 Schema Average Fitness
The average fitness value of a schema can be expressed as a partial signed summation over Walsh coefficients.
A schema h (hyperplane): h_l h_{l−1} ... h_2 h_1, where h_i ∈ {0, 1, *}.
f(h) = (1/|h|) Σ_{x∈h} f(x).
f(h) = (1/|h|) Σ_{x∈h} Σ_{j=0}^{2^l−1} w_j Ψ_j(x) = (1/|h|) Σ_{j=0}^{2^l−1} w_j Σ_{x∈h} Ψ_j(x).
The lower the order of the schema, the fewer the number of terms in the summation.

77 Schema Average Fitness: Schema Fitness Averages
**1: w_0 − w_1
**0: w_0 + w_1
1**: w_0 − w_4
*01: w_0 − w_1 + w_2 − w_3
11*: w_0 − w_2 − w_4 + w_6
001: w_0 − w_1 + w_2 − w_3 + w_4 − w_5 + w_6 − w_7
Fitness averages as partial Walsh sums of some schemata.

78 Easy and Hard Problems for the GA
Goldberg's minimal deceptive functions using Walsh coefficients: two types of minimal deceptive functions.
Fully deceptive 3-bit function: if 111 is the optimal point, then all order-one schemata lead away from it. In other words:
f(**1) < f(**0), f(*1*) < f(*0*), f(1**) < f(0**).
Similarly, for order-two schemata:
f(*00) > f(*01), f(*00) > f(*10), f(*00) > f(*11).

79 Goldberg's ANODE: Analysis of Deception
Goldberg used neighborhood arguments in ANODE (analysis of deception): an algorithm for determining whether, and to what degree, a coding-function combination is deceptive.
The diagnostic is problem-instance dependent.

80 Haar Functions
Haar functions were proposed by the Hungarian mathematician Alfréd Haar. Haar got his PhD in Germany; his thesis was titled "Zur Theorie der orthogonalen Funktionensysteme".
Haar functions take the values 1, 0, and −1, multiplied by powers of √2.

81 Haar Functions
H_0(x) = 1 for 0 ≤ x < 2^l
H_1(x) = 1 for 0 ≤ x < 2^{l−1}; −1 for 2^{l−1} ≤ x < 2^l
H_2(x) = √2 for 0 ≤ x < 2^{l−2}; −√2 for 2^{l−2} ≤ x < 2^{l−1}; 0 elsewhere in [0, 2^l)
H_3(x) = √2 for (2)2^{l−2} ≤ x < (3)2^{l−2}; −√2 for (3)2^{l−2} ≤ x < (4)2^{l−2}; 0 elsewhere in [0, 2^l)
...
H_{2^q+m}(x) = (√2)^q for (2m)2^{l−q−1} ≤ x < (2m+1)2^{l−q−1}; −(√2)^q for (2m+1)2^{l−q−1} ≤ x < (2m+2)2^{l−q−1}; 0 elsewhere in [0, 2^l)
...
H_{2^l−1}(x) = (√2)^{l−1} for 2(2^{l−1}−1) ≤ x < 2^l − 1; −(√2)^{l−1} for 2^l − 1 ≤ x < 2^l; 0 elsewhere in [0, 2^l)
For every value of q = 0, 1, ..., l−1, we have m = 0, 1, ..., 2^q − 1.

82 Haar Functions on [0, 8)
(figure: plots of the eight Haar functions H_0(x), ..., H_7(x) on [0, 8))

83 Haar Functions
Haar functions H_{2^q+m}(x) on [0, 8), for l = 3:
index j:  0      1  2  3  4  5  6  7
degree q: undef  0  1  1  2  2  2  2
order m:  undef  0  0  1  0  1  2  3

84 Haar Functions
The set {H_j(x) : j = 0, 1, 2, ..., 2^l − 1} forms a basis for the fitness functions defined on the integers in [0, 2^l).
f(x) = Σ_{j=0}^{2^l−1} h_j H_j(x),
where the h_j, for j = 2^q + m, are the Haar coefficients given by:
h_j = (1/2^l) Σ_{x=0}^{2^l−1} f(x) H_j(x).
The higher q is, the smaller the subinterval with non-zero values for H_j(x). Each Haar coefficient depends only on the local behavior of f(x).

85 Haar Functions: Haar Coefficients and Haar Functions
Haar Coefficients: Every Haar coefficient of degree q has at most 2^{l−q} non-zero terms. The linear combination for each Haar coefficient h_j, where j = 2^q + m, has at most 2^{l−q} non-zero terms; h_0 has at most 2^l non-zero terms.
Haar Functions: For any fixed value x ∈ [0, 2^l), f(x) has at most l + 1 non-zero terms.

86 Comparison between Walsh and Haar Functions
Total number of terms required for the computation of the expansion of f(1011):
- Walsh: 256 (16 × 16);
- Haar: 46 (16 each for h_0 and h_1; 8 for h_3; 4 for h_6; and 2 terms for h_13).
For l = 20, the Walsh expansion requires far more terms.
w_j depends on the behavior of f(x) at all 2^l points; h_j depends only on the local behavior of f(x) at a few points which are close together.
When used to approximate continuous functions, Walsh expansions might diverge at some points, whereas Haar expansions always converge.

87 Comparison between Walsh and Haar Functions

Computing the number of non-zero terms

[Table: for each of the 16 schemata over l = 4 (****, ***d, **d*, *d**, d***, **dd, *d*d, d**d, *dd*, d*d*, dd**, *ddd, d*dd, dd*d, ddd*, dddd), the maximum number of non-zero terms and the total number of terms required, with Walsh and with Haar functions, together with column totals; the row for d**d is worked out on the next two slides.]

Computing schemata for l = 4 with Walsh and Haar functions.

88 Comparison between Walsh and Haar Functions

Maximum number of non-zero terms for d**d

For example, d**d gives E = {0**0, 0**1, 1**0, 1**1}.

Average fitness of 1**0 with Walsh functions:

f(1**0) = w_0 + w_1 - w_8 - w_9

Since E has 4 schemata, the maximum number of non-zero terms for all schemata represented by d**d is 4 x 4 = 16 (2nd column).

Each Walsh coefficient requires 16 terms for its computation, so the total number of terms required in the computation of the expansion of f(1**0) is 4 x 16 = 64; for all schemata represented by d**d it is 4 x 64 = 256 (4th column).

89 Comparison between Walsh and Haar Functions

Maximum number of non-zero terms for d**d

With Haar functions:

f(1**0) = h_0 - h_1 + (1/sqrt(2))(h_12 + h_13 + h_14 + h_15)

Each schema's expansion has 6 non-zero coefficients, so the third column has 4 x 6 = 24. Two terms are required for the computation of each of h_12, h_13, h_14, and h_15, while 16 are needed for h_0 and 16 for h_1, i.e. 40 terms for f(1**0). Forty terms are likewise required in the computation of the expansion of each of the other 3 schemata in E; therefore, the total is 4 x 40 = 160.

90 Fast Transforms

By taking advantage of the many repetitive computations performed with orthogonal transformations, the analysis can be implemented as fast transforms. These are in-place algorithms and were modelled after the fast Fourier transform.

The fast transforms can be implemented with:
- on the order of at most l * 2^l computations in the case of the fast Walsh transform; and
- on the order of 2^l computations for the fast Haar transform.
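A sketch of a fast Haar transform under the normalization used above (this implementation is ours, not a transcription of the slide's diagram): running block sums are halved at each level, each level emitting the difference coefficients of one degree, for on the order of 2^l additions in total. The direct O(4^l) computation is included only to check the result.

```python
import math, random

def haar(j, x, l):
    # Direct evaluation of H_j on [0, 2**l), as defined earlier.
    if j == 0:
        return 1.0
    q = j.bit_length() - 1
    m = j - 2 ** q
    step = 2 ** (l - q - 1)
    lo = 2 * m * step
    if lo <= x < lo + step:
        return math.sqrt(2) ** q
    if lo + step <= x < lo + 2 * step:
        return -(math.sqrt(2) ** q)
    return 0.0

def fast_haar(f):
    """Fast Haar transform: O(2**l) additions instead of O(4**l).

    Returns h_0 .. h_{2**l - 1} with h_j = (1/2**l) sum_x f(x) H_j(x),
    built from running block sums and differences."""
    n = len(f)
    l = n.bit_length() - 1
    h = [0.0] * n
    s = list(f)                                  # block sums, size 2**t
    for t in range(l):
        q = l - 1 - t                            # degree produced here
        scale = math.sqrt(2) ** q / n
        nxt = []
        for k in range(len(s) // 2):
            h[2 ** q + k] = scale * (s[2 * k] - s[2 * k + 1])
            nxt.append(s[2 * k] + s[2 * k + 1])
        s = nxt
    h[0] = s[0] / n
    return h

l = 3
random.seed(0)
f = [random.uniform(0, 10) for _ in range(2 ** l)]
direct = [sum(f[x] * haar(j, x, l) for x in range(2 ** l)) / 2 ** l
          for j in range(2 ** l)]
assert all(abs(a - b) < 1e-9 for a, b in zip(fast_haar(f), direct))
```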

91 Fast Transforms

Fast Haar Transform [figure]

92 Fast Transforms

The Fast Walsh-Haar Transform [figure]

93 Applications II

- DNA Fragment Assembly Problem.
- Tracking a Criminal Suspect.
- Genetic Programming.

94 Applications II: DNA Fragment Assembly [SM97]

In large-scale DNA sequencing, we are given a collection of many fragments of short DNA sequences. The fragments are approximate substrings of a very long DNA molecule. The Fragment Assembly Problem consists in reconstructing the original sequence from the fragments.

95 The DNA Fragment Assembly Problem

It is impossible to directly sequence contiguous stretches of more than a few hundred bases. On the other hand, there is technology to cut random pieces of a long DNA molecule and to produce enough copies of each piece to sequence it. A typical approach to sequencing long DNA molecules is therefore to sample fragments from them and sequence the fragments. The problem is that these pieces then have to be assembled.

96 The DNA Fragment Assembly Problem

In sequencing DNA, the approximate length of the target sequence is known to within 10 percent. It is impossible to sequence the entire molecule directly. Instead, we obtain a piece of the molecule starting at a random position in one of the strands and sequence it in the canonical direction for a certain length. Each such sequence is called a fragment.

97 The DNA Fragment Assembly Problem

Fragments of a DNA Molecule

Each fragment corresponds to a substring of one of the strands of the target molecule. We do not know:
- which strand the fragment belongs to,
- the position of the fragment relative to the beginning of the strand,
- whether the fragment contains errors.

98 The DNA Fragment Assembly Problem

A large number of fragments is obtained by a sequencing technique known as the shotgun method. The reconstruction of the target molecule's sequence is based on fragment overlap.
- Fragment lengths vary from 200 to 700 bases.
- Target sequences are between 30,000 and 100,000 base pairs.
- The total number of fragments is between 500 and

The problem consists in obtaining the whole sequence of the target DNA molecule; in other words, piecing together a collection of fragments.
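The shotgun sampling step itself is easy to simulate. A toy sketch (entirely ours; the sequence length and fragment lengths are scaled far down from the real 30,000+ bp targets and 200-700 base reads quoted above):

```python
import random

random.seed(42)

# Toy target sequence; real targets and reads are much longer.
target = "".join(random.choice("ACGT") for _ in range(300))

def shotgun(seq, n_fragments, min_len=20, max_len=40):
    """Sample n_fragments random substrings ('reads') from seq."""
    frags = []
    for _ in range(n_fragments):
        length = random.randint(min_len, max_len)
        start = random.randint(0, len(seq) - length)
        frags.append(seq[start:start + length])
    return frags

fragments = shotgun(target, 60)    # roughly 6x coverage on average
coverage = sum(map(len, fragments)) / len(target)
print(f"{len(fragments)} fragments, mean coverage {coverage:.1f}x")
```

Assembling these fragments back into `target` is the hard part; the GA approach on the following slides works on orderings of such a fragment set.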

99 The DNA Fragment Assembly Problem

DNA Fragment Assembly [All99] [figure]

100 Complicating Factors

DNA sequencing is very challenging because:
- Real problem instances are very large.
- Many fragments contain errors: base-call errors, chimeras, vector contamination.
- The orientation of the fragments is frequently unknown, and both strands must be analyzed.
- There might be a lack of coverage.

101 The DNA Fragment Assembly Problem

Modelling the Fragment Assembly Problem

Models of the fragment assembly problem:
- Shortest Common Superstring
- Reconstruction
- Multicontig

None addresses the biological issues completely: they all assume that the fragment collection is free of contamination and chimeras.

102 Encoding for the Fragment Assembly Problem

Given fragments 0, 1, 2, ..., n-1, an individual I = f[0], f[1], ..., f[n-1] is an ordering of the fragments, where f[i] = j denotes that fragment j appears in position i of the ordering.

Since the strings are permutations of the fragments, operators that have traditionally been used with permutation problems, such as the Travelling Salesperson Problem, can be used.
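One such permutation-preserving operator is order crossover (OX), commonly used for TSP-style encodings. A minimal sketch (the function name and the choice of slice are ours; the slides do not fix a particular operator):

```python
import random

def order_crossover(p1, p2, rng=random):
    """Order crossover (OX): the child keeps a random slice of p1 and
    fills the remaining positions with the missing genes in p2's order,
    so the result is always a valid permutation."""
    n = len(p1)
    a, b = sorted(rng.sample(range(n), 2))
    child = [None] * n
    child[a:b + 1] = p1[a:b + 1]                  # copy slice from p1
    kept = set(child[a:b + 1])
    fill = [g for g in p2 if g not in kept]       # p2's order, minus slice
    for i in range(n):
        if child[i] is None:
            child[i] = fill.pop(0)
    return child

rng = random.Random(0)
parent1 = [0, 1, 2, 3, 4, 5, 6, 7]
parent2 = [7, 6, 5, 4, 3, 2, 1, 0]
child = order_crossover(parent1, parent2, rng)
assert sorted(child) == list(range(8))            # still a permutation
```

Ordinary one-point crossover would duplicate and drop fragments; OX is one standard way to keep every offspring a legal ordering.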

103 The DNA Fragment Assembly Problem

Fitness Functions for the Fragment Assembly Problem [PFB95]

Two fitness functions have been tested.

1. Function F_1 gives the overlap strength of directly adjacent fragments in the ordering I:

F_1(I) = sum_(i=0)^(n-2) w_(f[i], f[i+1])

The genetic algorithm attempts to maximize F_1(I), thus favoring layouts that contain adjacent fragments with strong overlaps.

104 The DNA Fragment Assembly Problem

Fitness Functions for the Fragment Assembly Problem

2. Fitness function F_2 attempts to do more than F_1. F_2 considers the overlap strengths among all pairs of fragments, penalizing layouts in which fragments with strong overlap are far apart:

F_2(I) = sum_(i=0)^(n-1) sum_(j=0)^(n-1) |i - j| * w_(f[i], f[j])

The genetic algorithm attempts to minimize F_2(I), which weights each pairwise overlap by the absolute value of the distance between the fragments' positions in the layout.
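Both fitness functions are straightforward to compute from a pairwise overlap matrix. A sketch with a small hypothetical matrix w (the values are made up for illustration; function names are ours):

```python
def f1(order, w):
    """F_1: sum of overlap strengths of adjacent fragments (maximize)."""
    return sum(w[order[i]][order[i + 1]] for i in range(len(order) - 1))

def f2(order, w):
    """F_2: overlap strength weighted by layout distance |i - j| (minimize)."""
    n = len(order)
    return sum(abs(i - j) * w[order[i]][order[j]]
               for i in range(n) for j in range(n))

# Hypothetical symmetric overlap matrix w[i][j] for 4 fragments.
w = [[0, 5, 1, 0],
     [5, 0, 6, 2],
     [1, 6, 0, 7],
     [0, 2, 7, 0]]

layout = [0, 1, 2, 3]
print(f1(layout, w))   # 5 + 6 + 7 = 18
print(f2(layout, w))   # 2 * (5 + 2 + 0 + 6 + 4 + 7) = 48
```

Note the opposite senses: a good layout makes F_1 large (strong adjacent overlaps) and F_2 small (strongly overlapping fragments placed close together).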

105 The DNA Fragment Assembly Problem

Problem Instances

The fragment data sets were artificially generated from actual DNA sequences:

DNA sequence   Accession number   Length in base pairs
HUMATPKOI      M
HUMMHCFIB      X
HUMAPOBF       M

The parent DNA sequence was:
1. replicated by the program,
2. fragmented at random locations to derive a fragment set.

Genetic Algorithms were successful with 10KB parent sequences and a seven-fold base coverage on the average.

106 Other Applications: Tracking a Criminal Suspect [CJ91]

"Tracking a Criminal Suspect Through Face-Space with a Genetic Algorithm" by C. Caldwell and V. Johnston.

107 Other Applications: Tracking a Criminal Suspect [CJ91]

- Humans have excellent facial recognition ability.
- The GA relies on the fact that recognition is stronger than recall.
- The witness rates twenty chosen pictures according to their resemblance to the culprit.
- Twenty more pictures, produced by crossover and mutation, are then shown, and so on.

108 Other Applications: Genetic Programming [Koz92]

The structures undergoing adaptation in Genetic Programming are general, hierarchical computer programs of dynamically varying size and shape. The search space consists of all possible computer programs composed of functions and terminals appropriate to the problem domain.

109 Concluding Remarks

- Simple Genetic Algorithm.
- Grouping Genetic Algorithms.
- Genetic Programming.
- Other evolutionary-based algorithms: Evolution Strategies and Evolutionary Programming.
- Seeding the initial population seems to work well.
- More experimentation with repair mechanisms is needed.
- Hybrid Algorithms and Parallel Genetic Algorithms.

110 Concluding Remarks

- Static nature of the analysis: more dynamic models of genetic algorithms are needed.
- What problems are suitable for Genetic Algorithms? So far there is only a weak classification of problems into hard and easy.
- The study of Genetic Algorithms is still in its embryonic stage.

111 References

1. [All99] C. Allex. Computational Methods for Fast and Accurate DNA Fragment Assembly, PhD thesis, University of Wisconsin-Madison, 1999.
2. [CJ91] C. Caldwell and V. Johnston. Tracking a Criminal Suspect through Face-Space with a Genetic Algorithm, Proceedings of the Fourth International Conference on Genetic Algorithms, San Diego, CA, 1991.
3. [CW93] A. Corcoran and R. Wainwright. LibGA: A User-friendly Workbench for Order-based Genetic Algorithm Research, Proceedings of the 1993 ACM/SIGAPP Symposium on Applied Computing, ACM Press, 1993.
4. [Fal93] E. Falkenauer. The Grouping Genetic Algorithms - Widening the Scope of Genetic Algorithms, Belgian Journal of Operations Research, Statistics and Computer Science, vol. 33, 1993.
5. [Gol89] D. Goldberg. Genetic Algorithms in Search, Optimization, and Machine Learning, Addison-Wesley, NY, 1989.
6. [Hol75] J. Holland. Adaptation in Natural and Artificial Systems, The University of Michigan Press, Ann Arbor, 1975.
7. [KB90] S. Khuri and A. Batarekh. Heuristics for the Integer Knapsack Problem, Proceedings of the Xth International Conference of the Chilean Computer Science Society, Santiago, Chile, 1990.

112 References

8. [KBH94] S. Khuri, T. Bäck and J. Heitkötter. An Evolutionary Approach to Combinatorial Optimization Problems, Proceedings of the 22nd ACM Computer Science Conference, ACM Press, NY, 1994.
9. [KC97] S. Khuri and T. Chiu. Heuristic Algorithms for the Terminal Assignment Problem, Proceedings of the 1997 ACM/SIGAPP Symposium on Applied Computing, ACM Press, 1997.
10. [KWS00] S. Khuri, T. Walters and Y. Sugono. A Grouping Genetic Algorithm for Coloring the Edges of Graphs, Proceedings of the 2000 ACM/SIGAPP Symposium on Applied Computing, ACM Press, 2000.
11. [Koz92] J. Koza. Genetic Programming: On the Programming of Computers by Means of Natural Selection, MIT Press, Cambridge, MA, 1992.
12. [SM97] J. C. Setubal and J. Meidanis. Introduction to Computational Molecular Biology, PWS Publishing Company, Boston, 1997.
13. [PFB95] R. J. Parsons, S. Forrest and C. Burks. Genetic Algorithms, Operators, and DNA Fragment Assembly, Machine Learning, 21(1-2), 1995.
14. [Whi94] D. Whitley. A Genetic Algorithm Tutorial, Statistics and Computing, vol. 4, 1994.

113 More Information

Where to go for More Information

Books and Articles:
1. T. Bäck, D. Fogel, and Z. Michalewicz. Handbook of Evolutionary Computation, Oxford University Press, UK, 1997.
2. D. Fogel. Evolutionary Computation, IEEE Press, NJ, 1995.
3. Z. Michalewicz. Genetic Algorithms + Data Structures = Evolution Programs, Springer-Verlag, NY, 3rd ed., 1996.
4. M. Mitchell. An Introduction to Genetic Algorithms, The MIT Press, Cambridge, MA, 1996.
5. C. Prince, R. Wainwright, D. Schoenfield, and T. Tull. GATutor: A Graphical Tutorial System for Genetic Algorithms, SIGCSE Bulletin, vol. 26, ACM Press, 1994.
6. H.-P. Schwefel. Evolution and Optimum Seeking, John Wiley, NY, 1995.


More information

Open Problems in Throughput Scheduling

Open Problems in Throughput Scheduling Open Problems in Throughput Scheduling Jiří Sgall Computer Science Institute of Charles University, Faculty of Mathematics and Physics, Malostranské nám. 25, CZ-11800 Praha 1, Czech Republic. sgall@iuuk.mff.cuni.cz

More information

Parallel Genetic Algorithms

Parallel Genetic Algorithms Parallel Genetic Algorithms for the Calibration of Financial Models Riccardo Gismondi June 13, 2008 High Performance Computing in Finance and Insurance Research Institute for Computational Methods Vienna

More information

1. Computação Evolutiva

1. Computação Evolutiva 1. Computação Evolutiva Renato Tinós Departamento de Computação e Matemática Fac. de Filosofia, Ciência e Letras de Ribeirão Preto Programa de Pós-Graduação Em Computação Aplicada 1.6. Aspectos Teóricos*

More information

Introduction to integer programming III:

Introduction to integer programming III: Introduction to integer programming III: Network Flow, Interval Scheduling, and Vehicle Routing Problems Martin Branda Charles University in Prague Faculty of Mathematics and Physics Department of Probability

More information

A New Approach to Estimating the Expected First Hitting Time of Evolutionary Algorithms

A New Approach to Estimating the Expected First Hitting Time of Evolutionary Algorithms A New Approach to Estimating the Expected First Hitting Time of Evolutionary Algorithms Yang Yu and Zhi-Hua Zhou National Laboratory for Novel Software Technology Nanjing University, Nanjing 20093, China

More information

Question Paper Code :

Question Paper Code : www.vidyarthiplus.com Reg. No. : B.E./B.Tech. DEGREE EXAMINATION, NOVEMBER/DECEMBER 2011. Time : Three hours Fourth Semester Computer Science and Engineering CS 2251 DESIGN AND ANALYSIS OF ALGORITHMS (Regulation

More information

On the Impact of Objective Function Transformations on Evolutionary and Black-Box Algorithms

On the Impact of Objective Function Transformations on Evolutionary and Black-Box Algorithms On the Impact of Objective Function Transformations on Evolutionary and Black-Box Algorithms [Extended Abstract] Tobias Storch Department of Computer Science 2, University of Dortmund, 44221 Dortmund,

More information

Preliminaries and Complexity Theory

Preliminaries and Complexity Theory Preliminaries and Complexity Theory Oleksandr Romanko CAS 746 - Advanced Topics in Combinatorial Optimization McMaster University, January 16, 2006 Introduction Book structure: 2 Part I Linear Algebra

More information

Combinatorial optimization problems

Combinatorial optimization problems Combinatorial optimization problems Heuristic Algorithms Giovanni Righini University of Milan Department of Computer Science (Crema) Optimization In general an optimization problem can be formulated as:

More information

1 Basic Definitions. 2 Proof By Contradiction. 3 Exchange Argument

1 Basic Definitions. 2 Proof By Contradiction. 3 Exchange Argument 1 Basic Definitions A Problem is a relation from input to acceptable output. For example, INPUT: A list of integers x 1,..., x n OUTPUT: One of the three smallest numbers in the list An algorithm A solves

More information

Totally unimodular matrices. Introduction to integer programming III: Network Flow, Interval Scheduling, and Vehicle Routing Problems

Totally unimodular matrices. Introduction to integer programming III: Network Flow, Interval Scheduling, and Vehicle Routing Problems Totally unimodular matrices Introduction to integer programming III: Network Flow, Interval Scheduling, and Vehicle Routing Problems Martin Branda Charles University in Prague Faculty of Mathematics and

More information

Proceedings of the Seventh Oklahoma Conference on Articial Intelligence, pp , November, Objective Function

Proceedings of the Seventh Oklahoma Conference on Articial Intelligence, pp , November, Objective Function Proceedings of the Seventh Oklahoma Conference on Articial Intelligence, pp. 200-206, November, 1993. The Performance of a Genetic Algorithm on a Chaotic Objective Function Arthur L. Corcoran corcoran@penguin.mcs.utulsa.edu

More information

CS 331: Artificial Intelligence Local Search 1. Tough real-world problems

CS 331: Artificial Intelligence Local Search 1. Tough real-world problems S 331: rtificial Intelligence Local Search 1 1 Tough real-world problems Suppose you had to solve VLSI layout problems (minimize distance between components, unused space, etc.) Or schedule airlines Or

More information

State of the art in genetic algorithms' research

State of the art in genetic algorithms' research State of the art in genetic algorithms' Genetic Algorithms research Prabhas Chongstitvatana Department of computer engineering Chulalongkorn university A class of probabilistic search, inspired by natural

More information

Introduction. Genetic Algorithm Theory. Overview of tutorial. The Simple Genetic Algorithm. Jonathan E. Rowe

Introduction. Genetic Algorithm Theory. Overview of tutorial. The Simple Genetic Algorithm. Jonathan E. Rowe Introduction Genetic Algorithm Theory Jonathan E. Rowe University of Birmingham, UK GECCO 2012 The theory of genetic algorithms is beginning to come together into a coherent framework. However, there are

More information

The Chromatic Number of Ordered Graphs With Constrained Conflict Graphs

The Chromatic Number of Ordered Graphs With Constrained Conflict Graphs The Chromatic Number of Ordered Graphs With Constrained Conflict Graphs Maria Axenovich and Jonathan Rollin and Torsten Ueckerdt September 3, 016 Abstract An ordered graph G is a graph whose vertex set

More information

CS6999 Probabilistic Methods in Integer Programming Randomized Rounding Andrew D. Smith April 2003

CS6999 Probabilistic Methods in Integer Programming Randomized Rounding Andrew D. Smith April 2003 CS6999 Probabilistic Methods in Integer Programming Randomized Rounding April 2003 Overview 2 Background Randomized Rounding Handling Feasibility Derandomization Advanced Techniques Integer Programming

More information

The minimum G c cut problem

The minimum G c cut problem The minimum G c cut problem Abstract In this paper we define and study the G c -cut problem. Given a complete undirected graph G = (V ; E) with V = n, edge weighted by w(v i, v j ) 0 and an undirected

More information

CSE 421 Weighted Interval Scheduling, Knapsack, RNA Secondary Structure

CSE 421 Weighted Interval Scheduling, Knapsack, RNA Secondary Structure CSE Weighted Interval Scheduling, Knapsack, RNA Secondary Structure Shayan Oveis haran Weighted Interval Scheduling Interval Scheduling Job j starts at s(j) and finishes at f j and has weight w j Two jobs

More information

On Two Class-Constrained Versions of the Multiple Knapsack Problem

On Two Class-Constrained Versions of the Multiple Knapsack Problem On Two Class-Constrained Versions of the Multiple Knapsack Problem Hadas Shachnai Tami Tamir Department of Computer Science The Technion, Haifa 32000, Israel Abstract We study two variants of the classic

More information

Exploration of population fixed-points versus mutation rates for functions of unitation

Exploration of population fixed-points versus mutation rates for functions of unitation Exploration of population fixed-points versus mutation rates for functions of unitation J Neal Richter 1, Alden Wright 2, John Paxton 1 1 Computer Science Department, Montana State University, 357 EPS,

More information

DETECTING THE FAULT FROM SPECTROGRAMS BY USING GENETIC ALGORITHM TECHNIQUES

DETECTING THE FAULT FROM SPECTROGRAMS BY USING GENETIC ALGORITHM TECHNIQUES DETECTING THE FAULT FROM SPECTROGRAMS BY USING GENETIC ALGORITHM TECHNIQUES Amin A. E. 1, El-Geheni A. S. 2, and El-Hawary I. A **. El-Beali R. A. 3 1 Mansoura University, Textile Department 2 Prof. Dr.

More information

An artificial chemical reaction optimization algorithm for. multiple-choice; knapsack problem.

An artificial chemical reaction optimization algorithm for. multiple-choice; knapsack problem. An artificial chemical reaction optimization algorithm for multiple-choice knapsack problem Tung Khac Truong 1,2, Kenli Li 1, Yuming Xu 1, Aijia Ouyang 1, and Xiaoyong Tang 1 1 College of Information Science

More information

Development. biologically-inspired computing. lecture 16. Informatics luis rocha x x x. Syntactic Operations. biologically Inspired computing

Development. biologically-inspired computing. lecture 16. Informatics luis rocha x x x. Syntactic Operations. biologically Inspired computing lecture 16 -inspired S S2 n p!!! 1 S Syntactic Operations al Code:N Development x x x 1 2 n p S Sections I485/H400 course outlook Assignments: 35% Students will complete 4/5 assignments based on algorithms

More information

A GREEDY APPROXIMATION ALGORITHM FOR CONSTRUCTING SHORTEST COMMON SUPERSTRINGS *

A GREEDY APPROXIMATION ALGORITHM FOR CONSTRUCTING SHORTEST COMMON SUPERSTRINGS * A GREEDY APPROXIMATION ALGORITHM FOR CONSTRUCTING SHORTEST COMMON SUPERSTRINGS * 1 Jorma Tarhio and Esko Ukkonen Department of Computer Science, University of Helsinki Tukholmankatu 2, SF-00250 Helsinki,

More information

Bounded Approximation Algorithms

Bounded Approximation Algorithms Bounded Approximation Algorithms Sometimes we can handle NP problems with polynomial time algorithms which are guaranteed to return a solution within some specific bound of the optimal solution within

More information

Runtime Analysis of Evolutionary Algorithms for the Knapsack Problem with Favorably Correlated Weights

Runtime Analysis of Evolutionary Algorithms for the Knapsack Problem with Favorably Correlated Weights Runtime Analysis of Evolutionary Algorithms for the Knapsack Problem with Favorably Correlated Weights Frank Neumann 1 and Andrew M. Sutton 2 1 Optimisation and Logistics, School of Computer Science, The

More information

CSE 202 Dynamic Programming II

CSE 202 Dynamic Programming II CSE 202 Dynamic Programming II Chapter 6 Dynamic Programming Slides by Kevin Wayne. Copyright 2005 Pearson-Addison Wesley. All rights reserved. 1 Algorithmic Paradigms Greed. Build up a solution incrementally,

More information

Using Evolutionary Techniques to Hunt for Snakes and Coils

Using Evolutionary Techniques to Hunt for Snakes and Coils Using Evolutionary Techniques to Hunt for Snakes and Coils Abstract The snake-in-the-box problem is a difficult problem in mathematics and computer science that deals with finding the longest-possible

More information

Sorting Network Development Using Cellular Automata

Sorting Network Development Using Cellular Automata Sorting Network Development Using Cellular Automata Michal Bidlo, Zdenek Vasicek, and Karel Slany Brno University of Technology, Faculty of Information Technology Božetěchova 2, 61266 Brno, Czech republic

More information

Evolving Presentations of Genetic Information: Motivation, Methods, and Analysis

Evolving Presentations of Genetic Information: Motivation, Methods, and Analysis Evolving Presentations of Genetic Information: Motivation, Methods, and Analysis Peter Lee Stanford University PO Box 14832 Stanford, CA 94309-4832 (650)497-6826 peterwlee@stanford.edu June 5, 2002 Abstract

More information

Chapter 11. Approximation Algorithms. Slides by Kevin Wayne Pearson-Addison Wesley. All rights reserved.

Chapter 11. Approximation Algorithms. Slides by Kevin Wayne Pearson-Addison Wesley. All rights reserved. Chapter 11 Approximation Algorithms Slides by Kevin Wayne. Copyright @ 2005 Pearson-Addison Wesley. All rights reserved. 1 Approximation Algorithms Q. Suppose I need to solve an NP-hard problem. What should

More information

Polynomial Approximation of Survival Probabilities Under Multi-point Crossover

Polynomial Approximation of Survival Probabilities Under Multi-point Crossover Polynomial Approximation of Survival Probabilities Under Multi-point Crossover Sung-Soon Choi and Byung-Ro Moon School of Computer Science and Engineering, Seoul National University, Seoul, 151-74 Korea

More information

Gene Pool Recombination in Genetic Algorithms

Gene Pool Recombination in Genetic Algorithms Gene Pool Recombination in Genetic Algorithms Heinz Mühlenbein GMD 53754 St. Augustin Germany muehlenbein@gmd.de Hans-Michael Voigt T.U. Berlin 13355 Berlin Germany voigt@fb10.tu-berlin.de Abstract: A

More information

Fall 2003 BMI 226 / CS 426 LIMITED VALIDITY STRUCTURES

Fall 2003 BMI 226 / CS 426 LIMITED VALIDITY STRUCTURES Notes III-1 LIMITED VALIDITY STRUCTURES Notes III-2 TRAVELING SALESPERSON PROBLEM (TSP) Given a (symmetric) matrix of distances between N cities Salesperson is to visit each city once and only once Goal

More information