Evolutionary Algorithms and Other Search Heuristics. Bioinspired Computation in Combinatorial Optimization


Bioinspired Computation in Combinatorial Optimization: Algorithms and Their Computational Complexity
Frank Neumann (The University of Adelaide, cs.adelaide.edu.au/~frank) and Carsten Witt (Technical University of Denmark)
Tutorial at GECCO 2013. Copyright is held by the author/owner(s). GECCO'13 Companion, July 6-10, 2013, Amsterdam, The Netherlands. The book is available (see references).

Evolutionary Algorithms and Other Search Heuristics
Most famous search heuristic: Evolutionary Algorithms (EAs), a bio-inspired heuristic. Paradigm: evolution in nature, survival of the fittest. Actually it is only an algorithm, a randomized search heuristic (RSH). Scheme: Initialization → Selection → Variation → Selection → Stop?
Goal: optimization. Here: discrete search spaces, combinatorial optimization, in particular pseudo-Boolean functions. Optimize f : {0,1}^n → R.

Why Do We Consider Randomized Search Heuristics?
Not enough resources (time, money, knowledge) for a tailored algorithm. The black-box scenario (the algorithm only sees pairs x, f(x)) rules out problem-specific algorithms. We like the simplicity, robustness, ... of randomized search heuristics, and they are surprisingly successful. We want a solid theory to understand how (and when) they work.

What RSHs Do We Consider?
Theoretically considered RSHs: (1+1) EA; (1+λ) EA (offspring population); (μ+1) EA (parent population); (μ+1) GA (parent population and crossover); SEMO, DEMO, FEMO, ... (multi-objective); Randomized Local Search (RLS); Metropolis Algorithm/Simulated Annealing (MA/SA); Ant Colony Optimization (ACO); Particle Swarm Optimization (PSO); ... First of all: define the simple ones.

The Most Basic RSHs
(1+1) EA and RLS for maximization problems.
(1+1) EA:
1. Choose x_0 ∈ {0,1}^n uniformly at random.
2. For t := 0, 1, 2, ...:
   a. Create y by flipping each bit of x_t independently with probability 1/n.
   b. If f(y) ≥ f(x_t), set x_{t+1} := y, else x_{t+1} := x_t.
RLS:
1. Choose x_0 ∈ {0,1}^n uniformly at random.
2. For t := 0, 1, 2, ...:
   a. Create y by flipping one uniformly chosen bit of x_t.
   b. If f(y) ≥ f(x_t), set x_{t+1} := y, else x_{t+1} := x_t.

What Kind of Theory Are We Interested in?
Not studied here: convergence, local progress, models of EAs (e.g., infinite populations), ... Instead, treat RSHs as randomized algorithms! Analyze their runtime (computational complexity) on selected problems.
Definition: let RSH A optimize f. Each f-evaluation is counted as a time step. The runtime T_{A,f} of A is the random first point of time such that A has sampled an optimal search point. Often considered: expected runtime, distribution of T_{A,f}; asymptotic results with respect to n.

How Do We Obtain Results?
We use (rarely in their pure form): the Coupon Collector's Theorem; concentration inequalities (Markov, Chebyshev, Chernoff, Hoeffding bounds); Markov chain theory (waiting times, first hitting times, rapidly mixing Markov chains); random walks (Gambler's Ruin, drift analysis, martingale theory, electrical networks); random graphs (especially random trees); identifying typical events and failure events; potential functions and amortized analysis; ... We adapt tools from the analysis of randomized algorithms; understanding the stochastic process is often the hardest task.

Early Results
Analysis of RSHs already in the 1980s: Sasaki/Hajek (1988): SA and maximum matchings; Sorkin (1991): SA vs. MA; Jerrum (1992): SA and cliques; Jerrum/Sorkin (1993, 1998): SA/MA for graph bisection; ... High-quality results, but limited to SA/MA (nothing about EAs) and hard to generalize. Since the early 1990s: a systematic approach for the analysis of RSHs, building up a completely new research area.
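The two algorithms above translate almost line for line into code. A minimal sketch in Python (the function names and the finite step budget are illustrative additions; as defined on the slide, the loops run forever):

```python
import random

def one_plus_one_ea(f, n, steps, seed=0):
    """(1+1) EA: flip each bit independently with probability 1/n;
    accept y if f(y) >= f(x) (maximization)."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    for _ in range(steps):
        y = [b ^ (rng.random() < 1 / n) for b in x]
        if f(y) >= f(x):
            x = y
    return x

def rls(f, n, steps, seed=0):
    """RLS: flip exactly one uniformly chosen bit per iteration."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    for _ in range(steps):
        y = list(x)
        y[rng.randrange(n)] ^= 1
        if f(y) >= f(x):
            x = y
    return x
```

With f = sum, both heuristics maximize OneMax and reach the all-ones string quickly for moderate n.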

This Tutorial
1. The origins: example functions and toy problems — a simple toy problem: OneMax for the (1+1) EA
2. Combinatorial optimization problems: minimum spanning trees, maximum matchings, shortest paths, makespan scheduling, covering problems, traveling salesman problem
3. End
4. References

How the Systematic Research Began: Toy Problems
Simple example functions (test functions):
OneMax(x_1, ..., x_n) = x_1 + ... + x_n
LeadingOnes(x_1, ..., x_n) = Σ_{i=1}^n Π_{j=1}^i x_j
BinVal(x_1, ..., x_n) = Σ_{i=1}^n 2^{n−i} x_i
polynomials of fixed degree
Goal: derive first runtime bounds and methods. Artificially designed functions with sometimes really horrible definitions, but for the first time these allow rigorous statements. Goal: prove benefits and harm of RSH components, e.g., crossover, mutation strength, population size, ...

Agenda

Example: OneMax
Theorem (e.g., Droste/Jansen/Wegener, 1998): the expected runtime of RLS, the (1+1) EA, the (μ+1) EA and the (1+λ) EA on OneMax is Ω(n log n). Proof by modifications of the Coupon Collector's Theorem.
Theorem (e.g., Mühlenbein, 1992): the expected runtime of RLS and the (1+1) EA on OneMax is O(n log n). This also holds for the population-based (μ+1) EA and for the (1+λ) EA with small populations.

Proof of the O(n log n) Bound
Fitness levels: L_i := {x ∈ {0,1}^n | OneMax(x) = i}. The (1+1) EA never decreases its current fitness level. From level i it reaches some higher level with probability at least (n−i) · (1/n) · (1 − 1/n)^{n−1} ≥ (n−i)/(en): choose one of the n−i 0-bits, flip this bit, keep all other bits. Hence the expected time to reach a higher level is at most en/(n−i), and the expected runtime is at most Σ_{i=0}^{n−1} en/(n−i) = O(n log n).

Later Results Using Toy Problems
Find the theoretically optimal mutation strength (1/n for OneMax!). Bound the optimization time for linear functions (O(n log n)). Optimal population size (often 1!). Crossover vs. no crossover (Real Royal Road functions). Multistarts vs. populations. Frequent restarts vs. long runs. Dynamic schedules. ...

RSHs for Combinatorial Optimization
Analysis of runtime and approximation quality on well-known combinatorial optimization problems, e.g., sorting problems (is this an optimization problem?), covering problems, cutting problems, subsequence problems, traveling salesman problem, Eulerian cycles, minimum spanning trees, maximum matchings, scheduling problems, shortest paths, ... We do not hope to be better than the best problem-specific algorithms. Instead: maybe reasonable polynomial running times. In the following: no fine-tuning of the results.

Agenda
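The final sum equals e·n·H_n, where H_n is the n-th harmonic number. A quick numerical check of the fitness-level bound (an illustrative addition, not part of the slides):

```python
import math

def fitness_level_bound(n):
    """Sum of the expected waiting times en/(n - i) over the fitness
    levels i = 0, ..., n-1: an upper bound on the expected runtime
    of the (1+1) EA on OneMax."""
    return sum(math.e * n / (n - i) for i in range(n))

# The sum equals e*n*H_n, hence at most e*n*(ln n + 1) = O(n log n).
for n in (10, 100, 1000):
    assert fitness_level_bound(n) <= math.e * n * (math.log(n) + 1)
```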

Minimum Spanning Trees
Given: an undirected connected graph G = (V, E) with n vertices and m edges with positive integer weights.
Find: an edge set E' ⊆ E with minimal weight connecting all vertices.
Search space {0,1}^m: edge e_i is chosen iff x_i = 1. Consider the (1+1) EA.

Fitness function: first decrease the number of connected components, then find a minimum spanning tree. f(s) := (c(s), w(s)), where c(s) is the number of connected components and w(s) the total weight of the chosen edges; minimize f with respect to the lexicographic order.

First goal: obtain a connected subgraph of G. How long does it take? A connected graph is reached in expected time O(m log n) (fitness-based partitions).

Bijection for minimum spanning trees: let T* be a minimum spanning tree, T the current spanning tree, and k := |E(T*) \ E(T)|. There is a bijection α : E(T*) \ E(T) → E(T) \ E(T*) such that α(e_i) lies on the cycle of E(T) ∪ {e_i} and w(e_i) ≤ w(α(e_i)). This yields k accepted 2-bit flips that turn T into T*.
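A sketch of this (1+1) EA with the lexicographic fitness f(s) = (c(s), w(s)); the union-find helper, function names, and iteration cap are illustrative additions:

```python
import random

def components(n, edges, x):
    """Number of connected components of the subgraph selected by
    bit string x, computed with union-find; edges are (u, v, w)."""
    parent = list(range(n))
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v
    comps = n
    for bit, (u, v, _) in zip(x, edges):
        if bit:
            ru, rv = find(u), find(v)
            if ru != rv:
                parent[ru] = rv
                comps -= 1
    return comps

def mst_ea(n, edges, steps, seed=0):
    """(1+1) EA minimizing f(s) = (c(s), w(s)) lexicographically."""
    rng = random.Random(seed)
    m = len(edges)
    f = lambda x: (components(n, edges, x),
                   sum(w for bit, (_, _, w) in zip(x, edges) if bit))
    x = [rng.randint(0, 1) for _ in range(m)]
    for _ in range(steps):
        y = [b ^ (rng.random() < 1 / m) for b in x]
        if f(y) <= f(x):
            x = y
    return x, f(x)
```

Since tuples compare lexicographically in Python, the acceptance test directly implements the two-criteria fitness.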

Upper Bound
Theorem: the expected time until the (1+1) EA constructs a minimum spanning tree is bounded by O(m²(log n + log w_max)).
Sketch of proof: let w(s) be the weight of the current solution s and w_opt the weight of a minimum spanning tree T*. There is a set of m+1 operations reaching T*: m' := m − (n−1) 1-bit flips concerning non-tree edges, k 2-bit flips defined by the bijection for the current spanning tree T, and non-accepted 2-bit flips for the remaining edge pairs. The average distance decrease per operation is at least (w(s) − w_opt)/(m+1).

Proof: call a step a 1-step if the total weight decrease of the 1-bit flips is larger, and a 2-step if the total weight decrease of the 2-bit flips is larger.
Consider 2-steps: expected weight decrease by a factor 1 − 1/(2n); probability Ω(n/m²) for a good 2-bit flip; expected time until q 2-steps: O(qm²/n).
Consider 1-steps: expected weight decrease by a factor 1 − 1/(2m'); probability Ω(m'/m) for a good 1-bit flip; expected time until q 1-steps: O(qm/m'). 1-steps are faster, so it suffices to show the bound for 2-steps.

Expected Multiplicative Distance Decrease (aka Drift Analysis)
Suppose there are r accepted operations that turn s into s_opt, and let D := f(s) − f(s_opt) ≤ d_max. After one step the expected distance is at most (1 − 1/r)·D; after t steps it is at most (1 − 1/r)^t·D. Since fitness values are integers, t = O(r log D) steps suffice to reach the optimum.

Applied to the MST: the maximum distance is w(s) − w_opt ≤ D := m·w_max. One 2-step: expected distance at most (1 − 1/(2n))(w(s) − w_opt); after t 2-steps: at most (1 − 1/(2n))^t (w(s) − w_opt). For t := 2(ln 2)·n·(log D + 1) this is at most 1/2. Expected number of 2-steps: 2t = O(n(log n + log w_max)) (by Markov's inequality). Expected optimization time: O(tm²/n) = O(m²(log n + log w_max)).

Agenda

Maximum Matchings
A matching in an undirected graph is a subset of pairwise disjoint edges; the aim is to find a maximum matching (solvable in poly-time). Simple example: a path of odd length; a maximum matching contains more than half of the edges, a suboptimal matching fewer.
Concept: augmenting path — a path alternating between edges outside and inside the matching, starting and ending at free nodes not incident on matching edges. Flipping all choices along the path improves the matching. Example: the whole path graph is an augmenting path.
Fitness function f : {0,1}^{#edges} → R: one bit for each edge, value 1 iff the edge is chosen. Value for legal matchings: size of the matching; otherwise a penalty leading back to the empty matching.

Maximum Matchings: Upper Bound
Example: path with n+1 nodes and n edges; the bit string selects edges.
Theorem: the expected time until the (1+1) EA finds a maximum matching on a path of n edges is O(n⁴).
Interesting: how simple EAs find augmenting paths.

Maximum Matchings: Upper Bound (Ctnd.)
Proof idea for the O(n⁴) bound: consider the level of second-best matchings. The fitness value does not change (walk on a plateau). If there is a free edge: chance to flip one bit, probability Ω(1/n). Otherwise: steps flipping two bits, probability Ω(1/n²), shorten or lengthen the augmenting path. At length 1, there is a chance to flip the free edge!

Fair Random Walk
Scenario: initially, players A and B both have n/2 USD. Repeat: flip a coin; if heads, A pays 1 USD to B, if tails the other way round; stop when one of the players is ruined. The length of the augmenting path changes according to a fair random walk on {0, ..., n}: equal probability for lengthenings and shortenings. How long does the game take in expectation? Theorem: a fair random walk on {0, ..., n} takes O(n²) steps in expectation.
Hence an expected number of O(n²) two-bit flips suffices, giving expected optimization time O(n²) · O(n²) = O(n⁴).

Maximum Matchings: Lower Bound
Worst-case graph G_{h,ℓ} with h ≥ 3 rows and path length ℓ = 2ℓ' + 1. Here an augmenting path can get shorter, but it is more likely to get longer (an unfair random walk).
Theorem: for h ≥ 3, the (1+1) EA has exponential expected optimization time 2^{Ω(ℓ)} on G_{h,ℓ}. The proof requires an analysis of negative drift (simplified drift theorem).
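The gambler's-ruin game is easy to simulate. Starting from the midpoint n/2, the classical formula gives expected duration (n/2)·(n/2) = n²/4, consistent with the O(n²) theorem (the parameters below are illustrative):

```python
import random

def ruin_time(n, rng):
    """Steps until a fair random walk started at n/2 hits 0 or n."""
    pos, steps = n // 2, 0
    while 0 < pos < n:
        pos += 1 if rng.random() < 0.5 else -1
        steps += 1
    return steps

rng = random.Random(0)
n, trials = 20, 2000
avg = sum(ruin_time(n, rng) for _ in range(trials)) / trials
# Theory: expected duration from the midpoint is (n/2)**2 = 100.
```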

Maximum Matchings: Approximations
Insight: do not hope for exact solutions but for approximations. For maximization problems: a solution with value a is called a (1+ε)-approximation if OPT/a ≤ 1+ε, where OPT is the optimal value.
Theorem: for ε > 0, the (1+1) EA finds a (1+ε)-approximation of a maximum matching in expected time O(m^{2/ε+2}), where m is the number of edges.
Proof idea: if the current solution is worse than (1+ε)-approximate, there is a short augmenting path (length ≤ 2/ε + 1); flip it in one go.

Agenda

All-Pairs Shortest Path (APSP) Problem
Given: a connected directed graph G = (V, E) with |V| = n and |E| = m, and a function w : E → N assigning positive integer weights to the edges. Compute from each vertex v_i ∈ V a shortest path (path of minimal weight) to every other vertex v_j ∈ V \ {v_i}.
Representation: individuals are paths between two particular vertices v_i and v_j. Initial population: P := {I_{u,v} = (u, v) | (u, v) ∈ E}.

Mutation: pick an individual I_{u,v} uniformly at random. Let E⁻(u) be the incoming edges of u and E⁺(v) the outgoing edges of v. Pick uniformly at random an edge e = (x, y) ∈ E⁻(u) ∪ E⁺(v) and add e, which yields a new individual I_{s,t}.

Steady State EA (mutation-based):
1. Set P = {I_{u,v} = (u, v) | (u, v) ∈ E}.
2. Choose an individual I_{x,y} ∈ P uniformly at random.
3. Mutate I_{x,y} to obtain an individual I_{s,t}.
4. If there is no individual I'_{s,t} ∈ P, set P = P ∪ {I_{s,t}}; else if f(I_{s,t}) ≤ f(I'_{s,t}), set P = (P ∪ {I_{s,t}}) \ {I'_{s,t}}.
5. Repeat steps 2-4 forever.

Lemma: let l ≥ log n. The expected time until the Steady State EA has found all shortest paths with at most l edges is O(n³·l).
Proof idea: consider two vertices u and v, u ≠ v. Let γ := (v_1 = u, v_2, ..., v_{l'+1} = v) be a shortest path from u to v consisting of l' ≤ l edges. Each sub-path γ' = (v_1 = u, v_2, ..., v_j) is a shortest path from u to v_j. The population size is upper bounded by n² (for each pair of vertices at most one path). Pick the shortest path from u to v_j and append the edge (v_j, v_{j+1}) to obtain a shortest path from u to v_{j+1}. The probability to pick I_{u,v_j} is at least 1/n², and the probability to append the right edge is at least 1/(2n), so a success happens with probability at least p = 1/(2n³). At most l successes are needed to obtain the shortest path from u to v.
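A compact sketch of the steady-state mutation EA. The graph, step budget, and helper names are illustrative; for simplicity this sketch keeps only simple paths, which is harmless since paths with repeated vertices are never shortest:

```python
import random

def apsp_ea(edges, steps, seed=0):
    """Steady-state EA for APSP. `edges` maps (u, v) -> weight;
    the population keeps at most one path per (start, end) pair."""
    rng = random.Random(seed)
    weight = lambda p: sum(edges[e] for e in zip(p, p[1:]))
    pop = {(u, v): [u, v] for (u, v) in edges}   # initial population
    for _ in range(steps):
        u, v = rng.choice(list(pop))
        path = pop[(u, v)]
        # Mutation: prepend an incoming edge of u or append an
        # outgoing edge of v.
        cands = ([e for e in edges if e[1] == u] +
                 [e for e in edges if e[0] == v])
        if not cands:
            continue
        x, y = rng.choice(cands)
        new = [x] + path if y == u else path + [y]
        s, t = new[0], new[-1]
        if len(set(new)) == len(new):            # simple paths only
            if (s, t) not in pop or weight(new) <= weight(pop[(s, t)]):
                pop[(s, t)] = new
    return pop
```

On a small test graph the population converges to the all-pairs shortest paths reachable by this mutation.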

Analysis
Consider a typical run consisting of T = c·n³·l steps. What is the probability that the shortest path from u to v has been obtained? We need at most l successes, and a success happens in each step with probability at least p = 1/(2n³). Define for each step i a random variable X_i with X_i = 1 if step i is a success and X_i = 0 otherwise; Prob(X_i = 1) ≥ p = 1/(2n³). This holds for any phase of T steps.
Let X = Σ_{i=1}^T X_i. Is X ≥ l? The expected number of successes is E(X) ≥ T/(2n³) = c·n³·l/(2n³) = cl/2.
Chernoff: Prob(X < (1−δ)E(X)) ≤ e^{−E(X)δ²/2}. With δ = 1/2: Prob(X < E(X)/2) ≤ e^{−E(X)/8} ≤ e^{−T/(16n³)} = e^{−cl/16}.
The probability of a failure for at least one pair of vertices is at most n²·e^{−cl/16}. For c large enough and l ≥ log n, there is no failure for any path with probability at least α = 1 − n²e^{−cl/16} = 1 − o(1). The expected time is upper bounded by T/α = O(n³·l).

Shortest paths have at most n−1 edges, so set l = n−1.
Theorem: the expected optimization time of the Steady State EA for the APSP problem is O(n⁴).
Remark: there are instances where the expected optimization time of the (μ+1) EA is Ω(n⁴).

Crossover
Pick two individuals I_{u,v} and I_{s,t} from the population uniformly at random. If v = s, concatenating them yields the individual I_{u,t}.
Question: can crossover help to achieve a better expected optimization time?
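The crossover operator simply concatenates two paths when the end of the first matches the start of the second; a minimal sketch (the helper name is illustrative):

```python
def crossover(p1, p2):
    """Concatenate path I_{u,v} and path I_{s,t} if v == s;
    otherwise no valid offspring is produced."""
    if p1[-1] == p2[0]:
        return p1 + p2[1:]
    return None
```

For example, crossover([0, 1, 2], [2, 3]) yields [0, 1, 2, 3].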

Steady State GA:
1. Set P = {I_{u,v} = (u, v) | (u, v) ∈ E}.
2. Choose r ∈ [0,1] uniformly at random.
3. If r ≤ p_c, choose two individuals I_{x,y} ∈ P and I_{x',y'} ∈ P uniformly at random and perform crossover to obtain an individual I_{s,t}; else choose an individual I_{x,y} ∈ P uniformly at random and mutate it to obtain an individual I_{s,t}.
4. If I_{s,t} is a path from s to t: if there is no individual I'_{s,t} ∈ P, set P = P ∪ {I_{s,t}}; else if f(I_{s,t}) ≤ f(I'_{s,t}), set P = (P ∪ {I_{s,t}}) \ {I'_{s,t}}.
5. Repeat steps 2-4 forever. (The crossover probability p_c is a constant.)

Theorem: the expected optimization time of the Steady State GA is O(n^{3.5}·√(log n)).
By mutation alone, with l := √(n log n), all shortest paths with at most l edges are obtained. It remains to show: longer paths are obtained by crossover within the stated time bound.

Analysis of Crossover
Long paths by crossover. Assumption: all shortest paths with at most l edges have already been obtained. More generally, assume that all shortest paths with at most k ≥ l edges have been obtained. What is the expected time to obtain all shortest paths of length at most 3k/2?
Consider a pair of vertices x and y for which a shortest path of r edges, k < r ≤ 3k/2, exists. There are at least 2k + 1 − r pairs of shortest paths of length at most k that can be joined to obtain a shortest path from x to y. The probability for one specific pair is at least 1/n⁴; with at least 2k + 1 − r possible pairs, the probability is at least (2k + 1 − r)/n⁴ ≥ k/(2n⁴). There are at most n² shortest paths of length r with k < r ≤ 3k/2, so the time to collect all these paths is O(n⁴ log n / k) (similar to the Coupon Collector's Theorem).

Analysis of Crossover (Ctnd.)
Sum up over the different values of k, namely √(n log n), c·√(n log n), c²·√(n log n), ..., c^{log_c(n/√(n log n))}·√(n log n), where c = 3/2. Expected optimization time:
Σ_{s=0}^{log_c(n/√(n log n))} O(n⁴ log n / (√(n log n)·c^s)) = O(n^{3.5}·√(log n)) · Σ_{s=0}^{∞} c^{−s} = O(n^{3.5}·√(log n)).

Agenda

Makespan Scheduling
What about NP-hard problems? Study the approximation quality. Makespan scheduling on 2 machines: n objects with weights/processing times w_1, ..., w_n; 2 machines (bins); minimize the total weight of the fuller bin, the makespan. Formally, find I ⊆ {1, ..., n} minimizing max{Σ_{i∈I} w_i, Σ_{i∉I} w_i}. Sometimes also called the Partition problem. This is an "easy" NP-hard problem; good approximations are possible.

Fitness Function
Problem encoding: a bit string x_1, ..., x_n reserves a bit for each object; put object i into bin x_i + 1. Fitness function, to be minimized:
f(x_1, ..., x_n) := max{Σ_{i=1}^n w_i x_i, Σ_{i=1}^n w_i (1 − x_i)}.
Consider the (1+1) EA and RLS.
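The encoding and fitness function translate directly into code; a sketch of the (1+1) EA for makespan scheduling (the instance and step budget in the usage below are illustrative):

```python
import random

def makespan(w, x):
    """Fitness to minimize: total weight of the fuller machine."""
    load = sum(wi for wi, xi in zip(w, x) if xi)
    return max(load, sum(w) - load)

def ea_makespan(w, steps, seed=0):
    """(1+1) EA minimizing the makespan on two machines."""
    rng = random.Random(seed)
    n = len(w)
    x = [rng.randint(0, 1) for _ in range(n)]
    for _ in range(steps):
        y = [b ^ (rng.random() < 1 / n) for b in x]
        if makespan(w, y) <= makespan(w, x):
            x = y
    return makespan(w, x)
```

On an instance such as w = [1, 1, 2, 2, 3, 3] (optimum 6), the EA quickly reaches a makespan within the 4/3 ratio guaranteed by the theorem below.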

Types of Results
Worst-case results; success probabilities and approximations; an average-case analysis; a parameterized analysis.

Sufficient Conditions for Progress
Abbreviate S := w_1 + ... + w_n, so a perfect partition has cost S/2. Suppose we know s = size of the smallest object in the fuller bin and f(x) > S/2 + s/2 for the current search point x. Then the solution is improvable by a single-bit flip. If f(x) < S/2 + s/2, no improvements can be guaranteed.
Lemma: if the smallest object in the fuller bin is always bounded by s, then the (1+1) EA and RLS reach an f-value ≤ S/2 + s/2 in an expected number of O(n²) steps.

Worst-Case Results
Theorem: on any instance of the makespan scheduling problem, the (1+1) EA and RLS reach a solution with approximation ratio 4/3 in expected time O(n²). The proof uses a study of the object sizes and the previous lemma.

Worst-Case Instance
The instance W_ε = {w_1, ..., w_n} is defined by w_1 := w_2 := 1/3 − ε/4 (big objects) and w_i := (1/3 + ε/2)/(n − 2) for 3 ≤ i ≤ n, where ε is a very small constant and n is even. The sum is 1, and there is a perfect partition. But if one bin holds the big objects and the other the small ones, the value is 2/3 − ε/2. Moving a big object into the emptier bin gives value (1/3 + ε/2) + (1/3 − ε/4) = 2/3 + ε/4, which is worse. One would need to move Ω(εn) small objects at once for an improvement: very unlikely.
Theorem: there is an instance W_ε such that with probability Ω(1) the (1+1) EA and RLS need at least n^{Ω(n)} steps to find a solution with a better ratio than 4/3 − ε. With constant probability the search reaches this situation, and then n^{Ω(n)} steps are needed to escape.

Worst Case: PRAS by Parallelism
The previous result shows: success depends on the big objects. Set s := ⌈2/ε⌉. Assuming w_1 ≥ ... ≥ w_n, we have w_i ≤ ε·S/2 for i ≥ s.
Theorem: on any instance, the (1+1) EA and RLS find a (1+ε)-approximation within O(n ln(1/ε)) steps with probability at least 2^{−c⌈1/ε⌉ ln(1/ε)}. Hence 2^{O(⌈1/ε⌉ ln(1/ε))} parallel runs find a (1+ε)-approximation with probability ≥ 3/4 in O(n ln(1/ε)) parallel steps. The parallel runs form a polynomial-time randomized approximation scheme (PRAS)!
Proof idea: analyze the probability of distributing the s − 1 large objects in an optimal way; the small objects are treated greedily, so the error is at most ε·S/2. Random search rediscovers the algorithmic idea of early approximation algorithms.

Average-Case Analyses
Models: each weight is drawn independently at random, namely (1) uniformly from the interval [0,1], or (2) exponentially distributed with parameter 1 (i.e., Prob(X ≥ t) = e^{−t} for t ≥ 0). The approximation ratio is no longer meaningful; instead we investigate the discrepancy = absolute difference between the weights of the bins. How close to discrepancy 0 do we come?

Known Average-Case Results
Deterministic, problem-specific heuristic LPT: sort the weights decreasingly, put every object into the currently emptier bin. Known for both random models: LPT creates a solution with discrepancy O((log n)/n). What discrepancy do the (1+1) EA and RLS reach in poly-time?
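For comparison, the LPT heuristic fits in a few lines (the instance in the usage below is illustrative):

```python
def lpt(weights):
    """Longest Processing Time rule: sort decreasingly and put each
    object on the currently emptier machine; return the makespan."""
    bins = [0, 0]
    for w in sorted(weights, reverse=True):
        bins[bins.index(min(bins))] += w
    return max(bins)
```

For example, lpt([4, 3, 3, 2, 2]) returns 8 (here LPT misses the perfect split 7/7, illustrating that LPT is a heuristic, not exact).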

Average-Case Analysis of the (1+1) EA
Theorem: in both models, the (1+1) EA reaches discrepancy O((log n)/n) after O(n^{c+4} log² n) steps with probability 1 − O(1/n^c). Almost the same result as for LPT!
The proof exploits order statistics: if X_(i) (the i-th largest weight) lies in the fuller bin, X_(i+1) in the emptier one, and the discrepancy is larger than 2(X_(i) − X_(i+1)) > 0, then the two objects can be swapped and the discrepancy falls. Consider such difference objects. W.h.p. X_(i) − X_(i+1) = O((log n)/n) (for i = Ω(n)).

A Parameterized Analysis
We have seen: the problem is hard for the (1+1) EA/RLS in the worst case, but not so hard on average. What parameters make the problem hard?
Definition: a problem is fixed-parameter tractable (FPT) with respect to a problem parameter k if it can be solved in time f(k)·poly(n), where f(k) does not depend on n. Intuition: for small k, we have an efficient algorithm.
Considered parameters (Sutton and Neumann, 2012): (1) value of the optimal solution, (2) number of jobs on the fuller machine in an optimal solution, (3) unbalance of the optimal solution.

Value of Optimal Solution
Recall the approximation result: there is a decent chance to distribute k big jobs optimally if k is small. Since w_1 ≥ ... ≥ w_n, already w_k ≤ S/k. Consequence: from an optimal distribution of the first k objects, one can reach makespan ≤ S/2 + S/k by greedily treating the other objects.
Theorem: the (1+1) EA and RLS find a solution of makespan ≤ S/2 + S/k with probability Ω((2k)^{−ek}) in time O(n log k). Multistarts have success probability ≥ 1/2 after O(2^{(e+1)k} k^{ek} n log k) evaluations. Since 2^{(e+1)k} k^{ek} log k does not depend on n, this is a randomized FPT algorithm.

No. Objects on Fuller Machine
Suppose an optimal solution puts only k objects on the fuller machine; k is then called the critical path size. Intuition: good chance of putting k objects on the same machine if k is small; the other objects can be moved greedily.
Theorem: for critical path size k, multistart RLS finds the optimum in O(2^k (en)^{ck} n log n) evaluations with probability ≥ 1/2.
Due to the term n^{ck}, the result is somewhat weaker than FPT (a so-called XP algorithm); still, it is polynomial for constant k. Remark: with the (1+1) EA one gets an additional log w_1 term.

Unbalance of Optimal Solution
Consider the discrepancy of the optimum, Δ := 2(OPT − S/2). Question/decision problem: is w_k ≥ Δ ≥ w_{k+1}? Observation: if Δ ≥ w_{k+1}, an optimal solution will put w_{k+1}, ..., w_n on the emptier machine; it is crucial to distribute the first k objects optimally.
Theorem: multistart RLS with biased mutation (touching the objects w_1, ..., w_k with probability 1/(kn) each) solves the decision problem in O(2^k n³ log n) evaluations with probability ≥ 1/2. Again, a randomized FPT algorithm.

Agenda

The Vertex Cover Problem
Given an undirected graph G = (V, E), find a minimum vertex cover, i.e., a smallest set of vertices such that every edge has at least one endpoint in the set. Simple single-objective evolutionary algorithms fail!
Integer Linear Program (ILP): min Σ_{i=1}^n x_i subject to x_i + x_j ≥ 1 for all {i, j} ∈ E, x_i ∈ {0, 1}.
Linear Program (LP): min Σ_{i=1}^n x_i subject to x_i + x_j ≥ 1 for all {i, j} ∈ E, x_i ∈ [0, 1].

Evolutionary Algorithm
(Slides 68-71 present the evolutionary algorithm for vertex cover as figures; they are not recoverable from this transcription.)

Expected time O(4^{OPT} · poly(n)).

Combination with Linear Programming
The LP relaxation of vertex cover is half-integral, i.e., there is an optimal LP solution with x_i ∈ {0, 1/2, 1}.

Approximations
(Slides 74-75 present the details as figures; they are not recoverable from this transcription.)

Agenda

Euclidean TSP
Given n points in the plane and Euclidean distances between the cities, find a shortest tour that visits each city exactly once and returns to the origin. NP-hard; admits a PTAS; FPT when the number of inner points is the parameter.

Representation and Mutation
Representation: a permutation of the n cities, for example (3,4,1,2,5). Inversion (inv) as mutation operator: select i, j from {1, ..., n} uniformly at random and invert the part from position i to position j. inv(2,5) applied to (3,4,1,2,5) yields (3,5,2,1,4).

(1+1) EA:
x ← a random permutation of [n]
repeat forever:
  y ← MUTATE(x)
  if f(y) < f(x) then x ← y
Mutation: k random inversions, with k chosen according to 1 + Pois(1).
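The inversion operator and the Poisson-distributed number of inversions can be sketched as follows (Poisson(1) is sampled with Knuth's product-of-uniforms method to stay within the standard library; the helper names are illustrative):

```python
import math
import random

def inversion(perm, i, j):
    """Invert the segment from position i to position j (1-indexed),
    as on the slide: inv(2,5) on (3,4,1,2,5) gives (3,5,2,1,4)."""
    p = list(perm)
    p[i - 1:j] = reversed(p[i - 1:j])
    return p

def mutate(perm, rng):
    """Apply k random inversions with k ~ 1 + Pois(1)."""
    k, t = 0, rng.random()
    while t > math.exp(-1):          # Knuth's Poisson(1) sampler
        k += 1
        t *= rng.random()
    for _ in range(1 + k):
        i, j = sorted(rng.sample(range(1, len(perm) + 1), 2))
        perm = inversion(perm, i, j)
    return perm
```

Note that this sketch draws i ≠ j; since inversions only reverse segments, any sequence of mutations keeps the individual a valid permutation.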

Intersections and Mutation
k inner points; convex hull containing the remaining n − k points. (Slide figures not transcribed.)

Angle-Bounded Sets of Points
There may be an exponential number of inversions before the search ends up in a local optimum if the points are in arbitrary position (Englert et al., 2007). We therefore assume that the point set V is angle-bounded: V is angle-bounded by ε > 0 if for any three points u, v, w ∈ V we have 0 < ε < θ < π − ε, where θ denotes the angle formed by the line from u to v and the line from v to w. If V is angle-bounded, we get a lower bound on the improvement of an inversion depending on ε.

Progress
Assumptions: d_max is the maximum distance between any two points, d_min the minimum distance, and V is angle-bounded by ε. Whenever the current tour is not intersection-free, we can guarantee a certain progress.
Lemma: let x be a permutation whose tour is not intersection-free, and let y be the permutation constructed from an inversion on x that replaces two intersecting edges with two non-intersecting edges. Then f(x) − f(y) > 2·d_min·(1 − cos(ε))/cos(ε).

Tours
A tour x is either intersection-free or not. Intersection-free tours are good: the points on the convex hull are already visited in the right order (Quintas and Supnick, 1965). Claim: we do not spend too much time on non-intersection-free tours.

Time Spent on Intersecting Tours
Lemma: let (x^(1), x^(2), ..., x^(t), ...) denote the sequence of permutations generated by the (1+1) EA, and let ψ be the indicator variable on permutations of [n] with ψ(x) = 1 if x contains intersections and ψ(x) = 0 otherwise. Then E[Σ_{t≥1} ψ(x^(t))] = O(n³ · (d_max/d_min) · cos(ε)/(1 − cos(ε))). For points on an m × m grid this bound becomes O(n³ m⁵).

Parameterized Result
Lemma: suppose V has k inner points and x is an intersection-free tour on V. Then there is a sequence of at most 2k inversions that transforms x into an optimal permutation.
Theorem: let V be a set of points quantized on an m × m grid and let k be the number of inner points. Then the expected optimization time of the (1+1) EA on V is O(n³ m⁵) + O(n^{4k} (2k − 1)!).

Summary and Conclusions
Runtime analysis of RSHs in combinatorial optimization, starting from toy problems and moving to real problems. Insight into working principles using runtime analysis. General-purpose algorithms are successful for a wide range of problems. Interesting, general techniques. Runtime analysis of new approaches is possible! An exciting research direction. Thank you!

References
- F. Neumann and C. Witt (2010): Bioinspired Computation in Combinatorial Optimization: Algorithms and Their Computational Complexity. Springer.
- A. Auger and B. Doerr (2011): Theory of Randomized Search Heuristics: Foundations and Recent Developments. World Scientific Publishing.
- F. Neumann and I. Wegener (2007): Randomized local search, evolutionary algorithms, and the minimum spanning tree problem. Theoretical Computer Science 378(1).
- O. Giel and I. Wegener (2003): Evolutionary algorithms and the maximum matching problem. Proc. of STACS '03, LNCS 2607, Springer.
- B. Doerr, E. Happ and C. Klein (2012): Crossover can provably be useful in evolutionary computation. Theoretical Computer Science 425.
- C. Witt (2005): Worst-case and average-case approximations by simple randomized search heuristics. Proc. of STACS 2005, LNCS 3404, 44-56, Springer.
- T. Friedrich, J. He, N. Hebbinghaus, F. Neumann and C. Witt (2010): Approximating covering problems by randomized search heuristics using multi-objective models. Evolutionary Computation 18(4).
- S. Kratsch and F. Neumann (2009): Fixed-parameter evolutionary algorithms and the vertex cover problem. Proc. of GECCO 2009, ACM.
- A. M. Sutton and F. Neumann (2012): A parameterized runtime analysis of evolutionary algorithms for the Euclidean traveling salesperson problem. Proc. of AAAI 2012.


More information

Runtime Analysis of a Simple Ant Colony Optimization Algorithm

Runtime Analysis of a Simple Ant Colony Optimization Algorithm Algorithmica (2009) 54: 243 255 DOI 10.1007/s00453-007-9134-2 Runtime Analysis of a Simple Ant Colony Optimization Algorithm Frank Neumann Carsten Witt Received: 22 January 2007 / Accepted: 20 November

More information

REIHE COMPUTATIONAL INTELLIGENCE COLLABORATIVE RESEARCH CENTER 531

REIHE COMPUTATIONAL INTELLIGENCE COLLABORATIVE RESEARCH CENTER 531 U N I V E R S I T Y OF D O R T M U N D REIHE COMPUTATIONAL INTELLIGENCE COLLABORATIVE RESEARCH CENTER 531 Design and Management of Complex Technical Processes and Systems by means of Computational Intelligence

More information

TECHNISCHE UNIVERSITÄT DORTMUND REIHE COMPUTATIONAL INTELLIGENCE COLLABORATIVE RESEARCH CENTER 531

TECHNISCHE UNIVERSITÄT DORTMUND REIHE COMPUTATIONAL INTELLIGENCE COLLABORATIVE RESEARCH CENTER 531 TECHNISCHE UNIVERSITÄT DORTMUND REIHE COMPUTATIONAL INTELLIGENCE COLLABORATIVE RESEARCH CENTER 531 Design and Management of Complex Technical Processes and Systems by means of Computational Intelligence

More information

Runtime Analysis of Evolutionary Algorithms for the Knapsack Problem with Favorably Correlated Weights

Runtime Analysis of Evolutionary Algorithms for the Knapsack Problem with Favorably Correlated Weights Runtime Analysis of Evolutionary Algorithms for the Knapsack Problem with Favorably Correlated Weights Frank Neumann 1 and Andrew M. Sutton 2 1 Optimisation and Logistics, School of Computer Science, The

More information

UNIVERSITY OF DORTMUND

UNIVERSITY OF DORTMUND UNIVERSITY OF DORTMUND REIHE COMPUTATIONAL INTELLIGENCE COLLABORATIVE RESEARCH CENTER 531 Design and Management of Complex Technical Processes and Systems by means of Computational Intelligence Methods

More information

Easy Problems vs. Hard Problems. CSE 421 Introduction to Algorithms Winter Is P a good definition of efficient? The class P

Easy Problems vs. Hard Problems. CSE 421 Introduction to Algorithms Winter Is P a good definition of efficient? The class P Easy Problems vs. Hard Problems CSE 421 Introduction to Algorithms Winter 2000 NP-Completeness (Chapter 11) Easy - problems whose worst case running time is bounded by some polynomial in the size of the

More information

REIHE COMPUTATIONAL INTELLIGENCE COLLABORATIVE RESEARCH CENTER 531

REIHE COMPUTATIONAL INTELLIGENCE COLLABORATIVE RESEARCH CENTER 531 U N I V E R S I T Y OF D O R T M U N D REIHE COMPUTATIONAL INTELLIGENCE COLLABORATIVE RESEARCH CENTER 531 Design and Management of Complex Technical Processes and Systems by means of Computational Intelligence

More information

METHODS FOR THE ANALYSIS OF EVOLUTIONARY ALGORITHMS ON PSEUDO-BOOLEAN FUNCTIONS

METHODS FOR THE ANALYSIS OF EVOLUTIONARY ALGORITHMS ON PSEUDO-BOOLEAN FUNCTIONS METHODS FOR THE ANALYSIS OF EVOLUTIONARY ALGORITHMS ON PSEUDO-BOOLEAN FUNCTIONS Ingo Wegener FB Informatik, LS2, Univ. Dortmund, 44221 Dortmund, Germany wegener@ls2.cs.uni-dortmund.de Abstract Many experiments

More information

P,NP, NP-Hard and NP-Complete

P,NP, NP-Hard and NP-Complete P,NP, NP-Hard and NP-Complete We can categorize the problem space into two parts Solvable Problems Unsolvable problems 7/11/2011 1 Halting Problem Given a description of a program and a finite input, decide

More information

An Analysis on Recombination in Multi-Objective Evolutionary Optimization

An Analysis on Recombination in Multi-Objective Evolutionary Optimization An Analysis on Recombination in Multi-Objective Evolutionary Optimization Chao Qian, Yang Yu, Zhi-Hua Zhou National Key Laboratory for Novel Software Technology Nanjing University, Nanjing 20023, China

More information

Runtime Analysis of a Binary Particle Swarm Optimizer

Runtime Analysis of a Binary Particle Swarm Optimizer Runtime Analysis of a Binary Particle Swarm Optimizer Dirk Sudholt Fakultät für Informatik, LS 2 Technische Universität Dortmund Dortmund, Germany Carsten Witt Fakultät für Informatik, LS 2 Technische

More information

Optimizing Linear Functions with Randomized Search Heuristics - The Robustness of Mutation

Optimizing Linear Functions with Randomized Search Heuristics - The Robustness of Mutation Downloaded from orbit.dtu.dk on: Oct 12, 2018 Optimizing Linear Functions with Randomized Search Heuristics - The Robustness of Mutation Witt, Carsten Published in: 29th International Symposium on Theoretical

More information

A Lower Bound Analysis of Population-based Evolutionary Algorithms for Pseudo-Boolean Functions

A Lower Bound Analysis of Population-based Evolutionary Algorithms for Pseudo-Boolean Functions A Lower Bound Analysis of Population-based Evolutionary Algorithms for Pseudo-Boolean Functions Chao Qian,2, Yang Yu 2, and Zhi-Hua Zhou 2 UBRI, School of Computer Science and Technology, University of

More information

P, NP, NP-Complete. Ruth Anderson

P, NP, NP-Complete. Ruth Anderson P, NP, NP-Complete Ruth Anderson A Few Problems: Euler Circuits Hamiltonian Circuits Intractability: P and NP NP-Complete What now? Today s Agenda 2 Try it! Which of these can you draw (trace all edges)

More information

Algorithms Design & Analysis. Approximation Algorithm

Algorithms Design & Analysis. Approximation Algorithm Algorithms Design & Analysis Approximation Algorithm Recap External memory model Merge sort Distribution sort 2 Today s Topics Hard problem Approximation algorithms Metric traveling salesman problem A

More information

NP-Completeness. Until now we have been designing algorithms for specific problems

NP-Completeness. Until now we have been designing algorithms for specific problems NP-Completeness 1 Introduction Until now we have been designing algorithms for specific problems We have seen running times O(log n), O(n), O(n log n), O(n 2 ), O(n 3 )... We have also discussed lower

More information

Runtime Analysis of Binary PSO

Runtime Analysis of Binary PSO Runtime Analysis of Binary PSO Dirk Sudholt Fakultät für Informatik, LS 2 Technische Universität Dortmund Dortmund, Germany Carsten Witt Fakultät für Informatik, LS 2 Technische Universität Dortmund Dortmund,

More information

On the Impact of Objective Function Transformations on Evolutionary and Black-Box Algorithms

On the Impact of Objective Function Transformations on Evolutionary and Black-Box Algorithms On the Impact of Objective Function Transformations on Evolutionary and Black-Box Algorithms [Extended Abstract] Tobias Storch Department of Computer Science 2, University of Dortmund, 44221 Dortmund,

More information

Chapter 11. Approximation Algorithms. Slides by Kevin Wayne Pearson-Addison Wesley. All rights reserved.

Chapter 11. Approximation Algorithms. Slides by Kevin Wayne Pearson-Addison Wesley. All rights reserved. Chapter 11 Approximation Algorithms Slides by Kevin Wayne. Copyright @ 2005 Pearson-Addison Wesley. All rights reserved. 1 Approximation Algorithms Q. Suppose I need to solve an NP-hard problem. What should

More information

TECHNISCHE UNIVERSITÄT DORTMUND REIHE COMPUTATIONAL INTELLIGENCE COLLABORATIVE RESEARCH CENTER 531

TECHNISCHE UNIVERSITÄT DORTMUND REIHE COMPUTATIONAL INTELLIGENCE COLLABORATIVE RESEARCH CENTER 531 TECHNISCHE UNIVERSITÄT DORTMUND REIHE COMPUTATIONAL INTELLIGENCE COLLABORATIVE RESEARCH CENTER 531 Design and Management of Complex Technical Processes and Systems by means of Computational Intelligence

More information

Crossover can be constructive when computing unique input output sequences

Crossover can be constructive when computing unique input output sequences Crossover can be constructive when computing unique input output sequences Per Kristian Lehre and Xin Yao The Centre of Excellence for Research in Computational Intelligence and Applications (CERCIA),

More information

A An Overview of Complexity Theory for the Algorithm Designer

A An Overview of Complexity Theory for the Algorithm Designer A An Overview of Complexity Theory for the Algorithm Designer A.1 Certificates and the class NP A decision problem is one whose answer is either yes or no. Two examples are: SAT: Given a Boolean formula

More information

A Comparison of GAs Penalizing Infeasible Solutions and Repairing Infeasible Solutions on the 0-1 Knapsack Problem

A Comparison of GAs Penalizing Infeasible Solutions and Repairing Infeasible Solutions on the 0-1 Knapsack Problem A Comparison of GAs Penalizing Infeasible Solutions and Repairing Infeasible Solutions on the 0-1 Knapsack Problem Jun He 1, Yuren Zhou 2, and Xin Yao 3 1 J. He is with the Department of Computer Science,

More information

Computing Single Source Shortest Paths using Single-Objective Fitness Functions

Computing Single Source Shortest Paths using Single-Objective Fitness Functions Computing Single Source Shortest Paths using Single-Objective Fitness Functions Surender Baswana Somenath Biswas Benjamin Doerr Tobias Friedrich Piyush P. Kurur Department of Computer Science and Engineering

More information

Evolutionary Algorithms How to Cope With Plateaus of Constant Fitness and When to Reject Strings of The Same Fitness

Evolutionary Algorithms How to Cope With Plateaus of Constant Fitness and When to Reject Strings of The Same Fitness Evolutionary Algorithms How to Cope With Plateaus of Constant Fitness and When to Reject Strings of The Same Fitness Thomas Jansen and Ingo Wegener FB Informatik, LS 2, Univ. Dortmund, 44221 Dortmund,

More information

When to use bit-wise neutrality

When to use bit-wise neutrality Nat Comput (010) 9:83 94 DOI 10.1007/s11047-008-9106-8 When to use bit-wise neutrality Tobias Friedrich Æ Frank Neumann Published online: 6 October 008 Ó Springer Science+Business Media B.V. 008 Abstract

More information

Preliminaries and Complexity Theory

Preliminaries and Complexity Theory Preliminaries and Complexity Theory Oleksandr Romanko CAS 746 - Advanced Topics in Combinatorial Optimization McMaster University, January 16, 2006 Introduction Book structure: 2 Part I Linear Algebra

More information

In complexity theory, algorithms and problems are classified by the growth order of computation time as a function of instance size.

In complexity theory, algorithms and problems are classified by the growth order of computation time as a function of instance size. 10 2.2. CLASSES OF COMPUTATIONAL COMPLEXITY An optimization problem is defined as a class of similar problems with different input parameters. Each individual case with fixed parameter values is called

More information

NP-Completeness. Andreas Klappenecker. [based on slides by Prof. Welch]

NP-Completeness. Andreas Klappenecker. [based on slides by Prof. Welch] NP-Completeness Andreas Klappenecker [based on slides by Prof. Welch] 1 Prelude: Informal Discussion (Incidentally, we will never get very formal in this course) 2 Polynomial Time Algorithms Most of the

More information

Introduction to Complexity Theory

Introduction to Complexity Theory Introduction to Complexity Theory Read K & S Chapter 6. Most computational problems you will face your life are solvable (decidable). We have yet to address whether a problem is easy or hard. Complexity

More information

P is the class of problems for which there are algorithms that solve the problem in time O(n k ) for some constant k.

P is the class of problems for which there are algorithms that solve the problem in time O(n k ) for some constant k. Complexity Theory Problems are divided into complexity classes. Informally: So far in this course, almost all algorithms had polynomial running time, i.e., on inputs of size n, worst-case running time

More information

Unit 1A: Computational Complexity

Unit 1A: Computational Complexity Unit 1A: Computational Complexity Course contents: Computational complexity NP-completeness Algorithmic Paradigms Readings Chapters 3, 4, and 5 Unit 1A 1 O: Upper Bounding Function Def: f(n)= O(g(n)) if

More information

P, NP, NP-Complete, and NPhard

P, NP, NP-Complete, and NPhard P, NP, NP-Complete, and NPhard Problems Zhenjiang Li 21/09/2011 Outline Algorithm time complicity P and NP problems NP-Complete and NP-Hard problems Algorithm time complicity Outline What is this course

More information

UNIVERSITY OF DORTMUND

UNIVERSITY OF DORTMUND UNIVERSITY OF DORTMUND REIHE COMPUTATIONAL INTELLIGENCE COLLABORATIVE RESEARCH CENTER 531 Design and Management of Complex Technical Processes and Systems by means of Computational Intelligence Methods

More information

REIHE COMPUTATIONAL INTELLIGENCE COLLABORATIVE RESEARCH CENTER 531

REIHE COMPUTATIONAL INTELLIGENCE COLLABORATIVE RESEARCH CENTER 531 U N I V E R S I T Y OF D O R T M U N D REIHE COMPUTATIONAL INTELLIGENCE COLLABORATIVE RESEARCH CENTER 53 Design and Management of Complex Technical Processes and Systems by means of Computational Intelligence

More information

Computer Science 385 Analysis of Algorithms Siena College Spring Topic Notes: Limitations of Algorithms

Computer Science 385 Analysis of Algorithms Siena College Spring Topic Notes: Limitations of Algorithms Computer Science 385 Analysis of Algorithms Siena College Spring 2011 Topic Notes: Limitations of Algorithms We conclude with a discussion of the limitations of the power of algorithms. That is, what kinds

More information

Outline. 1 NP-Completeness Theory. 2 Limitation of Computation. 3 Examples. 4 Decision Problems. 5 Verification Algorithm

Outline. 1 NP-Completeness Theory. 2 Limitation of Computation. 3 Examples. 4 Decision Problems. 5 Verification Algorithm Outline 1 NP-Completeness Theory 2 Limitation of Computation 3 Examples 4 Decision Problems 5 Verification Algorithm 6 Non-Deterministic Algorithm 7 NP-Complete Problems c Hu Ding (Michigan State University)

More information

CS 781 Lecture 9 March 10, 2011 Topics: Local Search and Optimization Metropolis Algorithm Greedy Optimization Hopfield Networks Max Cut Problem Nash

CS 781 Lecture 9 March 10, 2011 Topics: Local Search and Optimization Metropolis Algorithm Greedy Optimization Hopfield Networks Max Cut Problem Nash CS 781 Lecture 9 March 10, 2011 Topics: Local Search and Optimization Metropolis Algorithm Greedy Optimization Hopfield Networks Max Cut Problem Nash Equilibrium Price of Stability Coping With NP-Hardness

More information

UNIVERSITY OF DORTMUND

UNIVERSITY OF DORTMUND UNIVERSITY OF DORTMUND REIHE COMPUTATIONAL INTELLIGENCE COLLABORATIVE RESEARCH CENTER 531 Design and Management of Complex Technical Processes and Systems by means of Computational Intelligence Methods

More information

More Approximation Algorithms

More Approximation Algorithms CS 473: Algorithms, Spring 2018 More Approximation Algorithms Lecture 25 April 26, 2018 Most slides are courtesy Prof. Chekuri Ruta (UIUC) CS473 1 Spring 2018 1 / 28 Formal definition of approximation

More information

Approximation Algorithms

Approximation Algorithms Approximation Algorithms What do you do when a problem is NP-complete? or, when the polynomial time solution is impractically slow? assume input is random, do expected performance. Eg, Hamiltonian path

More information

Methods for finding optimal configurations

Methods for finding optimal configurations CS 1571 Introduction to AI Lecture 9 Methods for finding optimal configurations Milos Hauskrecht milos@cs.pitt.edu 5329 Sennott Square Search for the optimal configuration Optimal configuration search:

More information

Multiplicative Drift Analysis

Multiplicative Drift Analysis Multiplicative Drift Analysis Benjamin Doerr Daniel Johannsen Carola Winzen Max-Planck-Institut für Informatik Campus E1 4 66123 Saarbrücken, Germany arxiv:1101.0776v1 [cs.ne] 4 Jan 2011 Submitted January

More information

Algorithms. Outline! Approximation Algorithms. The class APX. The intelligence behind the hardware. ! Based on

Algorithms. Outline! Approximation Algorithms. The class APX. The intelligence behind the hardware. ! Based on 6117CIT - Adv Topics in Computing Sci at Nathan 1 Algorithms The intelligence behind the hardware Outline! Approximation Algorithms The class APX! Some complexity classes, like PTAS and FPTAS! Illustration

More information

CS6999 Probabilistic Methods in Integer Programming Randomized Rounding Andrew D. Smith April 2003

CS6999 Probabilistic Methods in Integer Programming Randomized Rounding Andrew D. Smith April 2003 CS6999 Probabilistic Methods in Integer Programming Randomized Rounding April 2003 Overview 2 Background Randomized Rounding Handling Feasibility Derandomization Advanced Techniques Integer Programming

More information

NP-completeness. Chapter 34. Sergey Bereg

NP-completeness. Chapter 34. Sergey Bereg NP-completeness Chapter 34 Sergey Bereg Oct 2017 Examples Some problems admit polynomial time algorithms, i.e. O(n k ) running time where n is the input size. We will study a class of NP-complete problems

More information

Limitations of Algorithm Power

Limitations of Algorithm Power Limitations of Algorithm Power Objectives We now move into the third and final major theme for this course. 1. Tools for analyzing algorithms. 2. Design strategies for designing algorithms. 3. Identifying

More information

Computational Complexity. IE 496 Lecture 6. Dr. Ted Ralphs

Computational Complexity. IE 496 Lecture 6. Dr. Ted Ralphs Computational Complexity IE 496 Lecture 6 Dr. Ted Ralphs IE496 Lecture 6 1 Reading for This Lecture N&W Sections I.5.1 and I.5.2 Wolsey Chapter 6 Kozen Lectures 21-25 IE496 Lecture 6 2 Introduction to

More information

Now just show p k+1 small vs opt. Some machine gets k=m jobs of size at least p k+1 Can also think of it as a family of (1 + )-approximation algorithm

Now just show p k+1 small vs opt. Some machine gets k=m jobs of size at least p k+1 Can also think of it as a family of (1 + )-approximation algorithm Review: relative approx. alg by comparison to lower bounds. TS with triangle inequality: A nongreedy algorithm, relating to a clever lower bound nd minimum spanning tree claim costs less than TS tour Double

More information

Experimental Supplements to the Theoretical Analysis of EAs on Problems from Combinatorial Optimization

Experimental Supplements to the Theoretical Analysis of EAs on Problems from Combinatorial Optimization Experimental Supplements to the Theoretical Analysis of EAs on Problems from Combinatorial Optimization Patrick Briest, Dimo Brockhoff, Bastian Degener, Matthias Englert, Christian Gunia, Oliver Heering,

More information

Metaheuristics and Local Search. Discrete optimization problems. Solution approaches

Metaheuristics and Local Search. Discrete optimization problems. Solution approaches Discrete Mathematics for Bioinformatics WS 07/08, G. W. Klau, 31. Januar 2008, 11:55 1 Metaheuristics and Local Search Discrete optimization problems Variables x 1,...,x n. Variable domains D 1,...,D n,

More information

CS155: Probability and Computing: Randomized Algorithms and Probabilistic Analysis

CS155: Probability and Computing: Randomized Algorithms and Probabilistic Analysis CS155: Probability and Computing: Randomized Algorithms and Probabilistic Analysis Eli Upfal Eli Upfal@brown.edu Office: 319 TA s: Lorenzo De Stefani and Sorin Vatasoiu cs155tas@cs.brown.edu It is remarkable

More information

CS/COE

CS/COE CS/COE 1501 www.cs.pitt.edu/~nlf4/cs1501/ P vs NP But first, something completely different... Some computational problems are unsolvable No algorithm can be written that will always produce the correct

More information

NP-Complete Problems and Approximation Algorithms

NP-Complete Problems and Approximation Algorithms NP-Complete Problems and Approximation Algorithms Efficiency of Algorithms Algorithms that have time efficiency of O(n k ), that is polynomial of the input size, are considered to be tractable or easy

More information

Metaheuristics and Local Search

Metaheuristics and Local Search Metaheuristics and Local Search 8000 Discrete optimization problems Variables x 1,..., x n. Variable domains D 1,..., D n, with D j Z. Constraints C 1,..., C m, with C i D 1 D n. Objective function f :

More information

ECS122A Handout on NP-Completeness March 12, 2018

ECS122A Handout on NP-Completeness March 12, 2018 ECS122A Handout on NP-Completeness March 12, 2018 Contents: I. Introduction II. P and NP III. NP-complete IV. How to prove a problem is NP-complete V. How to solve a NP-complete problem: approximate algorithms

More information

A Runtime Analysis of Parallel Evolutionary Algorithms in Dynamic Optimization

A Runtime Analysis of Parallel Evolutionary Algorithms in Dynamic Optimization Algorithmica (2017) 78:641 659 DOI 10.1007/s00453-016-0262-4 A Runtime Analysis of Parallel Evolutionary Algorithms in Dynamic Optimization Andrei Lissovoi 1 Carsten Witt 2 Received: 30 September 2015

More information

Week Cuts, Branch & Bound, and Lagrangean Relaxation

Week Cuts, Branch & Bound, and Lagrangean Relaxation Week 11 1 Integer Linear Programming This week we will discuss solution methods for solving integer linear programming problems. I will skip the part on complexity theory, Section 11.8, although this is

More information

12. LOCAL SEARCH. gradient descent Metropolis algorithm Hopfield neural networks maximum cut Nash equilibria

12. LOCAL SEARCH. gradient descent Metropolis algorithm Hopfield neural networks maximum cut Nash equilibria 12. LOCAL SEARCH gradient descent Metropolis algorithm Hopfield neural networks maximum cut Nash equilibria Lecture slides by Kevin Wayne Copyright 2005 Pearson-Addison Wesley h ttp://www.cs.princeton.edu/~wayne/kleinberg-tardos

More information

Runtime Analysis of Evolutionary Algorithms: Basic Introduction 1

Runtime Analysis of Evolutionary Algorithms: Basic Introduction 1 Runtime Analysis of Evolutionary Algorithms: Basic Introduction 1 Per Kristian Lehre University of Nottingham Nottingham NG8 1BB, UK PerKristian.Lehre@nottingham.ac.uk Pietro S. Oliveto University of Sheffield

More information

With high probability

With high probability With high probability So far we have been mainly concerned with expected behaviour: expected running times, expected competitive ratio s. But it would often be much more interesting if we would be able

More information

U N I V E R S I T Y OF D O R T M U N D

U N I V E R S I T Y OF D O R T M U N D U N I V E R S I T Y OF D O R T M U N D REIHE COMPUTATIONAL INTELLIGENCE COLLABORATIVE RESEARCH CENTER 531 Design and Management of Complex Technical Processes and Systems by means of Computational Intelligence

More information

Computational Complexity and Intractability: An Introduction to the Theory of NP. Chapter 9

Computational Complexity and Intractability: An Introduction to the Theory of NP. Chapter 9 1 Computational Complexity and Intractability: An Introduction to the Theory of NP Chapter 9 2 Objectives Classify problems as tractable or intractable Define decision problems Define the class P Define

More information

1 Primals and Duals: Zero Sum Games

1 Primals and Duals: Zero Sum Games CS 124 Section #11 Zero Sum Games; NP Completeness 4/15/17 1 Primals and Duals: Zero Sum Games We can represent various situations of conflict in life in terms of matrix games. For example, the game shown

More information

Scheduling on Unrelated Parallel Machines. Approximation Algorithms, V. V. Vazirani Book Chapter 17

Scheduling on Unrelated Parallel Machines. Approximation Algorithms, V. V. Vazirani Book Chapter 17 Scheduling on Unrelated Parallel Machines Approximation Algorithms, V. V. Vazirani Book Chapter 17 Nicolas Karakatsanis, 2008 Description of the problem Problem 17.1 (Scheduling on unrelated parallel machines)

More information

R ij = 2. Using all of these facts together, you can solve problem number 9.

R ij = 2. Using all of these facts together, you can solve problem number 9. Help for Homework Problem #9 Let G(V,E) be any undirected graph We want to calculate the travel time across the graph. Think of each edge as one resistor of 1 Ohm. Say we have two nodes: i and j Let the

More information

Chapter 11. Approximation Algorithms. Slides by Kevin Wayne Pearson-Addison Wesley. All rights reserved.

Chapter 11. Approximation Algorithms. Slides by Kevin Wayne Pearson-Addison Wesley. All rights reserved. Chapter 11 Approximation Algorithms Slides by Kevin Wayne. Copyright @ 2005 Pearson-Addison Wesley. All rights reserved. 1 P and NP P: The family of problems that can be solved quickly in polynomial time.

More information

Randomized Algorithms

Randomized Algorithms Randomized Algorithms Prof. Tapio Elomaa tapio.elomaa@tut.fi Course Basics A new 4 credit unit course Part of Theoretical Computer Science courses at the Department of Mathematics There will be 4 hours

More information

This means that we can assume each list ) is

This means that we can assume each list ) is This means that we can assume each list ) is of the form ),, ( )with < and Since the sizes of the items are integers, there are at most +1pairs in each list Furthermore, if we let = be the maximum possible

More information

Random Walks and Quantum Walks

Random Walks and Quantum Walks Random Walks and Quantum Walks Stephen Bartlett, Department of Physics and Centre for Advanced Computing Algorithms and Cryptography, Macquarie University Random Walks and Quantum Walks Classical random

More information

arxiv: v1 [cs.dm] 29 Aug 2013

arxiv: v1 [cs.dm] 29 Aug 2013 Collecting Coupons with Random Initial Stake Benjamin Doerr 1,2, Carola Doerr 2,3 1 École Polytechnique, Palaiseau, France 2 Max Planck Institute for Informatics, Saarbrücken, Germany 3 Université Paris

More information

REIHE COMPUTATIONAL INTELLIGENCE S O N D E R F O R S C H U N G S B E R E I C H 5 3 1

REIHE COMPUTATIONAL INTELLIGENCE S O N D E R F O R S C H U N G S B E R E I C H 5 3 1 U N I V E R S I T Ä T D O R T M U N D REIHE COMPUTATIONAL INTELLIGENCE S O N D E R F O R S C H U N G S B E R E I C H 5 3 1 Design und Management komplexer technischer Prozesse und Systeme mit Methoden

More information

CS6901: review of Theory of Computation and Algorithms

CS6901: review of Theory of Computation and Algorithms CS6901: review of Theory of Computation and Algorithms Any mechanically (automatically) discretely computation of problem solving contains at least three components: - problem description - computational

More information

CS 583: Algorithms. NP Completeness Ch 34. Intractability

CS 583: Algorithms. NP Completeness Ch 34. Intractability CS 583: Algorithms NP Completeness Ch 34 Intractability Some problems are intractable: as they grow large, we are unable to solve them in reasonable time What constitutes reasonable time? Standard working

More information

P and NP. Warm Up: Super Hard Problems. Overview. Problem Classification. Tools for classifying problems according to relative hardness.

P and NP. Warm Up: Super Hard Problems. Overview. Problem Classification. Tools for classifying problems according to relative hardness. Overview Problem classification Tractable Intractable P and NP Reductions Tools for classifying problems according to relative hardness Inge Li Gørtz Thank you to Kevin Wayne, Philip Bille and Paul Fischer

More information

Geometric Semantic Genetic Programming (GSGP): theory-laden design of variation operators

Geometric Semantic Genetic Programming (GSGP): theory-laden design of variation operators Geometric Semantic Genetic Programming (GSGP): theory-laden design of variation operators Andrea Mambrini University of Birmingham, UK NICaiA Exchange Programme LaMDA group, Nanjing University, China 7th

More information

By allowing randomization in the verification process, we obtain a class known as MA.

By allowing randomization in the verification process, we obtain a class known as MA. Lecture 2 Tel Aviv University, Spring 2006 Quantum Computation Witness-preserving Amplification of QMA Lecturer: Oded Regev Scribe: N. Aharon In the previous class, we have defined the class QMA, which

More information

Lecture 24 Nov. 20, 2014

Lecture 24 Nov. 20, 2014 CS 224: Advanced Algorithms Fall 2014 Prof. Jelani Nelson Lecture 24 Nov. 20, 2014 Scribe: Xiaoyu He Overview Today we will move to a new topic: another way to deal with NP-hard problems. We have already

More information

CS3719 Theory of Computation and Algorithms

CS3719 Theory of Computation and Algorithms CS3719 Theory of Computation and Algorithms Any mechanically (automatically) discretely computation of problem solving contains at least three components: - problem description - computational tool - analysis

More information

Black-Box Search by Unbiased Variation

Black-Box Search by Unbiased Variation Electronic Colloquium on Computational Complexity, Revision 1 of Report No. 102 (2010) Black-Box Search by Unbiased Variation Per Kristian Lehre and Carsten Witt Technical University of Denmark Kgs. Lyngby,

More information

Lecture notes on OPP algorithms [Preliminary Draft]

Lecture notes on OPP algorithms [Preliminary Draft] Lecture notes on OPP algorithms [Preliminary Draft] Jesper Nederlof June 13, 2016 These lecture notes were quickly assembled and probably contain many errors. Use at your own risk! Moreover, especially

More information

Evolutionary Computation

Evolutionary Computation Evolutionary Computation Lecture Algorithm Configura4on and Theore4cal Analysis Outline Algorithm Configuration Theoretical Analysis 2 Algorithm Configuration Question: If an EA toolbox is available (which

More information

Lecture 14 (last, but not least)

Lecture 14 (last, but not least) Lecture 14 (last, but not least) Bounded A TM, revisited. Fixed parameter algorithms. Randomized (coin flipping) algorithms. Slides modified by Benny Chor, based on original slides by Maurice Herlihy,

More information

Computational Complexity

Computational Complexity Computational Complexity Problems, instances and algorithms Running time vs. computational complexity General description of the theory of NP-completeness Problem samples 1 Computational Complexity What

More information

Local Search & Optimization

Local Search & Optimization Local Search & Optimization CE417: Introduction to Artificial Intelligence Sharif University of Technology Spring 2017 Soleymani Artificial Intelligence: A Modern Approach, 3 rd Edition, Chapter 4 Outline

More information

Probability & Computing

Probability & Computing Probability & Computing Stochastic Process time t {X t t 2 T } state space Ω X t 2 state x 2 discrete time: T is countable T = {0,, 2,...} discrete space: Ω is finite or countably infinite X 0,X,X 2,...

More information

6 Markov Chain Monte Carlo (MCMC)

6 Markov Chain Monte Carlo (MCMC) 6 Markov Chain Monte Carlo (MCMC) The underlying idea in MCMC is to replace the iid samples of basic MC methods, with dependent samples from an ergodic Markov chain, whose limiting (stationary) distribution

More information

Design and Analysis of Algorithms

Design and Analysis of Algorithms CSE 101, Winter 2018 Design and Analysis of Algorithms Lecture 5: Divide and Conquer (Part 2) Class URL: http://vlsicad.ucsd.edu/courses/cse101-w18/ A Lower Bound on Convex Hull Lecture 4 Task: sort the

More information

Data Structures in Java

Data Structures in Java Data Structures in Java Lecture 21: Introduction to NP-Completeness 12/9/2015 Daniel Bauer Algorithms and Problem Solving Purpose of algorithms: find solutions to problems. Data Structures provide ways

More information

PROBLEM SOLVING AND SEARCH IN ARTIFICIAL INTELLIGENCE

PROBLEM SOLVING AND SEARCH IN ARTIFICIAL INTELLIGENCE Artificial Intelligence, Computational Logic PROBLEM SOLVING AND SEARCH IN ARTIFICIAL INTELLIGENCE Lecture 4 Metaheuristic Algorithms Sarah Gaggl Dresden, 5th May 2017 Agenda 1 Introduction 2 Constraint

More information