Approximate Pareto Curves for the Asymmetric Traveling Salesman Problem

Bodo Manthey
Universität des Saarlandes, Informatik
Postfach 151150, 66041 Saarbrücken, Germany
manthey@cs.uni-sb.de

arXiv:0711.2157v1 [cs.DS] 14 Nov 2007

We present a randomized approximation algorithm for the asymmetric traveling salesman problem (ATSP) with several objective functions. Our algorithm achieves an approximation ratio of log n + ε for multi-criteria ATSP with triangle inequality and an approximation ratio of 1/(1-γ) + ε for multi-criteria ATSP with γ-triangle inequality.

1 Multi-Criteria ATSP

1.1 Asymmetric Traveling Salesman Problem

The asymmetric traveling salesman problem (ATSP) is one of the most fundamental problems in combinatorial optimization. An instance of ATSP is a directed complete graph G = (V, E) with edge weights w : E → ℕ that satisfy the triangle inequality, i.e., w(a,c) ≤ w(a,b) + w(b,c) for all distinct vertices a, b, c ∈ V. The goal is to find a Hamiltonian cycle of minimum weight. The weight of a Hamiltonian cycle (or, more generally, of a subset of E) is the sum of the weights of its edges.

A special case of ATSP is γ-ATSP for γ ∈ [1/2, 1], which is ATSP restricted to instances that satisfy the γ-triangle inequality: w(a,c) ≤ γ · (w(a,b) + w(b,c)) for all distinct a, b, c ∈ V. For γ = 1, we obtain the usual triangle inequality, and ATSP and 1-ATSP are the same problem.

Since ATSP and γ-ATSP are NP-hard, we are in need of approximation algorithms. The currently best approximation algorithm for ATSP achieves an approximation ratio of (2/3) · log n, where n is the number of vertices of the instance [5]. (In this paper, log denotes the logarithm to base 2.) For a single objective function, γ-ATSP can be approximated within a constant factor for all γ ∈ [1/2, 1), namely with an approximation ratio of min{γ/(1-γ), (1+γ)/(2-γ-γ³)} [2, 3].

Cycle covers are an important tool for designing approximation algorithms for the traveling salesman problem. A cycle cover of a graph is a set of vertex-disjoint cycles such that every vertex is part of exactly one cycle. Hamiltonian cycles are special cases of cycle covers that consist of just one cycle. In contrast to Hamiltonian cycles, cycle covers of minimum weight can be computed efficiently using matching algorithms [1].
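To make the last point concrete, here is a minimal sketch (ours, not from the paper) of computing a minimum-weight cycle cover of a complete directed graph by solving the assignment problem; the function name min_weight_cycle_cover and the matrix interface are our own choices, and SciPy's linear_sum_assignment stands in for the matching algorithms of [1].

```python
# Illustrative sketch (not from the paper): a minimum-weight cycle cover of a
# complete directed graph is an assignment problem.  Function name and
# interface are our own; requires numpy and scipy.
import numpy as np
from scipy.optimize import linear_sum_assignment

def min_weight_cycle_cover(w):
    """w[u][v] = weight of edge (u, v); returns the cycles as lists of vertices."""
    cost = np.asarray(w, dtype=float)
    np.fill_diagonal(cost, cost.sum() + 1.0)   # forbid self-loops via a large penalty
    rows, cols = linear_sum_assignment(cost)   # min-weight perfect matching = successor map
    succ = dict(zip(rows.tolist(), cols.tolist()))
    cycles, seen = [], set()
    for v in range(len(cost)):
        if v in seen:
            continue
        cycle, u = [], v
        while u not in seen:                   # walk the successor map until the cycle closes
            seen.add(u)
            cycle.append(u)
            u = succ[u]
        cycles.append(cycle)
    return cycles
```

For vector-valued weights, Algorithm 1 below instead relies on the randomized FPTAS for multi-criteria cycle covers discussed in Section 1.2.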

1.2 Multi-Criteria Optimization

In many optimization problems, there is not only a single objective function, but there are several such functions. Consider, for instance, buying a car: We (probably) want to buy a cheap car that is fast and has a good gas mileage. How do we decide which car suits us best? With respect to any single criterion, making the decision is easy. But with multiple criteria involved, there is no natural notion of a best choice. The aim of multi-criteria optimization is to cope with this problem.

To transfer the concept of a best choice to multi-criteria optimization, the notion of Pareto curves was introduced (cf. Ehrgott [4]). A Pareto curve is a set of solutions that can be considered optimal. To make this more precise, consider a k-criteria optimization problem with instances I, solutions sol(X) for every instance X ∈ I, and k weight functions w_1, ..., w_k that map X ∈ I and Y ∈ sol(X) to ℕ. Throughout this paper, our aim is to minimize the objective functions. A solution Y ∈ sol(X) dominates a solution Z ∈ sol(X) if w_i(Y, X) ≤ w_i(Z, X) for all i and w_i(Y, X) < w_i(Z, X) for at least one i. This means that Y is strictly preferable to Z. A Pareto curve of solutions contains all solutions that are not dominated by any other solution.

Unfortunately, Pareto curves cannot be computed efficiently in many cases: First, they are often of exponential size. Second, because of straightforward reductions from knapsack problems, they are NP-hard to compute. Thus, we have to be content with approximations to them.

For simpler notation, let w(Y, X) = (w_1(Y, X), ..., w_k(Y, X)). We omit the instance X if it is clear from the context. Inequalities are meant component-wise. A set P ⊆ sol(X) of solutions is called an α-approximate Pareto curve for X ∈ I if the following holds: For every solution Z ∈ sol(X), there exists a Y ∈ P with w(Y) ≤ α · w(Z). We have α ≥ 1, and a 1-approximate Pareto curve is a Pareto curve. (This is not precisely true if there are several solutions whose objective values agree. This, however, is inconsequential in our case, and we will not elaborate here for the sake of simplicity.)

Papadimitriou and Yannakakis [9] showed that (1+ε)-approximate Pareto curves of size polynomial in the instance size and 1/ε exist. The technical requirement for their result is that the objective values of solutions in sol(X) are bounded from above by 2^p(N) for some polynomial p, where N is the size of X. This is fulfilled in most natural optimization problems and in particular in our case. We will briefly sketch the proof of their result since we will exploit it in our algorithm.

Let X be any instance, let ε > 0, and consider any solution Y ∈ sol(X). For every i ∈ {1, ..., k}, there is a unique l_i ∈ ℕ such that w_i(Y) ∈ [(1+ε)^{l_i}, (1+ε)^{l_i + 1}). We call the vector l = (l_1, ..., l_k) the ε-signature of Y and of w(Y). Since w(Y) ≤ 2^p(N), we have l_i ≤ p(N) · log_{1+ε} 2, which is bounded by a polynomial q in the input size N and 1/ε. There are at most q^k different ε-signatures, which is polynomial for fixed k. For every l, we put one solution with that ε-signature into P if one exists. This yields a (1+ε)-approximate Pareto curve: For every Z ∈ sol(X), either Z ∈ P or there is another solution Y ∈ P with the same ε-signature. Thus, w(Y) ≤ (1+ε) · w(Z).
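The signature construction just sketched is easy to state in code. The following sketch (ours, not the paper's) assumes the solution set is given explicitly and that all weight entries are at least 1; this is not the algorithmically interesting setting, but it shows how the ε-signatures act as buckets.

```python
# Illustrative sketch (ours): a (1+eps)-approximate Pareto curve obtained by
# keeping one solution per eps-signature.  Assumes an explicitly enumerated
# solution list and weight entries >= 1; both are simplifying assumptions.
import math

def eps_signature(weights, eps):
    # l_i is the unique integer with (1+eps)^l_i <= w_i < (1+eps)^(l_i+1)
    return tuple(int(math.log(w_i, 1 + eps)) for w_i in weights)

def approx_pareto_curve(solutions, eps):
    """solutions: iterable of (solution, weight_vector) pairs."""
    buckets = {}
    for sol, w in solutions:
        buckets.setdefault(eps_signature(w, eps), (sol, w))  # keep one per signature
    return list(buckets.values())
```

Every discarded solution has a kept representative whose weight vector is larger by at most a factor of 1+ε in every coordinate, which is exactly the approximate-dominance guarantee used above.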
A fully polynomial time approximation scheme (FPTAS) for a multi-criteria optimization problem computes (1+ε)-approximate Pareto curves in time polynomial in the size of the instance and 1/ε. Papadimitriou and Yannakakis [9], based on a result of Mulmuley et al. [8], showed that the multi-criteria minimum-weight matching problem admits a randomized FPTAS, i.e., the algorithm succeeds in computing a (1+ε)-approximate Pareto curve with constant probability. The randomized FPTAS for matching can be extended to a randomized FPTAS for the multi-criteria cycle cover problem [7].

Manthey and Ram [7] designed a randomized approximation algorithm for multi-criteria γ-ATSP, which achieves an approximation ratio of 1/2 + γ³/(1-3γ²) for γ < 1/√3 ≈ 0.58. For larger values of γ, and in particular for ATSP, the approximation ratio achieved by their algorithm can be arbitrarily bad.

1.3 New Results

We devise the first approximation algorithm for multi-criteria ATSP and γ-ATSP. Our algorithm achieves an approximation ratio of log n + ε for multi-criteria ATSP and a ratio of 1/(1-γ) + ε for multi-criteria γ-ATSP for all γ ∈ [1/2, 1). Here, n is the number of vertices, and ε can be made arbitrarily small. Our algorithm is randomized, and its running-time is polynomial in the input size and 1/ε. To achieve a polynomial running-time, we devise a technique for sparsifying intermediate solutions, since we would otherwise obtain approximate Pareto curves of super-polynomial size.

2 Approximation Algorithm

Algorithm 1 is an adaptation to multi-criteria ATSP of the algorithm of Frieze et al. [6]. Therefore, we briefly describe their algorithm: We compute a cycle cover of minimum weight. If this cycle cover is already a Hamiltonian cycle, then we are done. Otherwise, we choose arbitrarily one vertex of every cycle. Then we proceed recursively on the subset of vertices thus chosen to obtain a cycle that contains all these vertices. The cycle cover plus this cycle form an Eulerian graph. We traverse the Eulerian cycle and take shortcuts whenever we would visit a vertex more than once. The approximation ratio of log n follows from two observations: First, the weight of every cycle cover computed is bounded from above by the weight of the optimal Hamiltonian cycle. Second, we need at most log n recursive calls, since in every recursive call we have at most half as many vertices as in the previous call.

Let us also explain how the algorithm achieves an approximation ratio of 1/(1-γ) for γ-ATSP with γ < 1 in the case of a single objective function w [2]. Let H be a minimum-weight Hamiltonian cycle. We compute a cycle cover C_0, select a vertex of every cycle of C_0 to obtain V_1, compute a cycle cover C_1 on V_1, and so on until, for some l, the set H_l = C_l is a Hamiltonian cycle on V_l. For j = l-1, l-2, ..., 0, we combine C_j and H_{j+1}, traverse the Eulerian cycle thus obtained, and take shortcuts to obtain a Hamiltonian cycle H_j on V_j, until we have a Hamiltonian cycle H_0 on V_0 = V. We do this such that all edges of H_{j+1} are removed by these shortcuts. Let us quickly estimate the weight w(H_0) of our Hamiltonian cycle. We have w(C_j) ≤ w(H) for all j ∈ {0, ..., l}. By the γ-triangle inequality, we have w(H_j) ≤ w(C_j) + γ · w(H_{j+1}), since all edges of H_{j+1} have been removed. Thus,

  w(H_0) ≤ w(C_0) + γ · (w(C_1) + γ · (w(C_2) + γ · (... (w(C_{l-1}) + γ · w(C_l)) ...)))
         ≤ w(H) + γ · (w(H) + γ · (w(H) + γ · (... (w(H) + γ · w(H)) ...)))
         ≤ w(H) · (1 + γ + γ² + ...) = w(H) / (1-γ).
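As a point of reference for Algorithm 1, here is a compact single-objective sketch (ours, not the paper's) of the Frieze et al. scheme described above. It reuses the hypothetical min_weight_cycle_cover from Section 1.1, and the shortcutting is implicit: every vertex appears exactly once because each cycle is spliced into the tour starting at its representative.

```python
# Illustrative single-objective sketch (ours) of the Frieze et al. scheme that
# Algorithm 1 generalizes.  Reuses the hypothetical min_weight_cycle_cover().
def frieze_atsp_tour(w):
    """w: full n x n weight matrix; returns a Hamiltonian cycle as a vertex list."""
    def recurse(vertices):
        if len(vertices) <= 1:
            return list(vertices)
        sub = [[w[u][v] for v in vertices] for u in vertices]
        cycles = [[vertices[i] for i in cyc] for cyc in min_weight_cycle_cover(sub)]
        if len(cycles) == 1:                 # the cycle cover is already a Hamiltonian cycle
            return cycles[0]
        reps = [cyc[0] for cyc in cycles]    # one arbitrary vertex per cycle
        by_rep = {cyc[0]: cyc for cyc in cycles}
        tour = []
        for r in recurse(reps):              # tour on the representatives (at most half as many)
            tour.extend(by_rep[r])           # splice in r's cycle, starting at r
        return tour
    return recurse(list(range(len(w))))
```

Each level at least halves the number of vertices, so there are at most log n levels, and each level's cycle cover weighs at most the optimal tour; this is precisely the log n argument above.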

Algorithm 1: Approximation algorithm for multi-criteria ATSP and γ-ATSP.

Input: complete graph G = (V, E) with n vertices, edge weights w : E → ℕ^k satisfying the γ-triangle inequality, ε > 0, k ∈ ℕ
Output: approximate Pareto curve P_TSP for multi-criteria γ-ATSP

 1: ε' ← ε²/log³ n; F ← ∅; j ← 1
 2: compute a (1+ε')-approximate Pareto curve Q of cycle covers on V with a success probability of at least 1 - 1/(2p log n) (p is defined in Lemma 2)
 3: P_0 ← {(C, w(C), V, ⊥) | C ∈ Q}
 4: while P_{j-1} ≠ ∅ do
 5:   P_j ← ∅
 6:   for all π = (C, w, V', π') ∈ P_{j-1} do
 7:     if (V', C) is connected then
 8:       F ← F ∪ {(C, w, V', π')}
 9:     else
10:       select one vertex of every component of (V', C) to obtain Ṽ
11:       compute a (1+ε')-approximate Pareto curve Q of cycle covers on Ṽ with a success probability of at least 1 - 1/(2p log n)
12:       w̃ ← w + γ^j · w(C̃) for every C̃ ∈ Q
13:       P_j ← P_j ∪ {(C̃, w̃, Ṽ, π) | C̃ ∈ Q}
14:   while there are π, π' ∈ P_j with the same ε'-signature do
15:     remove one of them arbitrarily
16:   j ← j + 1
17: P_TSP ← ∅
18: for all (C, w, V', π') ∈ F do
19:   H ← C
20:   while π' ≠ ⊥, say π' = (C', w', V'', π''), do
21:     construct a Hamiltonian cycle H' on V'' from H ∪ C' by taking shortcuts such that all edges of H are removed
22:     π' ← π''
23:     H ← H'
24:   P_TSP ← P_TSP ∪ {H}

We generalize these ideas to the multi-criteria setting and obtain Algorithm 1: Instead of a single cycle cover, we compute an approximate Pareto curve of cycle covers. Then we iterate by computing approximate Pareto curves of cycle covers on vertex sets Ṽ for every cycle cover C in the previous set. The set Ṽ contains exactly one vertex of every cycle of C. Unfortunately, it can happen that we construct a super-polynomial number of solutions in this way. To cope with this, we remove some intermediate solutions if there are other intermediate solutions whose weight is close by. We call this process, which is done in lines 14 to 15 of Algorithm 1, sparsification. (We define the ε'-signature of π = (C, w, V', π') to be the ε'-signature of w.)
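The sparsification in lines 14 and 15 can be stated in isolation. The following sketch (ours) reuses the hypothetical eps_signature helper from Section 1.2 and represents configurations as 4-tuples (C, w, V', π') as in Algorithm 1.

```python
# Sketch (ours) of the sparsification in lines 14-15 of Algorithm 1: keep at
# most one configuration per eps'-signature of the accumulated weight vector,
# so |P_j| never exceeds the polynomial number of signatures.
def sparsify(configurations, eps_prime):
    """configurations: list of tuples (C, w, V, predecessor) with weight vector w."""
    kept = {}
    for cfg in configurations:
        kept.setdefault(eps_signature(cfg[1], eps_prime), cfg)  # first per signature wins
    return list(kept.values())
```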

In lines 2 and 3 of Algorithm 1, an initial (1+ε')-approximate Pareto curve of cycle covers is computed. In the loop in lines 4 to 16, the algorithm iteratively computes Pareto curves of cycle covers. The set P_j contains configurations π = (C, w, V', π'), where C is a cycle cover on V', π' is the predecessor configuration, and w is the weight of C plus the weight of its predecessor cycle covers, each weighted with an appropriate power of γ as in the analysis of the algorithm for a single objective function. If, in the course of this computation, we obtain Hamiltonian cycles, these are put into F (line 8). In lines 14 and 15, the sparsification takes place. Finally, in lines 17 to 24, Hamiltonian cycles are constructed from the cycle covers computed.

Let us now come to the analysis of the algorithm. The following theorem is the main result of this paper. It follows from Lemmas 2, 3, and 6 below.

Theorem 1. For every ε > 0, Algorithm 1 is a randomized (log n + ε)-approximation for multi-criteria ATSP and a randomized (1/(1-γ) + ε)-approximation for multi-criteria γ-ATSP. Its running-time is polynomial in the input size and 1/ε.

We observe that for every j and every π = (C, w, V', π') ∈ P_j, we have |V'| ≤ n/2^j. For j = 0, this holds by construction (line 3). For j > 0, the configuration π was created in line 13 from a predecessor π' = (C', w', V'', π'') ∈ P_{j-1} with |V''| ≤ n/2^{j-1} by the induction hypothesis. Since every cycle involves at least two vertices, we have |V'| ≤ |V''|/2 ≤ n/2^j. This also yields that P_j is empty for j ≥ log n: Assume that such a P_j contained a configuration (C̃, w̃, Ṽ, π) with π = (C, w, V', π'). Then |Ṽ| ≤ n/2^{log n} < 2, thus |Ṽ| = 1. But then (V', C) had been connected, so we would have entered line 8 rather than lines 10 to 13, a contradiction.

Let us now analyze the running-time. After that, we examine the approximation performance and finally the success probability.

Lemma 2. The running-time of Algorithm 1 is polynomial in the input size and 1/ε.

Proof. Let N be the size of the instance at hand, and let p = p(N, 1/ε') be a two-variable polynomial that bounds the number of different ε'-signatures of solutions to instances of size at most N. We abbreviate "polynomial in the input size and 1/ε'" simply by "polynomial". This is equivalent to polynomial in the input size and 1/ε by the choice of ε'. The approximate Pareto curves can be computed in polynomial time with a success probability of at least 1 - 1/(2p log n) by executing the randomized FPTAS log(2p log n) times. Thus, all operations can be implemented to run in polynomial time, provided that the cardinalities of all finalized sets P_j are bounded from above by the polynomial p for all j. Then, for every j, at most p many (1+ε')-approximate Pareto curves are constructed in the for loop starting at line 6, each one in polynomial time. For every ε'-signature and every j, the set P_j contains at most one configuration with that specific ε'-signature, due to the sparsification. The lemma follows since the number p of ε'-signatures is bounded by a polynomial.

Let us now analyze the approximation ratio. First, we will assume that all randomized computations of (1+ε')-approximate Pareto curves of cycle covers are successful. After that, we analyze the probability that one of them fails.

Lemma 3. Assume that in all executions of lines 2 and 11 of Algorithm 1, a (1+ε')-approximate Pareto curve of cycle covers is successfully computed. Then Algorithm 1 achieves an approximation ratio of Σ_{j=0}^{log n - 1} γ^j + ε for multi-criteria γ-ATSP for γ ∈ [1/2, 1]. If γ < 1, then Σ_{j=0}^{log n - 1} γ^j ≤ Σ_{j=0}^{∞} γ^j = 1/(1-γ). For γ = 1, we have Σ_{j=0}^{log n - 1} γ^j = log n. Thus, we obtain the approximation ratios claimed in Theorem 1.

Proof. Let r = Σ_{j=0}^{log n - 1} γ^j, and let H̃ be any Hamiltonian cycle on V. We have to show that the set P_TSP of solutions computed by Algorithm 1 contains a Hamiltonian cycle H with w(H) ≤ (r + ε) · w(H̃).

Claim 4. Let π = (C, w, V', π') ∈ F. Then, in lines 17 to 24, a Hamiltonian cycle H is constructed from π with w(H) ≤ w.

Proof. Assume that π_l = π for some l and that π_j = (C_j, w^(j), V_j, π_{j-1}) for j ∈ {0, ..., l}, where π_{-1} = ⊥. (We have C = C_l, V' = V_l, w^(l) = w, and V_0 = V.) Let H_l = C_l be the Hamiltonian cycle on V_l. In lines 17 to 24, we successively construct Hamiltonian cycles H_j on V_j from C_j and H_{j+1}. We have w(H_j) ≤ w(C_j) + γ · w(H_{j+1}) by the γ-triangle inequality. Thus, w(H) = w(H_0) ≤ Σ_{j=0}^{l} γ^j · w(C_j) = w by construction of w.

What remains to be proved is that, for every Hamiltonian cycle H̃, there exists a π = (C, w, V', π') in F such that w ≤ (r + ε) · w(H̃).

Claim 5. For every l, there exists either a π = (C, w, V', π') ∈ P_l or a (C, w, V', π') ∈ F with w ≤ (1+ε')^{l+1} · Σ_{j=0}^{l} γ^j · w(H̃).

Proof. We prove the claim by induction on l. For l = 0, the claim boils down to the existence of a (C, w(C), V, ⊥) ∈ P_0 with w(C) ≤ (1+ε') · w(H̃). Such a C exists because, in line 2, a (1+ε')-approximate Pareto curve of cycle covers is computed.

Now assume that the claim holds for l-1. If F contains a configuration that satisfies the claim for l-1, then we are done, since (1+ε')^l · Σ_{j=0}^{l-1} γ^j ≤ (1+ε')^{l+1} · Σ_{j=0}^{l} γ^j. Otherwise, P_{l-1} contains a π = (C, w, V', π') with w ≤ (1+ε')^l · Σ_{j=0}^{l-1} γ^j · w(H̃). Let Ṽ be the set of vertices constructed from π in line 10, and let H̃' be H̃ restricted to Ṽ by taking shortcuts. By the triangle inequality, we have w(H̃') ≤ w(H̃). After line 11, Q contains a cycle cover C̃ with w(C̃) ≤ (1+ε') · w(H̃'). Let π̃ = (C̃, w̃, Ṽ, π) with w̃ = w + γ^l · w(C̃). Then

  w̃ ≤ ((1+ε')^l · Σ_{j=0}^{l-1} γ^j + γ^l · (1+ε')) · w(H̃).

What remains to be analyzed is the sparsification in lines 14 and 15. If π̃ ∈ P_l after the sparsification, we are done, since

  (1+ε')^l · Σ_{j=0}^{l-1} γ^j + (1+ε') · γ^l ≤ (1+ε')^{l+1} · Σ_{j=0}^{l} γ^j.

Otherwise, P_l contains a π̄ = (C̄, w̄, V̄, π̄') with the same ε'-signature as π̃. Thus,

  w̄ ≤ (1+ε') · ((1+ε')^l · Σ_{j=0}^{l-1} γ^j + γ^l · (1+ε')) · w(H̃) ≤ (1+ε')^{l+1} · Σ_{j=0}^{l} γ^j · w(H̃),

and π̄ fulfills the requirements of the claim.

Let r be defined as in the proof of Lemma 3. By Claims 4 and 5, we obtain an approximation ratio of (1+ε')^{log n} · r, since P_l is empty for l ≥ log n. What remains to be proved is that this is at most r + ε:

  (1+ε')^{log n} · r ≤ (1 + ε²/log³ n)^{log n} · r ≤ exp(ε²/log² n) · r ≤ (1 + ε/log n) · r ≤ r + ε.

The first inequality follows from our choice of ε'. The second inequality holds since (1 + x/y)^y ≤ exp(x). The third inequality holds because exp(x²) ≤ 1 + x for x ∈ [0, 0.7] (we assume ε/log n < 0.7 without loss of generality). The fourth inequality holds since r ≤ log n.

Lemma 6. The probability that, in a run of Algorithm 1, a (1+ε')-approximate Pareto curve of cycle covers is successfully computed in every execution of lines 2 and 11 is at least 1/2.

Proof. Lines 2 and 11 of Algorithm 1 are executed at most p · log n times in total. Each execution is successful with a probability of at least 1 - 1/(2p log n). Thus, the probability that all of them are successful is at least (1 - 1/(2p log n))^{p log n} ≈ exp(-1/2) > 1/2. To be precise, the probability is at least 1/2 for p · log n ≥ 2.

References

[1] Ravindra K. Ahuja, Thomas L. Magnanti, and James B. Orlin. Network Flows: Theory, Algorithms, and Applications. Prentice-Hall, 1993.

[2] Markus Bläser, Bodo Manthey, and Jiří Sgall. An improved approximation algorithm for the asymmetric TSP with strengthened triangle inequality. Journal of Discrete Algorithms, 4(4):623–632, 2006.

[3] L. Sunil Chandran and L. Shankar Ram. On the relationship between ATSP and the cycle cover problem. Theoretical Computer Science, 370(1–3):218–228, 2007.

[4] Matthias Ehrgott. Multicriteria Optimization. Springer, 2005.

[5] Uriel Feige and Mohit Singh. Improved approximation ratios for traveling salesperson tours and paths in directed graphs. In Moses Charikar, Klaus Jansen, Omer Reingold, and José D. P. Rolim, editors, Proc. of the 10th Int. Workshop on Approximation Algorithms for Combinatorial Optimization Problems (APPROX), volume 4627 of Lecture Notes in Computer Science, pages 104–118. Springer, 2007.

[6] Alan M. Frieze, Giulia Galbiati, and Francesco Maffioli. On the worst-case performance of some algorithms for the traveling salesman problem. Networks, 12(1):23–39, 1982.

[7] Bodo Manthey and L. Shankar Ram. Approximation algorithms for multi-criteria traveling salesman problems. Algorithmica, to appear.

[8] Ketan Mulmuley, Umesh V. Vazirani, and Vijay V. Vazirani. Matching is as easy as matrix inversion. Combinatorica, 7(1):105–113, 1987.

[9] Christos H. Papadimitriou and Mihalis Yannakakis. On the approximability of trade-offs and optimal access of web sources. In Proc. of the 41st Ann. IEEE Symp. on Foundations of Computer Science (FOCS), pages 86–92. IEEE Computer Society, 2000.