CSCI-B609: A Theorist's Toolkit, Fall 2016, Nov 8
Lecture 20: LP Relaxation and Approximation Algorithms


CSCI-B609: A Theorist's Toolkit, Fall 2016 (Nov 8)
Lecture 20: LP Relaxation and Approximation Algorithms
Lecturer: Yuan Zhou    Scribe: Syed Mahbub Hafiz

1 Introduction

When the variables of a 0-1 Integer Linear Program (ILP) are relaxed so that each variable lies in the interval [0, 1] instead of taking only the integer values {0, 1}, the result is an ordinary Linear Program (LP). For example, the constraint x_i ∈ {0, 1} becomes 0 ≤ x_i ≤ 1. This relaxation converts an NP-hard optimization problem (the ILP) into a related problem that is solvable in polynomial time. Moreover, the solution of the relaxed linear program can be used to obtain information about the optimal solution of the original problem.

Approximation algorithms are algorithms that find provably near-optimal solutions to optimization problems, and LP relaxation is an established technique for designing approximation algorithms for NP-hard optimization problems.

α-approximation: an algorithm is an α-approximation for a minimization problem if it always outputs a solution with objective value ALG ≤ α · OPT, where α ≥ 1 and OPT is the optimal value for the original problem. For a maximization problem (e.g. Max-Cut), the requirement is ALG ≥ OPT / α.

In Section 2 we use linear programming to approximate the Vertex Cover problem, obtaining a solution whose size is comparable to that of the minimum vertex cover. In Section 3 we show how to round an LP relaxation in order to approximate the Set Cover problem.

2 Vertex Cover problem

A vertex cover of a graph is a set of vertices such that each edge of the graph is incident to at least one vertex of the set. Finding a minimum vertex cover is a classical NP-hard optimization problem that admits an approximation algorithm; its decision version is called the Vertex Cover problem.
Formulation of vertex cover:

Input: an undirected graph G = (V, E).
Goal: find S ⊆ V such that
  1) for every edge {i, j} ∈ E, {i, j} ∩ S ≠ ∅;
  2) |S| is minimized.

By the definition of α-approximation above, a 2-approximation for Vertex Cover is guaranteed to produce a vertex cover that is at most 2 times the size of the smallest vertex cover of the given graph.

Formulation of the ILP version of vertex cover: let x_i = 1 if i ∈ S, and x_i = 0 if i ∉ S.

  minimize    Σ_{i ∈ V} x_i
  subject to  x_i ∈ {0, 1}   for all i ∈ V,
              x_i + x_j ≥ 1  for all {i, j} ∈ E.

This is still NP-hard because of the integrality constraint x_i ∈ {0, 1}. Relaxing that constraint to x_i ∈ [0, 1] gives the following.

Formulation of the LP relaxation of vertex cover:

  minimize    Σ_{i ∈ V} x_i
  subject to  0 ≤ x_i ≤ 1    for all i ∈ V,
              x_i + x_j ≥ 1  for all {i, j} ∈ E.

This is an LP and is solvable in polynomial time. Every feasible solution of the ILP remains feasible for the LP, since the relaxation only enlarges the feasible region.

Claim 1. OPT_LP ≤ OPT_ILP for a minimization problem such as Vertex Cover.

For the LP to serve as an estimator for the ILP, OPT_LP should not be arbitrarily smaller than OPT_ILP; Claim 1 only says it is a lower bound. Note that the relaxation introduces extra (fractional) solutions. For example, if the graph is a triangle (three mutually adjacent vertices), one optimal ILP solution is (1, 1, 0), so OPT_ILP = 2. The LP relaxation admits fractional solutions such as (0.7, 0.7, 0.3) and (0.5, 0.5, 0.5), with objective values 1.7 and 1.5 respectively. This illustrates that OPT_LP ≤ OPT_ILP.
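To make the relaxation concrete, here is a small sketch in plain Python (our own illustration, not part of the lecture): it brute-forces the ILP optimum for the triangle graph and checks that the fractional point (1/2, 1/2, 1/2) is LP-feasible with a smaller objective.

```python
from itertools import product

# Triangle graph: vertices 0, 1, 2, all pairs adjacent.
edges = [(0, 1), (0, 2), (1, 2)]

def is_cover(x, edges):
    """Check the covering constraints x_i + x_j >= 1 for every edge."""
    return all(x[i] + x[j] >= 1 for i, j in edges)

# ILP optimum by brute force over {0,1}^3: the best integral cover has size 2.
opt_ilp = min(sum(x) for x in product([0, 1], repeat=3) if is_cover(x, edges))

# The fractional point (1/2, 1/2, 1/2) is LP-feasible with objective 1.5,
# so OPT_LP <= 1.5 < 2 = OPT_ILP.
x_frac = (0.5, 0.5, 0.5)
assert is_cover(x_frac, edges)
print(opt_ilp, sum(x_frac))  # 2 1.5
```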

Rounding procedure: this step converts a fractional solution into an integral one. It may lose some objective value, and that loss corresponds to the approximation factor. For Vertex Cover the rounding is straightforward:

  x'_i = 1 if x_i ≥ 1/2, and x'_i = 0 if x_i < 1/2.

Applied to the toy example above, the fractional solutions (0.7, 0.7, 0.3) and (0.5, 0.5, 0.5) round to the integral solutions (1, 1, 0) and (1, 1, 1), with objective values 2 and 3 respectively.

Claim 2. Round ≤ 2 · OPT_ILP, where Round denotes the objective value of the rounded solution.

Proof. Σ_{i ∈ V} x'_i ≤ Σ_{i ∈ V} 2 x_i = 2 Σ_{i ∈ V} x_i = 2 · OPT_LP ≤ 2 · OPT_ILP.

This proof immediately gives the following claims.

Claim 3. Solving the LP relaxation and applying this rounding is a 2-approximation for Vertex Cover.

Claim 4. The LP relaxation with this rounding returns a feasible solution.

Proof. For every edge {i, j} we have x_i + x_j ≥ 1, so at least one endpoint satisfies x_i ≥ 1/2 and hence x'_i = 1. Therefore every edge is covered, which guarantees feasibility.

2.1 Integrality gaps

The last three claims give the following corollary.

Corollary 5. OPT_ILP ≤ Round ≤ 2 · OPT_LP, and therefore OPT_ILP ∈ [OPT_LP, 2 · OPT_LP].

We now introduce an important concept, the integrality gap, which captures the best approximation guarantee achievable when the LP relaxation is used as a lower bound. For a minimization problem it is the worst-case ratio

  IG = max over instances of (OPT_ILP / OPT_LP).

The analysis above shows that for Vertex Cover, IG ∈ [1, 2]: given an instance, we cannot read off the optimal vertex cover from the LP, but we do get an estimate within a factor of 2. So, as we said earlier, the LP is an estimator for the ILP.
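The threshold rounding above is a one-liner; the sketch below (plain Python, function names are ours) rounds a fractional vertex cover solution and checks both feasibility and the factor-2 cost bound against the fractional objective.

```python
def round_half(x):
    """Round a fractional vertex cover solution at threshold 1/2."""
    return [1 if xi >= 0.5 else 0 for xi in x]

edges = [(0, 1), (0, 2), (1, 2)]  # triangle graph from the text
for x_frac in [(0.7, 0.7, 0.3), (0.5, 0.5, 0.5)]:
    x_int = round_half(x_frac)
    # Feasibility: each edge has an endpoint with x_i >= 1/2, which rounds to 1.
    assert all(x_int[i] + x_int[j] >= 1 for i, j in edges)
    # Cost bound: every x_i rounded up to 1 had x_i >= 1/2, so cost <= 2 * LP value.
    assert sum(x_int) <= 2 * sum(x_frac)
    print(x_frac, "->", x_int)
```

The two fractional points print the integral solutions (1, 1, 0) and (1, 1, 1) given in the text.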

At this point we ask how this estimator can be improved. There are two possible directions:
1. use a different LP;
2. perform a better analysis.

Let us first consider the second option, a better analysis of the same LP. We have the following observation.

Observation 6. The analysis cannot be improved below a factor of 4/3. For example, for the triangle graph, OPT_LP = 1.5 and OPT_ILP = 2, so the integrality gap is already 4/3.

For a stronger integrality gap bound, consider the complete graph K_n on n vertices. Setting x_i = 1/2 for all i is a feasible LP solution, so OPT_LP ≤ n/2. Since the graph is complete, the smallest vertex cover has size n − 1, i.e. OPT_ILP = n − 1. Hence

  IG ≥ (n − 1) / (n/2) = 2 − 2/n = 2 − o(1),

so this is an instance where the LP is off by a factor approaching 2; asymptotically the gap becomes exactly 2 as n → ∞. We can say that K_n is a (2 − o(1))-integrality-gap instance for this LP and a certificate that the LP is a poor estimator.

These observations are consistent with the NP-hardness of the problem: Vertex Cover cannot be approximated arbitrarily well unless P = NP. A theorem of Dinur and Safra [2] shows that approximating the minimum vertex cover within a factor of 1.36 is as hard as computing the smallest vertex cover itself; hence 1.36 is a hardness-of-approximation threshold. Moreover, Khot and Regev showed that if the Unique Games Conjecture holds, Vertex Cover cannot be approximated within any constant factor better than 2 − ε. On the algorithmic side, the current best approximation ratio for Vertex Cover is 2 − Θ(1/√(log n)), shown by Karakostas [1].

3 Set Cover Problem

Given a set of elements {1, 2, ..., n} (called the universe U) and a collection of m sets whose union equals the universe, the set cover problem is to identify the smallest sub-collection of the sets whose union still equals the universe.

Formulation of set cover:

Input: universe U = [n] = {1, 2, ..., n}; collection S_1, S_2, ..., S_m ⊆ U.
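The K_n gap instance can be checked numerically; the short sketch below (plain Python, our own illustration) evaluates the ratio (n − 1)/(n/2) and shows it approaching 2.

```python
def kn_gap(n):
    """Integrality gap witnessed by the complete graph K_n."""
    opt_lp_upper = n / 2   # objective of the feasible all-(1/2) LP point
    opt_ilp = n - 1        # minimum vertex cover of K_n takes all but one vertex
    return opt_ilp / opt_lp_upper

for n in [3, 10, 100, 10**6]:
    print(n, kn_gap(n))    # ratio 2 - 2/n, tending to 2 as n grows
```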
Goal: find the smallest I ⊆ {1, 2, ..., m} such that ∪_{i ∈ I} S_i = U.

Formulation of the ILP version of set cover: let x_i = 1 if i ∈ I, and x_i = 0 if i ∉ I.

  minimize    Σ_{i ∈ [m]} x_i
  subject to  x_i ∈ {0, 1}              for all i ∈ [m],
              Σ_{i : u ∈ S_i} x_i ≥ 1   for all u ∈ U.

Formulation of the LP relaxation of set cover:

  minimize    Σ_{i ∈ [m]} x_i
  subject to  0 ≤ x_i ≤ 1               for all i ∈ [m],
              Σ_{i : u ∈ S_i} x_i ≥ 1   for all u ∈ U.

Rounding technique: the LP relaxation gives an optimal fractional solution with x_i ∈ [0, 1] for each set S_i. However, we cannot apply the same rounding as in the Vertex Cover problem, because an element can remain uncovered when it belongs to many sets, each with a small positive x_i. If we know that every element belongs to at most k sets, then we can use 1/k as the threshold for rounding the fractional solution to an integral one:

  x'_i = 1 if x_i ≥ 1/k, and x'_i = 0 if x_i < 1/k.

Note that this yields a feasible cover and is a k-approximation. In the worst case, however, k can be very large, e.g. n/2, so we look for a better approximation guarantee.

RandomPick technique: we interpret each x_i as a probability, i.e. we treat the feasible fractional solution x as a probability distribution over choices of subsets; for example, x_1 is the probability of selecting the set S_1. The algorithm works as follows.

Input: a fractional feasible solution x_1, ..., x_m of the LP relaxation.
  I = ∅
  for i = 1 to m:
    add i to I with probability x_i

  Return I.

The following claim is immediate: the expected cost of the picked sets is exactly the objective value of the LP relaxation, which is bounded by the ILP optimum.

Claim 7. E[|I|] = Σ_{i ∈ [m]} x_i = OPT_LP ≤ OPT_ILP.

Still, a fixed element may remain uncovered with substantial probability; the probability that an element is covered is not 1. We first investigate the probability that an element is covered.

Claim 8. Pr[u ∈ ∪_{i ∈ I} S_i] ≥ 1 − 1/e.

Proof. We bound the complementary probability that u is not covered:

  Pr[u ∉ ∪_{i ∈ I} S_i] = Π_{i : u ∈ S_i} Pr[i ∉ I] = Π_{i : u ∈ S_i} (1 − x_i) ≤ Π_{i : u ∈ S_i} e^{−x_i} = e^{−Σ_{i : u ∈ S_i} x_i} ≤ e^{−1},

using the fact that 1 − t ≤ e^{−t} for t ≥ 0 (from the Taylor series of e^{−t}) and the LP constraint Σ_{i : u ∈ S_i} x_i ≥ 1. Hence Pr[u ∈ ∪_{i ∈ I} S_i] ≥ 1 − 1/e.

Remark 1. The RandomPick technique returns a set I that covers any fixed element with probability at least 1 − 1/e.

RandomizedRound technique: to ensure that no element remains uncovered, we iterate the RandomPick technique 2 ln n times. The algorithm is as follows.

  Iterate 2 ln n times; at each iteration j, let I_j ← RandomPick.
  Return I = ∪_{j=1}^{2 ln n} I_j.

Remark 2. E[|I|] ≤ Σ_j E[|I_j|] = 2 ln n · Σ_{i ∈ [m]} x_i ≤ 2 ln n · OPT_ILP.
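A runnable sketch of RandomPick and RandomizedRound in plain Python (the toy instance and the fractional solution are hand-picked for illustration; no LP solver is assumed):

```python
import math
import random

def random_pick(x, rng):
    """Include set index i in I independently with probability x_i."""
    return {i for i, xi in enumerate(x) if rng.random() < xi}

def randomized_round(x, n, rng):
    """Union of ceil(2 ln n) independent RandomPick runs."""
    I = set()
    for _ in range(math.ceil(2 * math.log(n))):
        I |= random_pick(x, rng)
    return I

# Toy instance: universe {0,...,5}; each element lies in exactly two sets,
# so x_i = 1/2 for all i is LP-feasible (each covering constraint sums to 1).
universe = set(range(6))
sets = [{0, 1}, {1, 2}, {2, 3}, {3, 4}, {4, 5}, {5, 0}]
x = [0.5] * 6

rng = random.Random(0)  # fixed seed so the run is reproducible
I = randomized_round(x, len(universe), rng)
covered = set().union(*(sets[i] for i in I)) if I else set()
print(sorted(I), covered == universe)
```

With a fractional solution of all ones, RandomPick deterministically selects every set, which is a convenient sanity check on the sampling step.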

Claim 9. Pr[u ∈ ∪_{i ∈ I} S_i] ≥ 1 − 1/n².

Proof. The iterations are independent, so

  Pr[u ∉ ∪_{i ∈ I} S_i] = Π_{j=1}^{2 ln n} Pr[u ∉ ∪_{i ∈ I_j} S_i] ≤ (e^{−1})^{2 ln n} = 1/n².

Hence Pr[u ∈ ∪_{i ∈ I} S_i] ≥ 1 − 1/n².

Now we determine how likely it is that I covers the full universe U.

Corollary 10. Pr[I covers U] ≥ 1 − 1/n.

Proof. For any u ∈ U, we already saw that Pr[u ∉ ∪_{i ∈ I} S_i] ≤ 1/n². A union bound over the n elements gives

  Pr[∃ u ∈ U not covered] ≤ |U| · 1/n² = n · 1/n² = 1/n.

Finally, Pr[I covers U] ≥ 1 − 1/n.

The following theorem shows that the LP relaxation with the RandomizedRound technique gives an O(ln n)-approximation algorithm.

Theorem 11. RandomizedRound returns an I such that I is a set cover for U and |I| ≤ (4 ln n + 2) · OPT_ILP, with probability at least 1/2 − 1/n ≥ 0.4 (for large enough n).

Proof. First, by Markov's inequality and Remark 2,

  Pr[|I| > (4 ln n + 2) · OPT_ILP] ≤ E[|I|] / ((4 ln n + 2) · OPT_ILP) ≤ 2 ln n / (4 ln n + 2) ≤ 1/2,

so Pr[|I| ≤ (4 ln n + 2) · OPT_ILP] ≥ 1/2. By Corollary 10 we already know Pr[I covers U] ≥ 1 − 1/n. Hence, by a union bound over the two failure events,

  Pr[|I| ≤ (4 ln n + 2) · OPT_ILP and I covers U] ≥ 1 − 1/2 − 1/n = 1/2 − 1/n.
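The arithmetic behind Claim 9 and Theorem 11 can be sanity-checked numerically (plain Python, our own check):

```python
import math

for n in [10, 100, 10**4]:
    # Claim 9: with t = ceil(2 ln n) rounds, the per-element failure
    # probability is at most e^{-t} <= e^{-2 ln n} = 1/n^2.
    t = math.ceil(2 * math.log(n))
    assert math.exp(-t) <= 1 / n**2 + 1e-15

    # Theorem 11 (Markov step): 2 ln n / (4 ln n + 2) <= 1/2.
    assert 2 * math.log(n) / (4 * math.log(n) + 2) <= 0.5

    # Overall success probability 1/2 - 1/n is at least 0.4 once n >= 10.
    assert 0.5 - 1 / n >= 0.4
print("bounds hold for all tested n")
```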

References

[1] G. Karakostas. A better approximation ratio for the Vertex Cover problem. ECCC Report TR04-084, 2004.

[2] I. Dinur and S. Safra. The importance of being biased. In Proceedings of the 34th Annual ACM Symposium on Theory of Computing, pages 33-42, 2002.