LP-Duality ("Approximation Algorithms" by V. Vazirani, Chapter 12)
- Well-characterized problems, min-max relations, approximate certificates
- LP problems in standard form, primal and dual linear programs
- LP-duality theorem, complementary slackness conditions
- Min-max relations and LP-duality: max-flow min-cut example
- LP-based approximation algorithm design: LP-rounding and the primal-dual schema
- Integrality gap
- Example: Maximum Matching and Minimum Vertex Cover from the LP-duality point of view

Consider the decision version of an optimization problem:
- Does it have Yes certificates? (Is it in NP?)
- Does it have No certificates? (Is it in co-NP?)
Problems that have both Yes and No certificates are said to be well-characterized. Well-characterized problems lie in NP ∩ co-NP, which gives hope of finding polynomial-time algorithms for them. Min-max relations often help prove that a problem is well-characterized. [And many min-max relations are in fact special cases of LP-duality.]

Example (König-Egerváry theorem): in any bipartite graph,

  max_{matching M} |M| = min_{vertex cover U} |U|.

This gives us No certificates for the questions:
- If G is bipartite, is min_{vertex cover U} |U| ≤ k?
- If G is bipartite, is max_{matching M} |M| ≥ k?

In general, max_{matching M} |M| = min_{vertex cover U} |U| does not hold. [Example: the Petersen graph.] Moreover, Minimum Vertex Cover is NP-hard, and assuming NP ≠ co-NP, it does not have No certificates. However, it does have approximate No certificates, that is, certificates for sufficiently small values of k. Since

  max_{matching M} |M| ≤ min_{vertex cover U} |U| ≤ 2 max_{matching M} |M|,

any matching of size greater than k is a No certificate for the question "is min_{vertex cover U} |U| ≤ k?", and such a matching exists whenever k < min_{vertex cover U} |U| / 2. We say we have factor 2 No certificates for Minimum Vertex Cover. In this case, our approximate No certificates are polynomial-time computable. For the shortest lattice vector problem, we have factor n approximate No certificates, but we do not know how to compute them in polynomial time.
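These factor-2 No certificates are easy to compute: a greedily built maximal matching M lower-bounds every vertex cover by |M|, and its endpoints form a cover of size 2|M|. A minimal sketch in Python (the graph and the helper names are illustrative, not from the text):

```python
def maximal_matching(edges):
    """Greedily pick pairwise disjoint edges; the result is a maximal matching."""
    matched, matching = set(), []
    for u, v in edges:
        if u not in matched and v not in matched:
            matching.append((u, v))
            matched.update((u, v))
    return matching

def vc_no_certificate(edges, k):
    """A matching of size > k proves min vertex cover > k (a No certificate)."""
    m = maximal_matching(edges)
    return m if len(m) > k else None

# Toy check on the 5-cycle: min vertex cover is 3, max matching is 2.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
m = maximal_matching(edges)
cover = {u for e in m for u in e}            # endpoints of M cover every edge
assert all(u in cover or v in cover for u, v in edges)
assert len(cover) == 2 * len(m)              # the factor-2 guarantee
```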

Primal and dual linear programs, the LP-duality theorem

Minimization and maximization linear programs in standard form:

  minimize   Σ_{j=1}^n c_j x_j
  subject to Σ_{j=1}^n a_ij x_j ≥ b_i,  i = 1, …, m
             x_j ≥ 0,  j = 1, …, n

  maximize   Σ_{j=1}^n c_j x_j
  subject to Σ_{j=1}^n a_ij x_j ≤ b_i,  i = 1, …, m
             x_j ≥ 0,  j = 1, …, n

The function being optimized is called the objective function. Any solution that satisfies all the constraints is called a feasible solution. LP problems are well-characterized. Existence of Yes certificates is obvious. LP-duality provides us with No certificates.

Let's start with a minimization LP problem; we call it the primal program. To get a No certificate for the question "is the minimum of the objective function ≤ a?", we need to lower-bound the objective function. One way of doing that is via a carefully constructed linear combination of the constraints. That is, we introduce new variables y_i ≥ 0 and use them as the coefficients of the linear combination:

  y_1 Σ_{j=1}^n a_1j x_j + … + y_m Σ_{j=1}^n a_mj x_j ≥ Σ_{i=1}^m b_i y_i,

which holds because the coefficients are non-negative. What do we need to take care of here? That the objective function is no less than the left-hand side above (and hence no less than Σ_{i=1}^m b_i y_i). Since x_j ≥ 0, it suffices to require

  Σ_{i=1}^m a_ij y_i ≤ c_j,  j = 1, …, n.

And to get the best possible lower bound, we want Σ_{i=1}^m b_i y_i to be as large as possible. The result is an LP maximization problem, which we call the dual program. A straightforward argument shows that the dual of the dual program is the primal program.
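The construction is mechanical: the dual of (min c·x s.t. Ax ≥ b, x ≥ 0) is (max b·y s.t. Aᵀy ≤ c, y ≥ 0), so applying the transformation twice returns the primal. A small illustrative sketch (the (objective, matrix, rhs) data layout is my own choice, not from the text):

```python
def dual(c, A, b):
    """Dual of: minimize c.x  s.t. A x >= b, x >= 0
       is:      maximize b.y  s.t. A^T y <= c, y >= 0.
       The optimization direction flips each time; here we only
       transform the (objective, matrix, rhs) data, so dual(dual(...))
       recovers the primal data."""
    At = [list(col) for col in zip(*A)]   # transpose of A
    return b, At, c

# Primal: minimize x1 + 2*x2  s.t.  x1 + x2 >= 3,  x2 >= 1,  x >= 0.
c, A, b = [1, 2], [[1, 1], [0, 1]], [3, 1]
assert dual(*dual(c, A, b)) == (c, A, b)   # dual of the dual is the primal
```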

Obviously, any feasible solution to the dual program gives a lower bound on the primal objective function. Conversely, any feasible solution to the primal program gives an upper bound on the dual objective function. That is, for any two feasible solutions {x_j} and {y_i} we have

  Σ_{j=1}^n c_j x_j ≥ Σ_{i=1}^m b_i y_i.

This is the Weak LP-duality theorem, which in many cases suffices for constructing approximation algorithms. In fact, a stronger min-max claim holds.

LP-duality theorem: The primal program has a finite optimum iff the dual has a finite optimum. For any two optimal solutions {x_j} and {y_i}, the values of the primal and dual objective functions are equal.

Thus, the LP problem is well-characterized.
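Both claims are easy to check numerically on a toy LP chosen for illustration: primal min x1 + 2x2 s.t. x1 + x2 ≥ 3, x2 ≥ 1; its dual is max 3y1 + y2 s.t. y1 ≤ 1, y1 + y2 ≤ 2. A sketch:

```python
c, b = [1, 2], [3, 1]
A = [[1, 1], [0, 1]]            # rows are the primal constraints

def primal_cost(x): return sum(cj * xj for cj, xj in zip(c, x))
def dual_value(y):  return sum(bi * yi for bi, yi in zip(b, y))

def primal_feasible(x):
    return all(xj >= 0 for xj in x) and \
           all(sum(a * xj for a, xj in zip(row, x)) >= bi for row, bi in zip(A, b))

def dual_feasible(y):
    return all(yi >= 0 for yi in y) and \
           all(sum(A[i][j] * y[i] for i in range(len(y))) <= c[j] for j in range(len(c)))

x_feas, y_feas = [3, 1], [1, 0]      # feasible but not optimal
x_opt,  y_opt  = [2, 1], [1, 1]      # an optimal pair: both objectives equal 4
assert primal_feasible(x_feas) and dual_feasible(y_feas)
assert primal_cost(x_feas) >= dual_value(y_feas)      # weak duality: 5 >= 3
assert primal_cost(x_opt) == dual_value(y_opt) == 4   # equality at the optimum
```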

From the obvious chain of inequalities for two feasible solutions {x_j} and {y_i},

  Σ_{j=1}^n c_j x_j ≥ Σ_{j=1}^n (Σ_{i=1}^m a_ij y_i) x_j = Σ_{i=1}^m (Σ_{j=1}^n a_ij x_j) y_i ≥ Σ_{i=1}^m b_i y_i,

we see that the solutions are both optimal iff the two inequalities are, in fact, equalities. This gives us the complementary slackness conditions of optimality:

  for each 1 ≤ j ≤ n: either x_j = 0 or Σ_{i=1}^m a_ij y_i = c_j, and
  for each 1 ≤ i ≤ m: either y_i = 0 or Σ_{j=1}^n a_ij x_j = b_i.
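The conditions can be verified directly on a small example. Using an illustrative LP (min x1 + 2x2 s.t. x1 + x2 ≥ 3, x2 ≥ 1, whose dual is max 3y1 + y2 s.t. y1 ≤ 1, y1 + y2 ≤ 2; the data and helper name are my own), a hedged sketch:

```python
c, b = [1, 2], [3, 1]
A = [[1, 1], [0, 1]]

def complementary_slackness(x, y, tol=1e-9):
    """Check: x_j = 0 or sum_i a_ij y_i = c_j;  y_i = 0 or sum_j a_ij x_j = b_i."""
    primal_cs = all(
        abs(x[j]) < tol or abs(sum(A[i][j] * y[i] for i in range(len(y))) - c[j]) < tol
        for j in range(len(x)))
    dual_cs = all(
        abs(y[i]) < tol or abs(sum(A[i][j] * x[j] for j in range(len(x))) - b[i]) < tol
        for i in range(len(y)))
    return primal_cs and dual_cs

assert complementary_slackness([2, 1], [1, 1])       # the optimal pair passes
assert not complementary_slackness([3, 1], [1, 1])   # feasible but suboptimal x fails
```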

LP-duality and the max-flow min-cut theorem

We'll now consider an example where LP-duality yields an exact min-max relation. Here's our setting: G = (V, E) is a directed graph with two distinguished nodes, source s and sink t, and positive edge capacities c: E → R+. A flow f: E → R+ must satisfy the following conditions:
- for any edge e, f(e) ≤ c(e) [capacity constraints];
- for any node v other than s and t, the total flow into v equals the total flow out of v [flow conservation].
An s-t cut is defined by a partition of V into two sets X and X^c, such that s ∈ X and t ∈ X^c, and consists of all the edges going from X to X^c. The capacity of the cut, c(X, X^c), is the sum of the capacities of its edges.
Max-flow problem: find a flow with the maximum out-value (= total flow out minus total flow in) at s.
Min-cut problem: find an s-t cut of minimum capacity.

The capacity of any s-t cut is an upper bound on the value of any feasible flow. Using LP-duality, we can show that it is possible to find a flow and an s-t cut with equal values, which are then the optimal values. To formulate the max-flow problem as a linear program, we introduce a fictitious edge of infinite capacity from t to s, which lets us require flow conservation at s and t as well. Our primal program is then:

  maximize   f_ts
  subject to f_ij ≤ c_ij,  for every edge (i, j) ∈ E
             Σ_{j: (j,i) ∈ E} f_ji − Σ_{j: (i,j) ∈ E} f_ij ≤ 0,  for every node i ∈ V
             f_ij ≥ 0,  for every edge (i, j) ∈ E

[Since the constraints of the second type must hold at every node and their total sum is 0, they must in fact be satisfied with equality at every node; thus they are equivalent to the flow conservation conditions.]

We'll now construct the dual program for our max-flow LP. Because we have two types of constraints in the primal, we introduce two sets of variables for the dual: {d_ij} for (i, j) ∈ E and {p_i} for i ∈ V. We get:

  minimize   Σ_{(i,j) ∈ E} c_ij d_ij
  subject to d_ij − p_i + p_j ≥ 0,  for every edge (i, j) ∈ E, (i, j) ≠ (t, s)
             p_s − p_t ≥ 1
             d_ij ≥ 0,  for every edge (i, j) ∈ E
             p_i ≥ 0,  for every node i ∈ V

The LP-duality theorem tells us that the optimum of the primal program (the max flow) equals the optimum of the dual. Now, what if we require all the variables of the dual program to be 0/1 variables?

Claim A: The restriction to 0/1 values turns the dual program into the minimum s-t cut problem.
Claim B: The optimum of the dual program is attained at 0/1 values of the variables.

Taken together, these give us the max-flow min-cut theorem.
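Claim A can be seen concretely: given an s-t cut (X, X^c), set p_i = 1 for i ∈ X and 0 otherwise, and d_ij = 1 exactly on the edges crossing from X to X^c. This 0/1 assignment is dual-feasible, and its objective is the cut capacity. A small sketch on a made-up network (the capacities are illustrative):

```python
# Tiny network: capacities on directed edges; source 's', sink 't'.
cap = {('s', 'a'): 2, ('s', 'b'): 1, ('a', 't'): 1, ('b', 't'): 2, ('a', 'b'): 1}

def cut_as_dual(X):
    """0/1 dual solution induced by the cut (X, complement of X); returns its objective."""
    p = {v: 1 if v in X else 0 for v in {u for e in cap for u in e}}
    d = {(i, j): 1 if i in X and j not in X else 0 for (i, j) in cap}
    # Dual feasibility: d_ij - p_i + p_j >= 0 for every real edge, p_s - p_t >= 1.
    assert all(d[e] - p[e[0]] + p[e[1]] >= 0 for e in cap)
    assert p['s'] - p['t'] >= 1
    return sum(cap[e] * d[e] for e in cap)   # objective = cut capacity

assert cut_as_dual({'s'}) == 3               # crossing edges (s,a), (s,b)
assert cut_as_dual({'s', 'a'}) == 3          # crossing edges (s,b), (a,t), (a,b)
assert cut_as_dual({'s', 'a', 'b'}) == 3     # crossing edges (a,t), (b,t)
```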

Proving Claim A is easy. Claim B requires some work. The first step is to notice that the optimum of the unconstrained version is the same as that of the version in which the variables are restricted to the interval [0, 1]. Then one can show that the optimum of the [0, 1]-restricted version is actually attained at an integral point. [The dual convex polyhedron has integral vertices.]

We call the dual program the LP-relaxation of the minimum s-t cut integer program. Feasible solutions to the dual program are called fractional solutions to the integer program (fractional s-t cuts, for the minimum s-t cut integer program).

Finally, using the complementary slackness conditions we can derive certain properties of the optimal solutions to our max-flow and min-cut problems:
- edges from X to X^c in the optimal cut must be saturated by the optimal flow;
- edges from X^c to X must carry no flow.
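The whole picture can be checked on a small example: compute a max flow by augmenting paths, take X to be the set of nodes reachable from s in the residual graph, and verify the cut capacity and the two slackness properties. A minimal Ford-Fulkerson sketch with BFS augmenting paths (the network data is made up for illustration):

```python
from collections import deque

cap = {('s', 'a'): 2, ('s', 'b'): 1, ('a', 't'): 1, ('b', 't'): 2, ('a', 'b'): 1}
nodes = {u for e in cap for u in e}
res = dict(cap)                                   # residual capacities
res.update({(v, u): 0 for (u, v) in cap if (v, u) not in cap})

def bfs_path(src, dst):
    """Shortest augmenting path (list of edges) in the residual graph, or None."""
    prev, q = {src: None}, deque([src])
    while q:
        u = q.popleft()
        for v in nodes:
            if v not in prev and res.get((u, v), 0) > 0:
                prev[v] = u
                q.append(v)
    if dst not in prev:
        return None
    path, v = [], dst
    while prev[v] is not None:
        path.append((prev[v], v))
        v = prev[v]
    return path[::-1]

flow_value = 0
while (path := bfs_path('s', 't')):
    aug = min(res[e] for e in path)
    for u, v in path:
        res[(u, v)] -= aug
        res[(v, u)] = res.get((v, u), 0) + aug
    flow_value += aug

X = {v for v in nodes if v == 's' or bfs_path('s', v)}      # residual-reachable side
cut_edges = [(u, v) for (u, v) in cap if u in X and v not in X]
assert flow_value == sum(cap[e] for e in cut_edges)   # max-flow = min-cut
assert all(res[e] == 0 for e in cut_edges)            # forward cut edges saturated
back = [(u, v) for (u, v) in cap if u not in X and v in X]
assert all(res[(u, v)] == cap[(u, v)] for (u, v) in back)   # backward edges carry no flow
```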

LP-relaxation and LP-duality in algorithm design

Many combinatorial optimization problems can be stated as integer programs. LP-relaxations of those programs provide a natural way of bounding the optimal solution cost, which is often a key step in the design of exact (when the relaxation is exact) or approximation algorithms. Two basic techniques are LP-rounding and the primal-dual schema.

LP-rounding: solve the LP-relaxation and convert the fractional solution obtained into an integral solution in such a way that the cost does not change much. The approximation guarantee is established by comparing the costs of the integral and fractional solutions.

Primal-dual schema: an integral solution to the primal program and a feasible solution to the dual program are constructed iteratively. The approximation guarantee is established by comparing the costs of the two solutions.

The method of dual fitting is used to analyze combinatorially obtained approximation algorithms. We will see applications in later chapters.

Integrality gap, approximation guarantee, and rounding factor

For a minimization problem Π and its LP-relaxation, let LP-OPT(I) denote the cost of an optimal fractional solution to instance I, and OPT(I) the cost of an optimal integral solution. We define the integrality gap of the relaxation to be

  sup_I OPT(I) / LP-OPT(I).

[For a maximization problem, the integrality gap is the infimum of the above ratio.] If the LP-relaxation has an integral optimal solution for every instance, then the integrality gap is 1, and we call the relaxation exact. Given an optimal fractional solution to instance I, we can turn it into an integral solution of cost A(I) by rounding. The rounding factor for instance I is then A(I) / LP-OPT(I). Since A(I) / LP-OPT(I) = (A(I) / OPT(I)) × (OPT(I) / LP-OPT(I)), the rounding factor of I is the approximation factor achieved on I times the instance's own gap OPT(I) / LP-OPT(I). So, by (uniformly) minimizing the rounding factor, we can improve the approximation guarantee. Of course, the rounding factor can never be less than the integrality gap.
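As a concrete instance of these ratios, consider vertex cover on the complete graph K_n: the fractional optimum is n/2 (all y_i = 1/2), while any integral cover must take n-1 vertices, so OPT/LP-OPT = 2(n-1)/n approaches 2. A brute-force check for small n (the fractional value n/2 is taken as known rather than computed by a solver):

```python
from itertools import combinations

def min_vertex_cover_size(n, edges):
    """Brute-force minimum vertex cover size on a small graph."""
    for k in range(n + 1):
        for S in combinations(range(n), k):
            S = set(S)
            if all(u in S or v in S for u, v in edges):
                return k

for n in range(3, 8):
    edges = [(u, v) for u in range(n) for v in range(u + 1, n)]   # K_n
    opt = min_vertex_cover_size(n, edges)
    lp_opt = n / 2                        # y_i = 1/2 for all i (known optimum for K_n)
    assert opt == n - 1
    assert opt / lp_opt == 2 * (n - 1) / n   # approaches 2 as n grows
```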

Maximum Matching and Minimum Vertex Cover: LP-duality point of view

These two problems give us an example of duality with a non-exact relaxation. Maximum matching (no edge weights) can be stated as:

  maximize   Σ_{(i,j) ∈ E} x_ij
  subject to Σ_{j: (i,j) ∈ E} x_ij ≤ 1,  for every node i ∈ V
             x_ij ∈ {0, 1},  for every edge (i, j) ∈ E

Minimum vertex cover (no vertex weights) can be stated as:

  minimize   Σ_{i ∈ V} y_i
  subject to y_i + y_j ≥ 1,  for every edge (i, j) ∈ E
             y_i ∈ {0, 1},  for every vertex i ∈ V

Their LP-relaxations (replace the 0/1 constraints with x_ij ≥ 0 and y_i ≥ 0) are dual to each other. The same applies to the weighted versions, if we choose the vertex and edge weights in a consistent way.
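A quick brute-force check of the weak-duality gap, with the Petersen graph built the usual way (outer 5-cycle, inner pentagram, spokes): its maximum matching has size 5 while its minimum vertex cover has size 6, so the min-max relation fails; on a bipartite graph such as C_4 the two coincide, as the König-Egerváry theorem promises.

```python
from itertools import combinations

def max_matching_size(edges):
    """Brute-force: largest set of pairwise disjoint edges."""
    for k in range(len(edges), 0, -1):
        for M in combinations(edges, k):
            if len({v for e in M for v in e}) == 2 * k:
                return k
    return 0

def min_vc_size(vertices, edges):
    """Brute-force minimum vertex cover size."""
    for k in range(len(list(vertices)) + 1):
        for S in combinations(vertices, k):
            S = set(S)
            if all(u in S or v in S for u, v in edges):
                return k

# Petersen graph: outer cycle 0..4, inner vertices 5..9 joined i ~ i+2 (mod 5), spokes.
petersen = ([(i, (i + 1) % 5) for i in range(5)] +
            [(5 + i, 5 + (i + 2) % 5) for i in range(5)] +
            [(i, i + 5) for i in range(5)])
assert max_matching_size(petersen) == 5
assert min_vc_size(range(10), petersen) == 6      # strictly larger: relation fails

c4 = [(0, 1), (1, 2), (2, 3), (3, 0)]             # bipartite: equality holds
assert max_matching_size(c4) == min_vc_size(range(4), c4) == 2
```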

The integrality gap of the LP-relaxation for Minimum Vertex Cover

Taking the complete graph K_n, we see that the integrality gap can approach 2: the fractional solution y_i = 1/2 for all i is feasible with cost n/2, while any integral cover must use n-1 vertices. We can also show that the gap does not exceed 2. Given a feasible fractional solution {y_i}, we round as follows:

  z_i = 1 if y_i ≥ 0.5,
  z_i = 0 if y_i < 0.5.

This clearly gives a valid cover: every edge (i, j) has y_i + y_j ≥ 1, so at least one endpoint satisfies y ≥ 0.5 and is taken. Now, the integrality gap is at most sup (Σ_i z_i / Σ_i y_i), and it is easy to see that Σ_i z_i ≤ 2 Σ_i y_i, since z_i ≤ 2 y_i for every i.
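The rounding can be watched on a small example: on the 5-cycle, y_i = 1/2 for all i is a feasible fractional cover of cost 2.5; thresholding at 0.5 yields the cover of all five vertices, of cost 5 = 2 × 2.5 (while the integral optimum is 3). A sketch, with illustrative data:

```python
def round_cover(y, edges):
    """Threshold rounding: take every vertex with fractional value >= 0.5."""
    z = {v: 1 if yv >= 0.5 else 0 for v, yv in y.items()}
    assert all(z[u] or z[v] for u, v in edges)       # still a valid cover
    return z

c5 = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
y = {v: 0.5 for v in range(5)}                       # feasible fractional cover, cost 2.5
z = round_cover(y, c5)
assert sum(z.values()) <= 2 * sum(y.values())        # factor-2 guarantee: 5 <= 5
```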