Duality, Warm Starting, and Sensitivity Analysis for MILP

Duality, Warm Starting, and Sensitivity Analysis for MILP
Ted Ralphs and Menal Guzelsoy
Industrial and Systems Engineering, Lehigh University
SAS Institute, Cary, NC, Friday, August 19, 2005

SAS Institute 1 Outline of Talk
A little bit of theory:
- Duality
- Sensitivity analysis
- Warm starting
A little bit of computation:
- SYMPHONY 5.1
- Examples

SAS Institute 2 Introduction to Duality
For an optimization problem z = min{f(x) : x ∈ X}, called the primal problem, an optimization problem w = max{g(u) : u ∈ U} such that w ≤ z is called a dual problem. It is a strong dual if w = z.
Uses for the dual problem:
- Bounding
- Deriving optimality conditions
- Sensitivity analysis
- Warm starting
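As a small added illustration (not from the original slides), consider the primal problem z = min{x : 2x ≥ 3, x ∈ Z_+}, whose optimal value is z = 2. The dual of its LP relaxation, w = max{3u : 2u ≤ 1, u ≥ 0}, has optimal value w = 1.5 ≤ z, so it is a valid dual but not a strong one; a strong dual would close the gap to 2.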

SAS Institute 3 Some Previous Work
R. Gomory (and W. Baumol) ('60-'73)
G. Roodman ('72)
E. Johnson (and Burdet) ('72-'81)
R. Jeroslow (and C. Blair) ('77-'85)
A. Geoffrion and R. Nauss ('77)
D. Klein and S. Holm ('79-'84)
L. Wolsey (and L. Schrage) ('81-'84)
...
D. Klabjan ('02)

SAS Institute 4 Duals for ILP
Let P = {x ∈ R^n : Ax = b, x ≥ 0} be nonempty for A ∈ Q^{m×n}, b ∈ Q^m. We consider the (bounded) pure integer linear program min_{x ∈ P ∩ Z^n} c^T x for c ∈ R^n.
The most common dual for this ILP is the well-known Lagrangian dual. The Lagrangian dual is not generally strong. Blair and Jeroslow discussed how to make the Lagrangian dual strong for ILP by introducing a quadratic penalty term.
How do we derive a strong dual? Consider the following more formal notion of a dual (Wolsey):

w^g_IP = max_{g: R^m → R} { g(b) : g(Ax) ≤ c^T x, x ≥ 0 }            (1)
       = max_{g: R^m → R} { g(b) : g(d) ≤ z_IP(d), d ∈ R^m },        (2)

where z_IP(d) = min_{x ∈ P_I(d)} c^T x is the value function and P_I(d) = {x ∈ Z^n : Ax = d, x ≥ 0}.
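As an added illustration (not part of the original slides), consider a single-row instance with A = [2 1] and c = (1, 1), so that z_IP(d) = min{x_1 + x_2 : 2x_1 + x_2 = d, x ∈ Z^2_+}. For d a nonnegative integer the best choice is x_1 = ⌊d/2⌋, x_2 = d - 2⌊d/2⌋, giving z_IP(d) = ⌈d/2⌉; for every other d the problem is infeasible and z_IP(d) = +∞. Even in this tiny case the value function is nonconvex (it jumps between integer right-hand sides and is +∞ in between), which is why linear or convex dual functions cannot be strong in general.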

SAS Institute 5 Subadditive Duality
Solutions to the dual (2) bound the value function from below. Any function that agrees with the value function at b, including the value function itself, is optimal. This shows the dual (2) is strong.
Question: Under what restrictions on the function g does this remain a strong dual?
- g linear: we obtain the dual of the LP relaxation, which is not strong.
- g convex: we again obtain the value of the dual of the LP relaxation, so this is also not strong (Jeroslow).
- g subadditive: the dual remains strong (Gomory; Johnson; Jeroslow).
In the subadditive case, the dual simplifies to

w^s_IP = max{ f(b) : f(a_i) ≤ c_i, i = 1, ..., n, f subadditive, f(0) = 0 }.

This is called the subadditive dual.
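Continuing the added single-row illustration, the function f(d) = ⌈d/2⌉ is subadditive with f(0) = 0, and it is feasible for the subadditive dual since f(a_1) = f(2) = 1 ≤ c_1 and f(a_2) = f(1) = 1 ≤ c_2. Because it agrees with the value function at every right-hand side, it is an optimal dual solution; for b = 5, for instance, f(b) = 3 = z_IP(5).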

SAS Institute 6 Optimality Conditions
Weak Duality: f(b) ≤ c^T x for all x ∈ P ∩ Z^n and all dual feasible f.
- The primal problem is infeasible if w^s_IP = +∞.
- The dual problem is infeasible if z_IP = -∞.
Strong Duality: If either the primal problem or the dual problem has a finite optimal value, then
- there exist x* ∈ P ∩ Z^n and a dual feasible f* such that c^T x* = f*(b);
- c_j - f*(a_j) ≥ 0 for j = 1, ..., n; and
- if x*_j > 0, then c_j - f*(a_j) = 0.
If the primal problem is infeasible, then either the dual is infeasible or w^s_IP = +∞. If the dual problem is infeasible, then either the primal is infeasible or z_IP = -∞.

SAS Institute 7 Subadditivity and Valid Inequalities
These results on valid inequalities follow directly from strong duality.
Proposition 1. π^T x ≥ π_0 for all x ∈ P ∩ Z^n if and only if there exists f subadditive with f(a_j) ≤ π_j for j = 1, ..., n and f(b) ≥ π_0.
Proposition 2. conv(P ∩ Z^n) = {x ∈ R^n_+ : Σ_{j=1}^n f(a_j) x_j ≥ f(b), f subadditive, f(0) = 0}.
These results are important for sensitivity analysis and warm starting. We can similarly derive theorems of the alternative and other generalizations of standard results from LP.
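For instance (an added example using the earlier single-row instance), taking f(d) = ⌈d/2⌉ in Proposition 1 with π_j = f(a_j) and π_0 = f(b) gives the valid inequality x_1 + x_2 ≥ ⌈b/2⌉ for {x ∈ Z^2_+ : 2x_1 + x_2 = b}, a rounding cut that the LP relaxation does not imply when b is odd.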

SAS Institute 8 Optimal Solutions to the Subadditive Dual
The subadditive dual has most of the nice properties of the LP dual. With an optimal solution, we can calculate reduced costs, perform local sensitivity analysis, etc.
How do we find optimal solutions to the subadditive dual? Ideally, these would be a by-product of a practical algorithm.
If the primal problem has a finite optimum, then the value function is an optimal solution. The value function has a closed form and is a member of the class C of Gomory functions, defined by:
1. f ∈ C if f(v) = λv for λ ∈ Q^r, r ∈ N.
2. If f ∈ C, then ⌈f⌉ ∈ C.
3. If f, g ∈ C and α, β ≥ 0, then αf + βg ∈ C.
4. If f, g ∈ C, then max{f, g} ∈ C.
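As an added illustration, starting from the linear function d/2 (rule 1) and applying a single rounding step (rule 2) gives ⌈d/2⌉, the value function of the earlier single-row example; taking a maximum of two such rounded functions, say max{⌈d/2⌉, ⌈d/3⌉}, is again a Gomory function by rule 4. Dropping rule 4 leaves the class of Chvátal functions used on the next slide.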

SAS Institute 9 Optimal Solutions from Gomory's Cutting Plane Procedure
An optimal dual solution f* that is a member of the class of Chvátal functions (Gomory functions without requirement 4) can be obtained as a by-product of Gomory's cutting plane procedure:

f*(d) = Σ_{i=1}^m λ^r_i d_i + Σ_{i=1}^r λ^r_{m+i} f_i(d)

for some finite r, where f_j(d) is defined recursively by f_0(d) = λ^0 d with λ^0 ∈ Q^m, and

f_j(d) = ⌈ Σ_{i=1}^m λ^{j-1}_i d_i + Σ_{i=1}^{j-1} λ^{j-1}_{m+i} f_i(d) ⌉,    λ^{j-1} = (λ^{j-1}_1, ..., λ^{j-1}_{m+j-1}) ≥ 0.

SAS Institute 10 Another Approach to Generating Optimal Dual Functions
Klabjan recently suggested another approach for the case when A and b are both nonnegative. Given a vector α ∈ R^m, we define a generator subadditive function f_α: R^m_+ → R by

f_α(d) = α d - max{ Σ_{i ∈ E} (α a_i - c_i) x_i : A_E x ≤ d, x ∈ Z^E_+ },

where E = {i ∈ N : α a_i > c_i} and A_E is the submatrix of A consisting of the columns with indices in E.
This can be seen as a generalization of the LP dual. Klabjan showed that there always exists an α ∈ R^m such that f_α is an optimal dual function.

SAS Institute 11 Example
b = (1, 2, 2, 1)^T
c = (10, 1, 3, 5, 0, 2, 0.5, 2, 3, 4, 5, 3, 1, 3, 1)

    [ 1 0 0 0 1 1 1 0 0 0 1 1 1 0 1 ]
A = [ 0 1 0 0 1 0 0 1 1 0 1 1 0 1 1 ]
    [ 0 0 1 0 0 1 0 1 0 1 1 0 1 1 1 ]
    [ 0 0 0 1 0 1 1 0 0 1 0 1 1 1 1 ]

SAS Institute 12 Optimality Conditions from Branch and Bound
Consider the implicit optimality conditions associated with branch and bound.
Let P_1, ..., P_s be a partition of P into (nonempty) subpolyhedra. Let LP_i be the linear program min_{x ∈ P_i} c^T x associated with the subpolyhedron P_i, and let B_i be an optimal basis for LP_i. Then the following is a valid lower bound:

L = min{ c_{B_i} (B_i)^{-1} b + γ_i : 1 ≤ i ≤ s },

where γ_i is the constant term accounting for the nonbasic variables fixed at nonzero bounds. A similar function yields an upper bound.
A partition that yields equal lower and upper bounds is called an optimal partition.

SAS Institute 13 Optimal Dual Functions from Branch and Bound
The function

L(d) = min{ c_{B_i} (B_i)^{-1} d + γ_i : 1 ≤ i ≤ s }

is not subadditive, but provides an optimal solution to (2). The corresponding upper bounding function is

U(c) = min{ c_{B_i} (B_i)^{-1} b + β_i : 1 ≤ i ≤ s, x̂^i ∈ P_I }.

These functions can be used for local sensitivity analysis, just as one would do in linear programming.
- For changes in the right-hand side, the lower bound remains valid.
- For changes in the objective function, the upper bound remains valid.
It is possible to use a given partition to compute new bounds, but this requires additional computation.
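Since c_{B_i}(B_i)^{-1} is just the dual solution of leaf i, the lower bounding function can be evaluated for a new right-hand side from a small amount of stored information. Below is a minimal sketch of such an evaluation (hypothetical helper code written for this transcription, not part of SYMPHONY):

    #include <cstddef>
    #include <limits>
    #include <vector>

    // One record per leaf of the branch-and-bound partition:
    // y     = dual solution of the leaf LP, i.e. c_B * B^{-1}
    // gamma = constant contribution of nonbasic variables at nonzero bounds
    struct LeafDual {
        std::vector<double> y;
        double gamma;
    };

    // Evaluate L(d) = min_i { y_i^T d + gamma_i } for a new right-hand side d.
    double lowerBound(const std::vector<LeafDual>& leaves,
                      const std::vector<double>& d)
    {
        double best = std::numeric_limits<double>::infinity();
        for (const LeafDual& leaf : leaves) {
            double value = leaf.gamma;
            for (std::size_t i = 0; i < d.size(); ++i)
                value += leaf.y[i] * d[i];
            if (value < best) best = value;
        }
        return best;
    }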

SAS Institute 14 Some Additional Details
The method presented only applies to branch and bound.
- Cut generation complicates matters because cuts may not be valid after changes to the rim vectors.
- Bounds tightened by reduced cost may also not be valid.
- We have to deal with infeasibility of subproblems.
One can compute an allowable range, as in LP, but it may be very small or empty, and the bounds produced may not be useful.
Question: What else can we do?
Answers:
- Continue solving from a warm start to improve the bounds.
- Perform a parametric analysis.

SAS Institute 15 Warm Starting
Question: What is warm starting? Question: Why are we interested in it?
There are many examples of algorithms that solve a sequence of related ILPs:
- Decomposition algorithms
- Stochastic ILP
- Parametric/multicriteria ILP
- Determining an irreducible inconsistent subsystem
For such problems, warm starting can potentially yield big improvements. Warm starting is also important for performing sensitivity analysis outside of the allowable range.

SAS Institute 16 Warm Starting Information Question: What is warm starting information? Many optimization algorithms can be viewed as iterative procedures for satisfying a set of optimality conditions, often based on duality. These conditions provide a measure of distance from optimality. Warm starting information can be seen as additional input data that allows an algorithm to quickly get close to optimality. In integer linear programming, the duality gap is the usual measure. A good starting partition may reduce the initial duality gap. It is not at all obvious what makes a good starting partition. The most obvious choice is to use the optimal partition from a previous computation.

SAS Institute 17 Parametric Analysis For global sensitivity analysis, we need to solve parametric programs. Along with Saltzman and Wiecek, we have developed an algorithm for determining all Pareto outcomes for a bicriteria MILP. The algorithm consists of solving a sequence of related ILPs and is asymptotically optimal. Such an algorithm can be used to perform global sensitivity analysis by constructing a slice of the value function. Warm starting can be used to improve efficiency.

SAS Institute 18 Bicriteria MILPs
The general form of a bicriteria (pure) ILP is

vmax [cx, dx]
s.t.  Ax ≤ b,
      x ∈ Z^n.

Solutions don't have single objective function values, but pairs of values called outcomes. A feasible x̂ is called efficient if there is no feasible x such that cx ≥ cx̂ and dx ≥ dx̂, with at least one inequality strict. The outcome corresponding to an efficient solution is called Pareto. The goal of a bicriteria ILP is to enumerate Pareto outcomes.
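One standard way to reach the supported Pareto outcomes (stated here as general background rather than as the specific algorithm of the talk) is to optimize a nonnegative weighting of the two objectives,

max{ (1 - γ) cx + γ dx : Ax ≤ b, x ∈ Z^n },   γ ∈ [0, 1],

and sweep γ over its range; the parametrized objective 8x_1 + θx_2 used in the example later in the talk is an equivalent reparameterization of such a weighting of [8x_1, x_2]. Unsupported Pareto outcomes cannot be found this way and require additional machinery (e.g., Chebyshev-norm scalarizations).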

SAS Institute 19 A Really Brief Overview of SYMPHONY
SYMPHONY is an open-source software package for solving and analyzing mixed-integer linear programs (MILPs). SYMPHONY can be used in three distinct modes:
- Black box solver: solve generic MILPs (command line or shell).
- Callable library: call SYMPHONY from a C/C++ code.
- Framework: develop a customized black box solver or callable library.
It is fully integrated with the Computational Infrastructure for Operations Research (COIN-OR) libraries. The interface and features of SYMPHONY 5.1 give it the look and feel of an LP solver.
SYMPHONY solvers: generic MILP, multicriteria MILP, traveling salesman problem, vehicle routing problem, mixed postman problem, set partitioning problem, matching problem, network routing.

SAS Institute 20 Basic Sensitivity Analysis
SYMPHONY will calculate bounds after changing the objective or right-hand side vectors.

int main(int argc, char **argv)
{
   OsiSymSolverInterface si;
   si.parseCommandLine(argc, argv);
   si.loadProblem();
   si.setSymParam(OsiSymSensitivityAnalysis, true);
   si.initialSolve();
   int ind[2];
   double val[2];
   ind[0] = 4; val[0] = 7000;
   ind[1] = 7; val[1] = 6000;
   double lb = si.getLbForNewRhs(2, ind, val);
}
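Assuming this SYMPHONY release also exposes the analogous accessors for the other bound and for objective changes (getUbForNewRhs, getLbForNewObj, getUbForNewObj; treat these names as an assumption to check against the installed headers), the same pattern yields the remaining local sensitivity information, continuing the example above:

   // Hypothetical continuation of the example above; the accessor names are
   // assumed and should be verified against OsiSymSolverInterface.hpp.
   double ub_rhs = si.getUbForNewRhs(2, ind, val);   // upper bound, new rhs
   double lb_obj = si.getLbForNewObj(2, ind, val);   // lower bound, new objective
   double ub_obj = si.getUbForNewObj(2, ind, val);   // upper bound, new objective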

SAS Institute 21 Warm Starts for MILP
To allow resolving from a warm start, we have defined a SYMPHONY warm start class, which is derived from CoinWarmStart.
The class stores a snapshot of the search tree, with node descriptions including:
- lists of active cuts and variables,
- branching information,
- warm start information, and
- current status (candidate, fathomed, etc.).
The tree is stored in a compact form by storing the node descriptions as differences from the parent. Other auxiliary information is also stored, such as the current incumbent.
A warm start can be saved at any time and then reloaded later. The warm starts can also be written to and read from disk.
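Conceptually, the stored information per node looks something like the following sketch (illustrative types only, not the actual SYMPHONY data structures):

    #include <vector>

    enum class NodeStatus { Candidate, Fathomed, Branched };

    // Illustrative sketch only; field names are hypothetical, not SYMPHONY's internals.
    struct NodeDescription {
        std::vector<int> activeCuts;       // cuts active at this node
        std::vector<int> activeVariables;  // variables active at this node
        int branchVariable;                // branching decision taken at the parent
        double branchBound;                // bound imposed by that branching decision
        NodeStatus status;                 // candidate, fathomed, etc.
        // In SYMPHONY the description is stored as a difference from the parent node.
    };

    struct WarmStartTree {
        std::vector<NodeDescription> nodes;  // compact snapshot of the search tree
        double incumbentValue;               // auxiliary information such as the incumbent
    };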

SAS Institute 22 Warm Starting Procedure
After modifying parameters:
- If only parameters have been modified, then the candidate list is recreated and the algorithm proceeds from where it left off.
- This allows parameters to be tuned as the algorithm progresses, if desired.
After modifying problem data:
- Currently, we only allow modification of the rim vectors.
- After modification, all leaf nodes must be added to the candidate list.
- After constructing the candidate list, we can continue the algorithm as before.
There are many opportunities for improving the basic scheme, especially when solving a known family of instances (Geoffrion and Nauss).

SAS Institute 23 Warm Starting Example (Parameter Modification)
The following example shows a simple use of warm starting to create a dynamic algorithm: depth-first search until a first feasible solution is found, then best-first search for the remainder of the solve.

int main(int argc, char **argv)
{
   OsiSymSolverInterface si;
   si.parseCommandLine(argc, argv);
   si.loadProblem();
   si.setSymParam(OsiSymFindFirstFeasible, true);
   si.setSymParam(OsiSymSearchStrategy, DEPTH_FIRST_SEARCH);
   si.initialSolve();
   si.setSymParam(OsiSymFindFirstFeasible, false);
   si.setSymParam(OsiSymSearchStrategy, BEST_FIRST_SEARCH);
   si.resolve();
}

SAS Institute 24 Warm Starting Example (Problem Modification)
The following example shows how to warm start after problem modification.

int main(int argc, char **argv)
{
   OsiSymSolverInterface si;
   CoinWarmStart *ws;
   si.parseCommandLine(argc, argv);
   si.loadProblem();
   si.setSymParam(OsiSymNodeLimit, 100);
   si.initialSolve();
   ws = si.getWarmStart();
   si.resolve();
   si.setObjCoeff(0, 1);
   si.setObjCoeff(200, 150);
   si.setWarmStart(ws);
   si.resolve();
}

SAS Institute 25 Using Warm Starting: Generic Mixed-Integer Programming
Applying the code from the previous slide to the MIPLIB 3 problem p0201, we obtain the results below. Note that the warm start doesn't reduce the number of nodes generated, but does reduce the solve time dramatically.

                                         CPU Time   Tree Nodes
Generate warm start                            28          100
Solve orig problem (from warm start)            3          118
Solve mod problem (from scratch)               24          122
Solve mod problem (from warm start)             6          198

SAS Institute 26 Using Warm Starting: Generic Mixed-Integer Programming
Here, we show the effect of using warm starting to solve generic MILPs whose objective functions have been perturbed. The coefficients were perturbed by a random percentage between -α and α for α = 1, 10, 20.
Table 1: Results of using warm starting to solve multi-criteria optimization problems.

SAS Institute 27 Using Warm Starting: Network Routing
Table 2: Results of using warm starting to solve multi-criteria optimization problems.

SAS Institute 28 Using Warm Starting: Stochastic Integer Programming

Problem        Tree Size    Tree Size    % Gap        % Gap        CPU          CPU
               Without WS   With WS      Without WS   With WS      Without WS   With WS
storm8                  1            1            -            -        14.75         8.71
storm27                 5            5            -            -        69.48        48.99
storm125                3            3            -            -       322.58       176.88
LandS27                71           69            -            -         6.50         4.99
LandS125               37           29            -            -        15.72        12.72
LandS216               39           35            -            -        30.59        24.80
dcap233_200            39           61            -            -       256.19       120.86
dcap233_300           111           89        0.387            -      1672.48       498.14
dcap233_500            21           36       24.701       14.831         1003         1004
dcap243_200            37           53        0.622        0.485      1244.17      1202.75
dcap243_300            64          220       0.0691       0.0461      1140.12      1150.35
dcap243_500            29          113        0.357        0.186      1219.17      1200.57
sizes3                225          165            -            -       789.71       219.92
sizes5                345          241            -            -       964.60       691.98
sizes10               241          429        0.104       0.0436      1671.25      1666.75

SAS Institute 29 Example: Bicriteria ILP
Consider the following bicriteria ILP:

vmax [8x_1, x_2]
s.t.  7x_1 +  x_2 ≤  56
     28x_1 + 9x_2 ≤ 252
      3x_1 + 7x_2 ≤ 105
      x_1, x_2 ≥ 0.

The following code solves this model.

int main(int argc, char **argv)
{
   OsiSymSolverInterface si;
   si.parseCommandLine(argc, argv);
   si.loadProblem();
   si.setObj2Coeff(1, 1);
   si.multiCriteriaBranchAndBound();
}

SAS Institute 30 Example: Pareto Outcomes for Example

SAS Institute 31 Example: Bicriteria Solver
By examining the supported solutions and break points, we can easily determine p(θ), the optimal solution value of the ILP with objective 8x_1 + θx_2.

θ range             p(θ)        x_1   x_2
(-∞, 1.333)         64            8     0
(1.333, 2.667)      56 + 6θ       7     6
(2.667, 8.000)      40 + 12θ      5    12
(8.000, 16.000)     32 + 13θ      4    13
(16.000, ∞)         15θ           0    15
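To see where the break points come from (an added remark), note that adjacent rows of the table swap optimality where their values coincide: for example, 56 + 6θ = 40 + 12θ gives θ = 16/6 ≈ 2.667, and 32 + 13θ = 15θ gives θ = 16, which are exactly the values separating the corresponding ranges.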

SAS Institute 32 Example: Graph of Price Function

SAS Institute 33 Using Warm Starting: Bicriteria Optimization
Table 3: Results of using warm starting to solve bicriteria optimization problems.

SAS Institute 34 Conclusion
We have briefly introduced the issues surrounding warm starting and sensitivity analysis for integer programming. An examination of early literature has yielded some ideas that can be useful in today's computational environment.
We presented a new version of the SYMPHONY solver supporting warm starting and sensitivity analysis for MILPs. We have also demonstrated SYMPHONY's multicriteria optimization capabilities.
This work has only scratched the surface of what can be done. In future work, we plan on refining SYMPHONY's warm start and sensitivity analysis capabilities. We will also provide more extensive computational results.