Lecture 9: Dantzig-Wolfe Decomposition (3 units)

Outline

- Dantzig-Wolfe decomposition
- Column generation algorithm
- Relation to the Lagrangian dual
- Branch-and-price method
- Generalized assignment problem and multi-commodity flow problem
- References

Dantzig-Wolfe decomposition

Consider the following integer programming problem:

(P)   \min\ (c^1)^T x^1 + (c^2)^T x^2 + \cdots + (c^K)^T x^K
      \text{s.t.}\ A^1 x^1 + A^2 x^2 + \cdots + A^K x^K = b,
           D^k x^k \le d^k, \quad k = 1, \ldots, K,
           x^1 \in \mathbb{Z}^{n_1}_+, \ \ldots, \ x^K \in \mathbb{Z}^{n_K}_+.

The constraint matrix has a block-angular structure. Let X^k = \{ x^k \in \mathbb{Z}^{n_k}_+ : D^k x^k \le d^k \}.

The sets X^k are independent for k = 1, \ldots, K; only the joint constraint \sum_{k=1}^K A^k x^k = b links the different sets of variables together. Dualizing the joint constraint, we obtain the following Lagrangian dual of (P):

(D)  \max_u\ L(u),

where

L(u) = \min\Big\{ \sum_{k=1}^K \big( (c^k)^T - u^T A^k \big) x^k + b^T u \ :\ x^k \in X^k, \ \forall k \Big\} = \sum_{k=1}^K L_k(u) + b^T u,

with L_k(u) = \min\{ ((c^k)^T - u^T A^k) x^k \ :\ x^k \in X^k \}.

We have discussed Lagrangian relaxation and dual search in the previous lectures. Another way of exploiting the block-angular structure is Dantzig-Wolfe decomposition, invented by Dantzig and Wolfe in 1961. The method is very closely connected to column generation, and the two terms are often used interchangeably. The set X^k in (P) can be either continuous (a polyhedron) or discrete (an integer set).

Minkowski-Weyl Theorem: Let X = \{ x \in \mathbb{R}^n : Ax \le b \} be a polyhedron. Then X can be represented by its extreme points x^i and extreme rays y^j:

X = \Big\{ x = \sum_i \lambda_i x^i + \sum_j \mu_j y^j \ :\ \sum_i \lambda_i = 1, \ \lambda_i \ge 0, \ \mu_j \ge 0 \Big\}.

Changing representation

When X^k is a bounded polyhedron, we can express X^k as

X^k = \Big\{ x^k = \sum_{t=1}^{T_k} \lambda_{kt} x^{kt} \ :\ \sum_{t=1}^{T_k} \lambda_{kt} = 1, \ \lambda_{kt} \ge 0 \Big\},

where x^{kt}, t = 1, \ldots, T_k, are the extreme points of X^k.

When X^k is a finite integer set, we can express X^k as

X^k = \Big\{ x^k = \sum_{t=1}^{T_k} \lambda_{kt} x^{kt} \ :\ \sum_{t=1}^{T_k} \lambda_{kt} = 1, \ \lambda_{kt} \in \{0,1\} \Big\},

where x^{kt}, t = 1, \ldots, T_k, list all the points of X^k.
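As a small illustration (this example is not from the original slides), consider the integer set X = \{ x \in \mathbb{Z}^2_+ : x_1 + x_2 \le 2 \}:

```latex
% The six points of X = { x in Z^2_+ : x_1 + x_2 <= 2 } are
%   x^1=(0,0), x^2=(1,0), x^3=(0,1), x^4=(2,0), x^5=(1,1), x^6=(0,2).
% The discrete Dantzig-Wolfe representation selects exactly one of them:
X = \Big\{ x = \sum_{t=1}^{6} \lambda_t x^t \ :\ \sum_{t=1}^{6} \lambda_t = 1,\ \lambda_t \in \{0,1\} \Big\}.
% Relaxing \lambda_t \in \{0,1\} to \lambda_t \ge 0 instead describes conv(X),
% the triangle with vertices (0,0), (2,0) and (0,2).
```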

IP master problem

Now we substitute the expression for x^k into (P), leading to the following IP master problem:

(IPM)  \min \sum_{k=1}^K \sum_{t=1}^{T_k} \big( (c^k)^T x^{kt} \big) \lambda_{kt}
       \text{s.t.}\ \sum_{k=1}^K \sum_{t=1}^{T_k} \big( A^k x^{kt} \big) \lambda_{kt} = b,
            \sum_{t=1}^{T_k} \lambda_{kt} = 1, \quad k = 1, \ldots, K,
            \lambda_{kt} \in \{0,1\}, \quad \forall k, t.

How can we solve this equivalent problem? Note that the number of variables (columns) in the constraint matrix is \sum_{k=1}^K T_k, which is usually exponentially large.

LP master problem

We consider the LP relaxation of (IPM):

(LPM)  \min \sum_{k=1}^K \sum_{t=1}^{T_k} \big( (c^k)^T x^{kt} \big) \lambda_{kt}
       \text{s.t.}\ \sum_{k=1}^K \sum_{t=1}^{T_k} \big( A^k x^{kt} \big) \lambda_{kt} = b,
            \sum_{t=1}^{T_k} \lambda_{kt} = 1, \quad k = 1, \ldots, K,
            \lambda_{kt} \ge 0, \quad \forall k, t.

How can we handle all the points (or, in the continuous case, extreme points) of X^k, which are far too many to enumerate? We use the idea of column generation.

Dual of the LP master problem

The dual problem of (LPM) is

(DLPM)  \max \sum_{i=1}^m b_i \pi_i + \sum_{k=1}^K \mu_k
        \text{s.t.}\ \pi^T A^k x^{kt} + \mu_k \le (c^k)^T x^{kt}, \quad t = 1, \ldots, T_k, \ k = 1, \ldots, K,

where (\pi, \mu) \in \mathbb{R}^m \times \mathbb{R}^K. Notice that a column in (LPM) corresponds to a row (constraint) in (DLPM).

Column generation algorithm

Initialization. Suppose that a subset of columns P = \bigcup_{k=1}^K P_k (with at least one column for each k) is available. Consider the restricted LP master problem:

(RLPM)  \min \sum_{k=1}^K \sum_{t \in P_k} \big( (c^k)^T x^{kt} \big) \lambda_{kt}
        \text{s.t.}\ \sum_{k=1}^K \sum_{t \in P_k} \big( A^k x^{kt} \big) \lambda_{kt} = b,
             \sum_{t \in P_k} \lambda_{kt} = 1, \quad k = 1, \ldots, K,
             \lambda_{kt} \ge 0, \quad t \in P_k, \ k = 1, \ldots, K.

Let \lambda and (\pi, \mu) \in \mathbb{R}^m \times \mathbb{R}^K be the optimal primal and dual solutions of (RLPM), respectively.

Primal feasibility. Any feasible solution of (RLPM) can be expanded to a feasible solution of (LPM) by setting \lambda_{kt} = 0 for the columns not in P_k. So

v(RLPM) = \sum_{i=1}^m \pi_i b_i + \sum_{k=1}^K \mu_k \ \ge\ v(LPM).

Optimality check for (LPM). We need to check whether (\pi, \mu) is dual feasible for (LPM). For any x \in X^k, the corresponding column has cost (c^k)^T x and constraint coefficients (A^k x, e_k). The reduced cost of this column is

(c^k)^T x - \pi^T A^k x - \mu_k.

Solve the following pricing subproblem:

(SP_k)  \zeta_k = \min\{ ((c^k)^T - \pi^T A^k) x - \mu_k \ :\ x \in X^k \}.

Stopping criterion. If \zeta_k \ge 0 for all k, then (\pi, \mu) is dual feasible to (LPM), and so

\sum_{k=1}^K \sum_{t \in P_k} \big( (c^k)^T x^{kt} \big) \lambda_{kt} = \sum_{i=1}^m \pi_i b_i + \sum_{k=1}^K \mu_k \ \le\ v(LPM).

Combined with v(RLPM) \ge v(LPM) from the previous slide, the two values coincide, so the expanded solution of \lambda is optimal to (LPM).

Generating a new column. If \zeta_k < 0 for some k, the column corresponding to the optimal solution x^* of the subproblem (SP_k) has negative reduced cost. Introducing the column with cost (c^k)^T x^* and constraint coefficients (A^k x^*, e_k) into (RLPM) leads to a new (RLPM), which can easily be re-optimized using the primal simplex method.

Dual (lower) bound. From the subproblem we have

\zeta_k \le ((c^k)^T - \pi^T A^k) x - \mu_k, \quad \forall x \in X^k,

which implies

\pi^T A^k x + (\mu_k + \zeta_k) \le (c^k)^T x, \quad \forall x \in X^k.

Therefore, setting \zeta = (\zeta_1, \ldots, \zeta_K), the point (\pi, \mu + \zeta) is dual feasible to (LPM). We thus have

\pi^T b + \sum_{k=1}^K (\mu_k + \zeta_k) \ \le\ v(LPM).
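To make the loop above concrete, here is a minimal runnable sketch (not part of the original lecture) of Dantzig-Wolfe column generation on a tiny made-up block-angular instance. It assumes a SciPy version (>= 1.7) whose HiGHS-based linprog exposes the equality-constraint duals via res.eqlin.marginals; the sets X^k are simply enumerated here, so the pricing step is a brute-force minimization rather than a specialized solver.

```python
import numpy as np
from scipy.optimize import linprog

# Toy block-angular data (made up for illustration): one linking row, two blocks.
A = [np.array([[1.0, 2.0]]), np.array([[2.0, 1.0]])]   # A^1, A^2
c = [np.array([1.0, 1.0]), np.array([2.0, 1.0])]       # c^1, c^2
b = np.array([5.0])
# X^1 and X^2 enumerated explicitly (stand-ins for {x in Z^n_+ : D^k x <= d^k}).
X = [[np.array(p, float) for p in [(0, 0), (1, 0), (0, 1), (1, 1), (2, 1)]],
     [np.array(p, float) for p in [(0, 0), (1, 0), (0, 1), (1, 1), (1, 2)]]]

K, m = len(X), len(b)
# Initial column pools P_k, chosen so that (RLPM) is feasible; in general one
# would add artificial big-M columns instead.
columns = [[X[0][3]], [X[1][1]]]

for it in range(50):
    # --- Solve the restricted LP master (RLPM) and read off the duals (pi, mu) ---
    cost = [float(c[k] @ x) for k in range(K) for x in columns[k]]
    link_row = [float(A[k] @ x) for k in range(K) for x in columns[k]]
    conv_rows = []
    for k in range(K):                       # convexity constraint of block k
        row = []
        for kk in range(K):
            row += [1.0 if kk == k else 0.0] * len(columns[kk])
        conv_rows.append(row)
    res = linprog(cost, A_eq=np.vstack([link_row] + conv_rows),
                  b_eq=np.concatenate([b, np.ones(K)]),
                  bounds=(0, None), method="highs")
    pi, mu = res.eqlin.marginals[:m], res.eqlin.marginals[m:]

    # --- Pricing: minimize the reduced cost over each X^k ---
    zeta, added = np.zeros(K), 0
    for k in range(K):
        red = [float((c[k] - A[k].T @ pi) @ x) - mu[k] for x in X[k]]
        zeta[k] = min(red)
        if zeta[k] < -1e-6:                  # negative reduced cost: new column
            columns[k].append(X[k][int(np.argmin(red))])
            added += 1
    dual_bound = float(pi @ b) + float((mu + zeta).sum())
    print(f"iter {it}: v(RLPM) = {res.fun:.3f}, dual bound = {dual_bound:.3f}")
    if added == 0:                           # zeta_k >= 0 for all k: (LPM) is solved
        break
```

In a real implementation the pricing step would call a problem-specific routine (a knapsack or shortest-path solver, as in the applications below), and artificial big-M columns would be used to guarantee an initial feasible restricted master.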

Relation to the Lagrangian dual

Theorem:

v(LPM) = \min\Big\{ \sum_{k=1}^K (c^k)^T x^k \ :\ \sum_{k=1}^K A^k x^k = b, \ x^k \in conv(X^k) \Big\}.

Proof: Note that (LPM) is obtained by substituting

x^k = \sum_{t=1}^{T_k} \lambda_{kt} x^{kt}, \quad \sum_{t=1}^{T_k} \lambda_{kt} = 1, \quad \lambda_{kt} \ge 0,

where x^{kt}, t = 1, \ldots, T_k, are all the integer points in X^k. This is equivalent to x^k \in conv(X^k).

Theorem: v(LPM) = v(D). Therefore, the LP relaxation of the IP master problem is equivalent to the Lagrangian dual.
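The second theorem is stated without proof; the standard argument (sketched here, not from the slides, under the usual feasibility and boundedness assumptions) is:

```latex
% Minimising a linear function over X^k gives the same value as minimising it over
% conv(X^k), so
v(D) = \max_u \Big[\, b^T u + \sum_{k=1}^K \min_{x^k \in \operatorname{conv}(X^k)}
        \big( (c^k)^T - u^T A^k \big) x^k \,\Big].
% The right-hand side is the Lagrangian dual, with respect to the linking constraints,
% of the convexified problem
\min\Big\{ \sum_{k=1}^K (c^k)^T x^k \ :\ \sum_{k=1}^K A^k x^k = b,\ x^k \in \operatorname{conv}(X^k) \Big\},
% which is a linear program (each conv(X^k) is a polyhedron), so strong LP duality gives
% equality of the two values; by the first theorem this common value is v(LPM).
```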

Consider the general IP problem: \min\{ c^T x : Ax \ge b, \ Dx \le d, \ x \in \mathbb{Z}^n \}.

Strength of the D-W decomposition and Lagrangian dual:

[Figure (not reproduced): the sets X_1 = \{ x \in \mathbb{Z}^n : Ax \ge b \} and X_2 = \{ x \in \mathbb{Z}^n : Dx \le d \}, together with conv(X_2).]
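The figure illustrates the usual bound relationship; stated explicitly (added here for completeness, with z_{LP} the value of the ordinary LP relaxation of the IP):

```latex
z_{LP} \ \le\ v(LPM) \ =\ v(D)
      \ =\ \min\{\, c^T x \ :\ Ax \ge b,\ x \in \operatorname{conv}(X_2) \,\}
      \ \le\ v(IP),
% since conv(X_2) \subseteq \{ x : Dx \le d \}.  The Dantzig-Wolfe / Lagrangian bound can
% therefore be strictly stronger than the LP bound whenever conv(X_2) is strictly
% contained in the LP relaxation of X_2.
```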

Branch-and-price method

Consider the general IP problem:

(IP)  \min\ c^T x
      \text{s.t.}\ Ax \ge b,          (complicating constraints)
           Dx \le d,                  (easy constraints)
           x \in \mathbb{Z}^n.

We assume that optimization over the set X_2 = \{ x \in \mathbb{Z}^n : Dx \le d \} is easy; this will be used when solving the column generation subproblem. Let x^1, \ldots, x^T be all the integer points of X_2. Then

X_2 = \Big\{ x = \sum_{i=1}^T \lambda_i x^i \ :\ \sum_{i=1}^T \lambda_i = 1, \ \lambda_i \in \{0,1\} \Big\}.

Using this representation of X_2, we can rewrite (IP) as

(IPM)  \min\ c^T \Big( \sum_{i=1}^T \lambda_i x^i \Big)
       \text{s.t.}\ A \Big( \sum_{i=1}^T \lambda_i x^i \Big) \ge b,
            \sum_{i=1}^T \lambda_i = 1,
            \lambda_i \in \{0,1\}, \quad i = 1, \ldots, T.

This is a reformulation of (IP).

The LP relaxation of (IPM) is at least as strong as the direct LP relaxation of (IP), because the LP relaxation of (IPM) is equivalent to the Lagrangian dual of (IP) obtained by dualizing the complicating constraints Ax \ge b.

We solve the LP relaxation of (IPM) using column generation. Adding one column to the LP master problem corresponds to adding one cutting plane to the dual problem of (LPM). The column generation subproblem is an optimization problem over X_2, which can be solved efficiently in many applications.

We can embed this bounding scheme into a branch-and-price framework. "Price" here refers to finding a column with negative reduced cost by solving the subproblem. How do we branch?

Branching with Dantzig-Wolfe decomposition

Unfortunately, branching on the variables \lambda_i of the reformulation does not work well, because it is generally difficult to keep a variable from being generated again after it has been fixed to zero. Branching must instead be done in a way that does not destroy the structure of the column generation master problem and subproblem. We can do this by branching on the original variables, i.e., the variables before the reformulation. In a 0-1 problem, branching on the j-th original variable amounts to fixing the j-th element of every column to be generated (to 0 on one branch and to 1 on the other), and this restriction can usually be incorporated into the column generation subproblem.

Generalized assignment problem

The problem is to assign m tasks to n machines subject to capacity constraints. An IP formulation of the problem is

(GAP)  \max \sum_{i=1}^m \sum_{j=1}^n p_{ij} z_{ij}
       \text{s.t.}\ \sum_{j=1}^n z_{ij} = 1, \quad i = 1, \ldots, m,
            \sum_{i=1}^m w_{ij} z_{ij} \le d_j, \quad j = 1, \ldots, n,
            z_{ij} \in \{0,1\}, \quad i = 1, \ldots, m, \ j = 1, \ldots, n.

Let the columns in the reformulation of (GAP) represent feasible assignments of tasks to a single machine, i.e., decompose over

X_2 = \Big\{ z \in \{0,1\}^{nm} \ :\ \sum_{i=1}^m w_{ij} z_{ij} \le d_j, \ j = 1, \ldots, n \Big\}.

The Dantzig-Wolfe decomposition master problem is

\max \sum_{j=1}^n \sum_{k=1}^{T_j} \Big( \sum_{i=1}^m p_{ij} y^{jk}_i \Big) \lambda_{jk}
\text{s.t.}\ \sum_{j=1}^n \sum_{k=1}^{T_j} y^{jk}_i \lambda_{jk} = 1, \quad i = 1, \ldots, m,
     \sum_{k=1}^{T_j} \lambda_{jk} = 1, \quad j = 1, \ldots, n,
     \lambda_{jk} \in \{0,1\}, \quad j = 1, \ldots, n, \ k = 1, \ldots, T_j,

where y^{jk} \in \{0,1\}^m, k = 1, \ldots, T_j, enumerate the feasible assignment patterns for machine j. This is a set-partitioning-type problem. The column generation subproblems are n knapsack problems, one per machine j, each with the single constraint \sum_{i=1}^m w_{ij} z_{ij} \le d_j.
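For concreteness, here is a minimal sketch (not from the lecture) of the pricing subproblem for one machine j. Denoting by \pi_i the duals of the task-assignment constraints and by \mu_j the dual of machine j's convexity constraint, the pricing problem maximizes \sum_i (p_{ij} - \pi_i) z_i subject to the knapsack constraint, and the resulting pattern enters the master if its reduced cost is positive (the master is a maximization). The sketch assumes integer weights and capacity so that the classical 0-1 knapsack dynamic program applies; all data in the example call are made up.

```python
def gap_pricing(p_j, w_j, d_j, pi, mu_j):
    """Pricing for machine j: maximize sum_i (p_j[i] - pi[i]) * z_i
    subject to sum_i w_j[i] * z_i <= d_j, z binary (0-1 knapsack DP).
    Returns (reduced_cost, pattern); add the pattern if reduced_cost > 0."""
    m = len(p_j)
    profit = [p_j[i] - pi[i] for i in range(m)]
    best = [0.0] * (d_j + 1)                     # best[c]: max profit with weight <= c
    choice = [[False] * m for _ in range(d_j + 1)]
    for i in range(m):
        if profit[i] <= 0:                       # an item with non-positive profit is never taken
            continue
        for c in range(d_j, w_j[i] - 1, -1):     # reverse order: each item used at most once
            cand = best[c - w_j[i]] + profit[i]
            if cand > best[c]:
                best[c] = cand
                choice[c] = choice[c - w_j[i]].copy()
                choice[c][i] = True
    return best[d_j] - mu_j, choice[d_j]

# Tiny made-up example: 3 tasks; pi and mu_j would come from the restricted master duals.
red_cost, pattern = gap_pricing(p_j=[8, 6, 5], w_j=[3, 2, 2], d_j=4,
                                pi=[2.0, 1.0, 3.0], mu_j=4.0)
print(red_cost, pattern)   # -> 3.0, [False, True, True]: a new improving column
```

Branching on an original variable z_{ij}, as discussed above, simply forces task i out of, or into, machine j's knapsack in this subproblem.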

Multi-commodity flow problem

Let D = (V, A) be a directed graph with nonnegative capacities u_a for all a \in A, and let k commodities be given as node pairs (s_1, t_1), \ldots, (s_k, t_k). For each i = 1, \ldots, k we want to find an s_i-t_i flow (x^i_a)_{a \in A} such that the sum of these flows on each arc does not exceed its capacity. Such a collection of flows is called a feasible multicommodity flow and can be expressed by the following constraints:

\sum_{a \in \delta^+(v)} x^i_a - \sum_{a \in \delta^-(v)} x^i_a = 0, \quad v \in V \setminus \{s_i, t_i\}, \ i = 1, \ldots, k,
\sum_{i=1}^k x^i_a \le u_a, \quad a \in A,

where \delta^+(v) = \{(v, w) : (v, w) \in A\} and \delta^-(v) = \{(u, v) : (u, v) \in A\}.

Edge formulation

The multicommodity flow problem (MCF) is to maximize the sum of the flow values of the commodities:

\max \sum_{i=1}^k \Big( \sum_{a \in \delta^+(s_i)} x^i_a - \sum_{a \in \delta^-(s_i)} x^i_a \Big)
\text{s.t.}\ \sum_{a \in \delta^+(v)} x^i_a - \sum_{a \in \delta^-(v)} x^i_a = 0, \quad v \in V \setminus \{s_i, t_i\}, \ i = 1, \ldots, k,
     \sum_{i=1}^k x^i_a \le u_a, \quad a \in A,
     x^i_a \ge 0, \quad a \in A, \ i = 1, \ldots, k.

This formulation is called the edge formulation of MCF. It is a linear program and can be solved in polynomial time. However, if we require the flows to be integer, i.e., x^i_a \in \mathbb{Z}_+, then MCF is NP-hard.

Path formulation

We now give a path formulation for MCF, which can be viewed as the reformulation of the edge formulation obtained by Dantzig-Wolfe decomposition. Let P_{s,t} be the set of all s-t paths and let P = P_{s_1,t_1} \cup \cdots \cup P_{s_k,t_k}. Let P_a denote the set of paths that use arc a, i.e., P_a = \{ p \in P : a \in p \}. For each p \in P_{s_i,t_i} we define a flow variable f_p. The multi-commodity flow problem can then be expressed as

(MCF')  \max \sum_{p \in P} f_p
        \text{s.t.}\ \sum_{p \in P_a} f_p \le u_a, \quad a \in A,
             f_p \ge 0, \quad p \in P.

Column generation for MCF

Let P' \subseteq P be a subset of all paths. Consider the restricted LP master problem:

(RMCF')  \max \sum_{p \in P'} f_p
         \text{s.t.}\ \sum_{p \in P'_a} f_p \le u_a, \quad a \in A,
              f_p \ge 0, \quad p \in P',

where P'_a = P_a \cap P'. Note that a feasible solution (f_p)_{p \in P'} of (RMCF') can be expanded to a feasible solution (f_p)_{p \in P} of (MCF') by setting f_p = 0 for p \in P \setminus P'. Thus v(RMCF') \le v(MCF').

The dual of (MCF') is

(DMCF')  \min \sum_{a \in A} u_a \mu_a
         \text{s.t.}\ \sum_{a \in p} \mu_a \ge 1, \quad p \in P,
              \mu_a \ge 0, \quad a \in A.

Reduced cost \le 0 for every path \Leftrightarrow the dual solution is feasible. So, if the optimal dual solution \mu of (RMCF') satisfies

\sum_{a \in p} \mu_a \ge 1, \quad \forall p \in P,

then the current solution is optimal to (MCF'). Otherwise, we need to find a new column (path) to improve (RMCF').

Checking whether the above inequality is satisfied for all paths is called the pricing subproblem. It can be solved efficiently by computing, for each commodity (s_i, t_i), i = 1, \ldots, k, a shortest s_i-t_i path with respect to the weight vector (\mu_a)_{a \in A}. If the shortest path lengths are all at least 1, then we must have

\sum_{a \in p} \mu_a \ge 1, \quad \forall p \in P.

Otherwise, we have found a path p with positive reduced cost 1 - \sum_{a \in p} \mu_a > 0, and we can add this path (column) to (RMCF'). Finding a shortest (s_i, t_i)-path in the graph (V, A) with nonnegative weights \mu_a \ge 0 is polynomial (e.g., by Dijkstra's algorithm).

Column generation algorithm for MCF

Repeat:
1. Solve (RMCF') for the current path set P'.
2. Let \mu be the optimal dual multiplier vector of (RMCF').
3. For i = 1, \ldots, k:
   find a shortest s_i-t_i path w.r.t. the weight vector (\mu_a);
   if the weight of the shortest path p is less than 1, add p to P'.
4. End for.
Until no path has been added.

How do we get an integer solution?
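Here is a minimal runnable sketch of this loop on a tiny made-up network (not from the lecture). It assumes a SciPy version whose HiGHS-based linprog exposes the capacity-constraint duals via res.ineqlin.marginals, and it uses scipy.sparse.csgraph.dijkstra for the shortest-path pricing; a tiny epsilon is added to the arc weights only so that zero-weight arcs stay stored in the sparse graph.

```python
import numpy as np
from scipy.optimize import linprog
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import dijkstra

# Directed arcs (tail, head) with capacities; two commodities, both from node 0 to node 3.
arcs = [(0, 1), (1, 3), (0, 2), (2, 3), (1, 2)]
u = np.array([2.0, 2.0, 1.0, 2.0, 1.0])
commodities = [(0, 3), (0, 3)]
n_nodes, arc_index = 4, {a: j for j, a in enumerate(arcs)}

def shortest_path(weights, s, t):
    """Return (length, list of arc indices) of a shortest s-t path for the given arc weights."""
    tails, heads = zip(*arcs)
    g = csr_matrix((np.asarray(weights) + 1e-12, (tails, heads)), shape=(n_nodes, n_nodes))
    dist, pred = dijkstra(g, directed=True, indices=s, return_predecessors=True)
    path, v = [], t
    while v != s:
        path.append(arc_index[(int(pred[v]), int(v))])
        v = int(pred[v])
    return float(dist[t]), path[::-1]

# Initial path set P': one path per commodity, shortest by arc count.
paths = [shortest_path(np.ones(len(arcs)), s, t)[1] for (s, t) in commodities]

while True:
    # Restricted master (RMCF'): max sum_p f_p  s.t.  sum_{p containing a} f_p <= u_a, f >= 0.
    A_ub = np.zeros((len(arcs), len(paths)))
    for j, p in enumerate(paths):
        A_ub[p, j] = 1.0
    res = linprog(-np.ones(len(paths)), A_ub=A_ub, b_ub=u, bounds=(0, None), method="highs")
    mu = -res.ineqlin.marginals            # duals of the capacity constraints, mu >= 0
    print(f"current total flow value = {-res.fun:.3f}")

    # Pricing: one shortest-path problem per commodity, with arc weights mu.
    added = False
    for (s, t) in commodities:
        length, p = shortest_path(mu, s, t)
        if length < 1.0 - 1e-6 and p not in paths:
            paths.append(p)                # reduced cost 1 - length > 0: new column
            added = True
    if not added:
        break
```

To recover an integer solution, one would embed this bounding loop into branch-and-price, branching on the original arc variables x^i_a as described earlier.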

Dantzig-Wolfe decomposition for MCF

We now show that the path formulation can be obtained from Dantzig-Wolfe decomposition. We first write the edge formulation in a compact form:

\max\ c^T x
\text{s.t.}\ Nx = 0,       (flow conservation)
     Ux \le u,             (capacity constraints)
     x \ge 0.

Let P = \{ x \ge 0 : Nx = 0 \}. This polyhedron is a cone: it has the single vertex 0 and a finite number of extreme rays r^1, \ldots, r^s. By the Minkowski-Weyl theorem, we have

P = cone\{ r^1, \ldots, r^s \} = \Big\{ \sum_{i=1}^s \lambda_i r^i \ :\ \lambda_i \ge 0 \Big\}.

Using the above expression for P, we can rewrite the edge formulation as

\max \sum_{i=1}^s \lambda_i (c^T r^i)
\text{s.t.}\ \sum_{i=1}^s \lambda_i (U r^i) \le u,
     \lambda_i \ge 0, \quad i = 1, \ldots, s.

Analyzing the structure of c and U, the above problem can be reduced to

\max \sum_{i=1}^s y_i
\text{s.t.}\ \sum_{i : a \in p^i} y_i \le u_a, \quad a \in A,
     y_i \ge 0, \quad i = 1, \ldots, s,

which is equivalent to the path formulation (see Nemhauser and Wolsey (1988) for more details).

References

1. F. Vanderbeck and M. W. P. Savelsbergh, A generic view of Dantzig-Wolfe decomposition in mixed integer programming, Operations Research Letters, 34(3), 296-306, 2006.
2. F. Vanderbeck and L. A. Wolsey, Reformulation and decomposition of integer programs, in 50 Years of Integer Programming 1958-2008, Part 2, pp. 431-502, 2010.
3. C. Barnhart, E. L. Johnson, G. L. Nemhauser, M. W. P. Savelsbergh, and P. H. Vance, Branch-and-price: column generation for solving huge integer programs, Operations Research, 46(3), 316-329, 1998.
4. M. Pfetsch, Multi-commodity flow and column generation, Lecture Notes, ZIB, 2006.