Using Sparsity to Design Primal Heuristics for MILPs: Two Stories


1 Using Sparsity to Design Primal Heuristics for MILPs: Two Stories. Santanu S. Dey. Joint work with: Andres Iroume, Marco Molinaro, Domenico Salvagnin, Qianyi Wang. MIP Workshop, 2017.

2 Sparsity in "real" Integer Programs (IPs). "Real" IPs are sparse: the average (median) fraction of non-zero entries in the constraint matrix of MIPLIB 2010 instances is 1.63% (0.17%). Many have an "arrow shape" [Bergner, Caprara, Furini, Lübbecke, Malaguti, Traversi 11] or an almost decomposable structure of the constraint matrix; two-stage stochastic IPs are a common example. As a proxy, we keep in mind decomposable instances.

3 Sparsity in "real" Integer Programs (IPs). "Real" IPs are sparse: the average (median) fraction of non-zero entries in the constraint matrix of MIPLIB 2010 instances is 1.63% (0.17%). Many have an "arrow shape" [Bergner, Caprara, Furini, Lübbecke, Malaguti, Traversi 11] or an almost decomposable structure of the constraint matrix; two-stage stochastic IPs are a common example. As a proxy, we keep in mind decomposable instances. Goal: can we exploit the sparsity of IPs while designing primal heuristics?

4 Story 1: Feasibility Pump + WalkSAT. Joint work with: Andres Iroume (Revenue Analytics, USA), Marco Molinaro (PUC-Rio, Brazil), and Domenico Salvagnin (IBM Italy and DEI, University of Padova, Italy).

5 Feasibility Pump [Fischetti, Glover, Lodi 05]. Vanilla Feasibility Pump. Input: a mixed-binary LP (with binary variables x and continuous variables y). Solve the linear programming relaxation, and let $(\bar x, \bar y)$ be an optimal solution. While $\bar x$ is not integral do: Round: round $\bar x$ to the closest 0/1 vector, call it $\tilde x$. Project: let $(\bar x, \bar y)$ be the point in the LP relaxation that minimizes $\sum_i |x_i - \tilde x_i|$ (we say $\bar x = \ell_1\text{-proj}(\tilde x)$).

6 Feasibility Pump [Fischetti, Glover, Lodi 05]. Vanilla Feasibility Pump. Input: a mixed-binary LP (with binary variables x and continuous variables y). Solve the linear programming relaxation, and let $(\bar x, \bar y)$ be an optimal solution. While $\bar x$ is not integral do: Round: round $\bar x$ to the closest 0/1 vector, call it $\tilde x$. Project: let $(\bar x, \bar y)$ be the point in the LP relaxation that minimizes $\sum_i |x_i - \tilde x_i|$ (we say $\bar x = \ell_1\text{-proj}(\tilde x)$). Problem: the above algorithm may cycle, i.e., revisit the same $\tilde x \in \{0,1\}^n$ in different iterations (stalling). Solution: randomly perturb $\tilde x$.

7 Feasibility Pump [Fischetti, Glover, Lodi 05]. Vanilla Feasibility Pump. Input: a mixed-binary LP (with binary variables x and continuous variables y). Solve the linear programming relaxation, and let $(\bar x, \bar y)$ be an optimal solution. While $\bar x$ is not integral do: Round: round $\bar x$ to the closest 0/1 vector, call it $\tilde x$. If stalling is detected: randomly perturb $\tilde x$ to a different 0/1 vector. Project ($\ell_1$-proj): let $(\bar x, \bar y)$ be the point in the LP relaxation that minimizes $\sum_i |x_i - \tilde x_i|$.
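To make the round / perturb / project loop concrete, here is a minimal Python sketch using SciPy's LP solver. The data layout (A for the binary columns, B for the continuous ones, rows read as Ax + By <= b), the stalling test, and the uniform perturbation are simplifications chosen here for illustration; this is not the implementation of [Fischetti, Glover, Lodi 05] or of Feasibility Pump 2.0.

```python
import numpy as np
from scipy.optimize import linprog

def vanilla_feasibility_pump(A, B, b, max_iter=1000, seed=0):
    """Sketch of the vanilla pump on {(x, y): A x + B y <= b, x in {0,1}^n, y >= 0}.
    A (m x n) holds the binary columns, B (m x d) the continuous ones (assumed layout).
    Assumes the LP relaxation is feasible."""
    rng = np.random.default_rng(seed)
    n, d = A.shape[1], B.shape[1]
    M = np.hstack([A, B])
    bounds = [(0, 1)] * n + [(0, None)] * d

    def l1_project(x_tilde):
        # For x_i in [0,1], |x_i - x_tilde_i| equals x_i when x_tilde_i = 0 and 1 - x_i when
        # x_tilde_i = 1, so the l1-projection is a single LP with a linear objective.
        c = np.concatenate([1.0 - 2.0 * x_tilde, np.zeros(d)])
        return linprog(c, A_ub=M, b_ub=b, bounds=bounds, method="highs").x[:n]

    # Start from any LP-relaxation point (zero objective).
    x_bar = linprog(np.zeros(n + d), A_ub=M, b_ub=b, bounds=bounds, method="highs").x[:n]
    prev_tilde = None
    for _ in range(max_iter):
        if np.allclose(x_bar, np.round(x_bar)):          # x_bar integral: feasible point found
            return np.round(x_bar)
        x_tilde = np.round(x_bar)                        # Round step
        if prev_tilde is not None and np.array_equal(x_tilde, prev_tilde):
            flips = rng.choice(n, size=int(rng.integers(1, n + 1)), replace=False)
            x_tilde[flips] = 1.0 - x_tilde[flips]        # crude stand-in for FP's perturbation
        prev_tilde = x_tilde.copy()
        x_bar = l1_project(x_tilde)                      # Project step
    return None                                          # give up after max_iter rounds
```

The perturbation here flips a uniformly random subset of variables; the actual FP perturbation is fractionality-driven (slide 48), and Story 1 replaces it with a WalkSAT-style rule.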

8 Feasibility Pump. FP is very successful in practice (for example, the original FP finds feasible solutions for 96.3% of the MIPLIB 2003 instances). Many improvements and generalizations: [Achterberg, Berthold 07], [Bertacco, Fischetti, Lodi 07], [Bonami, Cornuéjols, Lodi, Margot 09], [Fischetti, Salvagnin 09], [Boland, Eberhard, Engineer, Tsoukalas 12], [D'Ambrosio, Frangioni, Liberti, Lodi 12], [De Santis, Lucidi, Rinaldi 13], [Boland, Eberhard, Engineer, Fischetti, Savelsbergh, Tsoukalas 14], [Geißler, Morsi, Schewe, Schmidt 17], ... Some directions of research: take the objective function into account; mixed-integer programs with general integer variables; mixed-integer nonlinear programs (MINLP); alternative projection and rounding steps.

9 Feasibility Pump. FP is very successful in practice (for example, the original FP finds feasible solutions for 96.3% of the MIPLIB 2003 instances). Many improvements and generalizations: [Achterberg, Berthold 07], [Bertacco, Fischetti, Lodi 07], [Bonami, Cornuéjols, Lodi, Margot 09], [Fischetti, Salvagnin 09], [Boland, Eberhard, Engineer, Tsoukalas 12], [D'Ambrosio, Frangioni, Liberti, Lodi 12], [De Santis, Lucidi, Rinaldi 13], [Boland, Eberhard, Engineer, Fischetti, Savelsbergh, Tsoukalas 14], [Geißler, Morsi, Schewe, Schmidt 17], ... Some directions of research: take the objective function into account; mixed-integer programs with general integer variables; mixed-integer nonlinear programs (MINLP); alternative projection and rounding steps. The randomization step plays a significant role but has not been explicitly studied. We focus on changing the randomization step by "thinking about sparsity".

10 Sparse IPs, Decomposable IPs. As discussed earlier, real integer programs are sparse. A common example of sparse integer programs is those that are almost decomposable.

11 Sparse IPs, Decomposable IPs. As discussed earlier, real integer programs are sparse. A common example of sparse integer programs is those that are almost decomposable, e.g., two-stage stochastic programming. As a proxy, we keep in mind decomposable instances.

12 Agenda. Propose a modification of WalkSAT for the mixed-binary case. Show that this modified algorithm "works well" on mixed-binary instances that are decomposable.

13 Agenda. Propose a modification of WalkSAT for the mixed-binary case. Show that this modified algorithm "works well" on mixed-binary instances that are decomposable. Analyze a randomization step based on WalkSAT + Feasibility Pump. Show that this version of FP "works well" on single-row decomposable instances.

14 Agenda. Propose a modification of WalkSAT for the mixed-binary case. Show that this modified algorithm "works well" on mixed-binary instances that are decomposable. Analyze a randomization step based on WalkSAT + Feasibility Pump. Show that this version of FP "works well" on single-row decomposable instances. Implementation of FP with a new randomization step that combines ideas from the previous randomization and the new randomization. The new method shows a small but consistent improvement over FP.

15 1.1 WalkSAT

16 WalkSAT [Schöning 99]. WalkSAT is an effective primal heuristic used in the SAT community. WalkSAT for pure-binary IPs: Start with a uniformly random point $\tilde x \in \{0,1\}^n$. If feasible, done. While $\tilde x$ is infeasible do: pick any violated constraint and randomly pick a variable $x_i$ in its support; flip the value of $\tilde x_i$.
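A minimal Python sketch of this loop for a pure-binary system Ax <= b (the matrix encoding and the <=-row convention are assumptions made here for illustration):

```python
import numpy as np

def walksat_binary_ip(A, b, max_iter=100_000, seed=0):
    """WalkSAT-style search for x in {0,1}^n with A x <= b (pure-binary, <=-rows assumed)."""
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    x = rng.integers(0, 2, size=n)                    # uniformly random 0/1 start
    for _ in range(max_iter):
        violated = np.flatnonzero(A @ x > b)          # indices of violated constraints
        if violated.size == 0:
            return x                                  # feasible point found
        row = rng.choice(violated)                    # pick any violated constraint
        support = np.flatnonzero(A[row] != 0)         # variables appearing in that constraint
        x[rng.choice(support)] ^= 1                   # flip one of them at random
    return None
```

Each pass touches only the support of one violated row, which is what makes the analysis below essentially local to a single block on decomposable instances.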

17 Performance of WalkSAT. [Schöning 99]: WalkSAT returns a feasible solution within $2^n$ iterations, in expectation. Key ideas:

18 Performance of WalkSAT. [Schöning 99]: WalkSAT returns a feasible solution within $2^n$ iterations, in expectation. Key ideas: Consider a fixed integer feasible solution $x^*$. Track the number of coordinates in which $\tilde x$ differs from $x^*$.

19 Performance of WalkSAT. [Schöning 99]: WalkSAT returns a feasible solution within $2^n$ iterations, in expectation. Key ideas: Consider a fixed integer feasible solution $x^*$. Track the number of coordinates in which $\tilde x$ differs from $x^*$. In each step, with probability at least $1/s$, where $s$ is the number of non-zeros in the violated constraint, we choose to flip a coordinate where they differ.

20 Performance of WalkSAT. [Schöning 99]: WalkSAT returns a feasible solution within $2^n$ iterations, in expectation. Key ideas: Consider a fixed integer feasible solution $x^*$. Track the number of coordinates in which $\tilde x$ differs from $x^*$. In each step, with probability at least $1/s$ (a positive constant), where $s$ is the number of non-zeros in the violated constraint, we choose to flip a coordinate where they differ. So with probability at least $1/s$ we reduce by 1 the number of coordinates in which they differ.

21 WalkSAT is good for decomposable instances. (As a proxy, we keep in mind decomposable instances, e.g., 2-stage stochastic programming.) Observation: each iteration depends only on one part, so the overall execution can be split into independent executions over each part. Put together the bound from the previous page over all parts.

22 WalkSAT is good for decomposable instances. (As a proxy, we keep in mind decomposable instances, e.g., 2-stage stochastic programming.) Observation: each iteration depends only on one part, so the overall execution can be split into independent executions over each part. Put together the bound from the previous page over all parts. Consequences: find a feasible solution in $k \cdot 2^{n/k}$ iterations, in expectation. Compare this to total enumeration: $2^n$ iterations.
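For a sense of scale, with illustrative numbers (not from the talk): $n = 60$ binary variables split into $k = 6$ independent blocks of $n/k = 10$ variables each gives

```latex
k \cdot 2^{n/k} = 6 \cdot 2^{10} = 6144
\qquad \text{versus} \qquad
2^{n} = 2^{60} \approx 1.15 \times 10^{18}.
```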

23 1.2 Mixed-binary version of WalkSAT

24 WalkSAT($\ell$) for mixed-binary IPs. Input: mixed-binary LP relaxation $P = \{(x, y) : Ax + By \le b\}$ (with binary variables x and continuous variables y); parameter $\ell$. Start with a uniformly random point $\tilde x \in \{0,1\}^n$. If there exists $\bar y$ such that $(\tilde x, \bar y)$ is feasible, done. While $\tilde x \notin \mathrm{Proj}_x(P)$ do: Generate a minimal projected certificate of infeasibility $\lambda^\top A x \le \lambda^\top b$, i.e., a multiplier vector $\lambda$ such that: 1. the inequality is valid for $\mathrm{Proj}_x(P)$, i.e., $\lambda \ge 0$ and $\lambda^\top B = 0$; 2. it is violated by $\tilde x$: $(\lambda^\top A)\tilde x > \lambda^\top b$; 3. $\lambda$ is minimal with respect to its support. (It can be obtained by solving an LP.) Randomly pick $\ell$ variables (with replacement) in the support of the minimal projected certificate and flip their values.
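The certificate step can be sketched as a small LP. The formulation below (minimize the sum of multipliers subject to one unit of violation) is one plausible way to obtain such a dual vector; the paper's exact LP and the way it enforces minimality of the support may differ, so treat the details as assumptions.

```python
import numpy as np
from scipy.optimize import linprog

def projected_certificate(A, B, b, x_tilde):
    """Find lambda >= 0 with B^T lambda = 0 such that the aggregated inequality
    (lambda^T A) x <= lambda^T b is violated by x_tilde; returns (lambda, support) or None
    if no certificate exists (i.e. x_tilde extends to a point of the LP relaxation)."""
    m = A.shape[0]
    viol = A @ x_tilde - b                               # per-row violation at x_tilde
    res = linprog(
        c=np.ones(m),                                    # keep the multiplier vector small
        A_ub=-viol.reshape(1, -1), b_ub=[-1.0],          # require total violation >= 1
        A_eq=B.T, b_eq=np.zeros(B.shape[1]),             # lambda^T B = 0: continuous part cancels
        bounds=[(0, None)] * m, method="highs",
    )
    if not res.success:
        return None
    lam = res.x
    support = np.flatnonzero(np.abs(A.T @ lam) > 1e-9)   # binary variables in the certificate
    return lam, support
```

A vertex optimal solution of such an LP tends to have few positive multipliers, which keeps the flipped support small on sparse, nearly decomposable instances.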

25 Key Observation: if a set is decomposable, then a minimal certificate has support contained in exactly one of the disjoint blocks of variables.

26 Theorem. Consider a feasible decomposable mixed-binary set $P^I = P^I_1 \times \dots \times P^I_k$, where for all $i \in [k]$ we have $P^I_i = P_i \cap (\{0,1\}^{n_i} \times \mathbb{R}^{d_i})$ with $P_i = \{(x^i, y^i) \in [0,1]^{n_i} \times \mathbb{R}^{d_i} : A^i x^i + B^i y^i \le c^i\}$. (1) Let $s_i$ be such that each constraint in $P_i$ has at most $s_i$ binary variables, and define $\gamma_i := \min\{s_i(d_i + 1), n_i\}$. Then with probability at least $1 - \delta$, mixed-binary WalkSAT($\ell$) returns a feasible solution for (1) within $\ln(k/\delta) \sum_{i \in [k]} n_i\, 2^{\,n_i \log \gamma_i}$ iterations.

27 1.3 Feasibility Pump + WalkSAT

28 Feasibility Pump + WalkSAT (FPW). Vanilla Feasibility Pump. Input: a mixed-binary LP (with binary variables x and continuous variables y). Solve the linear programming relaxation, and let $(\bar x, \bar y)$ be an optimal solution. While $\bar x$ is not integral do: Round: round $\bar x$ to the closest 0/1 vector, call it $\tilde x$. If stalling is detected: "randomly" perturb $\tilde x$ to a different 0/1 vector. Project: let $(\bar x, \bar y)$ be the point in the LP relaxation that minimizes $\sum_i |x_i - \tilde x_i|$.

29 Feasibility Pump + WalkSAT (FPW). Feasibility Pump + WalkSAT. Input: a mixed-binary LP (with binary variables x and continuous variables y). Solve the linear programming relaxation, and let $(\bar x, \bar y)$ be an optimal solution. While $\bar x$ is not integral do: Round: round $\bar x$ to the closest 0/1 vector, call it $\tilde x$. If stalling is detected: perturb $\tilde x$ to a different 0/1 vector, using mixed-binary WalkSAT($\ell$) for the random update. Project: let $(\bar x, \bar y)$ be the point in the LP relaxation that minimizes $\sum_i |x_i - \tilde x_i|$.
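In code, the only change relative to the vanilla pump sketch after slide 7 is the stalling branch. A hedged sketch, reusing projected_certificate from the sketch after slide 24 and the same assumed data layout:

```python
def fpw_perturb(x_tilde, A, B, b, l, rng):
    """WalkSAT-style replacement for FP's random perturbation: flip l variables drawn
    (with replacement, as on slide 24) from the support of a projected certificate.
    Depends on projected_certificate() defined in the earlier sketch."""
    cert = projected_certificate(A, B, b, x_tilde)
    if cert is None:                                  # x_tilde already extends to an LP point
        return x_tilde
    _, support = cert
    x_new = x_tilde.copy()
    for i in rng.choice(support, size=l, replace=True):
        x_new[i] = 1 - x_new[i]                       # drawing a variable twice flips it back
    return x_new
```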

30 Analysis of Feasibility Pump + WalkSAT (FPW). 1. We are not able to analyze this algorithm (FPW) for a general mixed-binary IP. Issue: from the previous proof, with probability $1/s_j$ the randomization makes progress, but the projection + rounding in the next iteration could ruin everything.

31 Analysis of Feasibility Pump + WalkSAT (FPW). 1. We are not able to analyze this algorithm (FPW) for a general mixed-binary IP. Issue: from the previous proof, with probability $1/s_j$ the randomization makes progress, but the projection + rounding in the next iteration could ruin everything. 2. We can analyze the running time for decomposable 1-row instances, i.e., instances of the following kind: for each $i \in [k]$, $a^i x^i + b^i y^i = c^i$ with $x^i \in \{0,1\}^{n_i}$ and $y^i \in \mathbb{R}^{d_i}_+$.

32 Main result. Theorem. Consider a feasible decomposable 1-row instance as shown on the previous slide. Then with probability at least $1 - \delta$, Feasibility Pump + WalkSAT(2) returns a feasible solution within $T = \ln(k/\delta) \sum_{i \in [k]} n_i (n_i + 1)\, 2^{\,2 n_i \log n_i} \le \ln(k/\delta)\, k\, (\bar n + 1)^2\, 2^{\,2 \bar n \log \bar n}$ iterations, where $\bar n = \max_i n_i$.

33 Main result. Theorem. Consider a feasible decomposable 1-row instance as shown on the previous slide. Then with probability at least $1 - \delta$, Feasibility Pump + WalkSAT(2) returns a feasible solution within $T = \ln(k/\delta) \sum_{i \in [k]} n_i (n_i + 1)\, 2^{\,2 n_i \log n_i} \le \ln(k/\delta)\, k\, (\bar n + 1)^2\, 2^{\,2 \bar n \log \bar n}$ iterations, where $\bar n = \max_i n_i$. Note: the naive Feasibility Pump with the original randomization may fail to converge on these instances.

34 Proof sketch - I. (Like before) We can split the execution into independent executions over each constraint.

35 Proof sketch - I. (Like before) We can split the execution into independent executions over each constraint. Notation: for $\tilde x \in \{0,1\}^n$, $\mathrm{AltProj}(\tilde x) := \mathrm{round}(\ell_1\text{-proj}(\tilde x))$.

36 Proof sketch - I. (Like before) We can split the execution into independent executions over each constraint. Notation: for $\tilde x \in \{0,1\}^n$, $\mathrm{AltProj}(\tilde x) := \mathrm{round}(\ell_1\text{-proj}(\tilde x))$. Proposition (Length of cycle): all cycles are due to short cycles, i.e., randomization is invoked only when $\mathrm{AltProj}(\tilde x) = \tilde x$.

37 Proof sketch - I. (Like before) We can split the execution into independent executions over each constraint. Notation: for $\tilde x \in \{0,1\}^n$, $\mathrm{AltProj}(\tilde x) := \mathrm{round}(\ell_1\text{-proj}(\tilde x))$. Proposition (Length of cycle): all cycles are due to short cycles, i.e., randomization is invoked only when $\mathrm{AltProj}(\tilde x) = \tilde x$. Notation (Stabilization): $\mathrm{AltProj}^\infty(\tilde x) := \mathrm{AltProj}^k(\tilde x)$ for $k \in \mathbb{Z}_{++}$ such that $\mathrm{AltProj}^k(\tilde x) = \mathrm{AltProj}^{k+1}(\tilde x)$.

38 Proof sketch - I. (Like before) We can split the execution into independent executions over each constraint. Notation: for $\tilde x \in \{0,1\}^n$, $\mathrm{AltProj}(\tilde x) := \mathrm{round}(\ell_1\text{-proj}(\tilde x))$. Proposition (Length of cycle): all cycles are due to short cycles, i.e., randomization is invoked only when $\mathrm{AltProj}(\tilde x) = \tilde x$. Notation (Stabilization): $\mathrm{AltProj}^\infty(\tilde x) := \mathrm{AltProj}^k(\tilde x)$ for $k \in \mathbb{Z}_{++}$ such that $\mathrm{AltProj}^k(\tilde x) = \mathrm{AltProj}^{k+1}(\tilde x)$. Then: # of iterations of FPW $\le$ [# of iterations of the WalkSAT + $\mathrm{AltProj}^\infty$ process] (the number of stallings) $\times$ $\max_{\tilde x \in \{0,1\}^n} \min\{k : \mathrm{AltProj}^k(\tilde x) = \mathrm{AltProj}^\infty(\tilde x)\}$ (the worst-case stabilization time).

39 Proof sketch - I. (Like before) We can split the execution into independent executions over each constraint. Notation: for $\tilde x \in \{0,1\}^n$, $\mathrm{AltProj}(\tilde x) := \mathrm{round}(\ell_1\text{-proj}(\tilde x))$. Proposition (Length of cycle): all cycles are due to short cycles, i.e., randomization is invoked only when $\mathrm{AltProj}(\tilde x) = \tilde x$. Notation (Stabilization): $\mathrm{AltProj}^\infty(\tilde x) := \mathrm{AltProj}^k(\tilde x)$ for $k \in \mathbb{Z}_{++}$ such that $\mathrm{AltProj}^k(\tilde x) = \mathrm{AltProj}^{k+1}(\tilde x)$. Then: # of iterations of FPW $\le$ [# of iterations of the WalkSAT + $\mathrm{AltProj}^\infty$ process] (the number of stallings) $\times$ $\max_{\tilde x \in \{0,1\}^n} \min\{k : \mathrm{AltProj}^k(\tilde x) = \mathrm{AltProj}^\infty(\tilde x)\}$ (the worst-case stabilization time). Proposition (Worst-case stabilization time): for any $\tilde x \in \{0,1\}^n$, $\mathrm{AltProj}^{n+1}(\tilde x) = \mathrm{AltProj}^\infty(\tilde x)$.

40 Proof sketch - II. $[x_2 := \mathrm{AltProj}^\infty(\tilde x_1)]$ $[\tilde x_2 := \mathrm{WalkSAT}(x_2)]$ $[x_3 := \mathrm{AltProj}^\infty(\tilde x_2)]$ $[\tilde x_3 := \mathrm{WalkSAT}(x_3)]$ $[x_4 := \mathrm{AltProj}^\infty(\tilde x_3)]$ $[\tilde x_4 := \mathrm{WalkSAT}(x_4)]$ ... Like last time, we target a point $x^* \in \mathrm{Proj}_x(P) \cap \{0,1\}^n$.

41 Proof sketch - II. $[x_2 := \mathrm{AltProj}^\infty(\tilde x_1)]$ $[\tilde x_2 := \mathrm{WalkSAT}(x_2)]$ $[x_3 := \mathrm{AltProj}^\infty(\tilde x_2)]$ $[\tilde x_3 := \mathrm{WalkSAT}(x_3)]$ $[x_4 := \mathrm{AltProj}^\infty(\tilde x_3)]$ $[\tilde x_4 := \mathrm{WalkSAT}(x_4)]$ ... Like last time, we target a point $x^* \in \mathrm{Proj}_x(P) \cap \{0,1\}^n$. Key question: what if we get closer to $x^*$ in the WalkSAT step, but then move far away in the AltProj step?

42 Proof sketch - II. $[x_2 := \mathrm{AltProj}^\infty(\tilde x_1)]$ $[\tilde x_2 := \mathrm{WalkSAT}(x_2)]$ $[x_3 := \mathrm{AltProj}^\infty(\tilde x_2)]$ $[\tilde x_3 := \mathrm{WalkSAT}(x_3)]$ $[x_4 := \mathrm{AltProj}^\infty(\tilde x_3)]$ $[\tilde x_4 := \mathrm{WalkSAT}(x_4)]$ ... Like last time, we target a point $x^* \in \mathrm{Proj}_x(P) \cap \{0,1\}^n$. Key question: what if we get closer to $x^*$ in the WalkSAT step, but then move far away in the AltProj step? Lemma: consider a point $x^* \in \mathrm{Proj}_x(P) \cap \{0,1\}^n$ and a point $\tilde x \in \{0,1\}^n$ not in $\mathrm{Proj}_x(P) \cap \{0,1\}^n$, and suppose $\mathrm{AltProj}(\tilde x) = \tilde x$.

43 Proof sketch - II. $[x_2 := \mathrm{AltProj}^\infty(\tilde x_1)]$ $[\tilde x_2 := \mathrm{WalkSAT}(x_2)]$ $[x_3 := \mathrm{AltProj}^\infty(\tilde x_2)]$ $[\tilde x_3 := \mathrm{WalkSAT}(x_3)]$ $[x_4 := \mathrm{AltProj}^\infty(\tilde x_3)]$ $[\tilde x_4 := \mathrm{WalkSAT}(x_4)]$ ... Like last time, we target a point $x^* \in \mathrm{Proj}_x(P) \cap \{0,1\}^n$. Key question: what if we get closer to $x^*$ in the WalkSAT step, but then move far away in the AltProj step? Lemma: consider a point $x^* \in \mathrm{Proj}_x(P) \cap \{0,1\}^n$ and a point $\tilde x \in \{0,1\}^n$ not in $\mathrm{Proj}_x(P) \cap \{0,1\}^n$, and suppose $\mathrm{AltProj}(\tilde x) = \tilde x$. Then there is a point $x' \in \{0,1\}^n$ satisfying: 1. (close to $\tilde x$) $\|x' - \tilde x\|_0 \le 2$; 2. (closer to $x^*$) $\|x' - x^*\|_0 \le \|\tilde x - x^*\|_0 - 1$.

44 Proof sketch - II. $[x_2 := \mathrm{AltProj}^\infty(\tilde x_1)]$ $[\tilde x_2 := \mathrm{WalkSAT}(x_2)]$ $[x_3 := \mathrm{AltProj}^\infty(\tilde x_2)]$ $[\tilde x_3 := \mathrm{WalkSAT}(x_3)]$ $[x_4 := \mathrm{AltProj}^\infty(\tilde x_3)]$ $[\tilde x_4 := \mathrm{WalkSAT}(x_4)]$ ... Like last time, we target a point $x^* \in \mathrm{Proj}_x(P) \cap \{0,1\}^n$. Key question: what if we get closer to $x^*$ in the WalkSAT step, but then move far away in the AltProj step? Lemma: consider a point $x^* \in \mathrm{Proj}_x(P) \cap \{0,1\}^n$ and a point $\tilde x \in \{0,1\}^n$ not in $\mathrm{Proj}_x(P) \cap \{0,1\}^n$, and suppose $\mathrm{AltProj}(\tilde x) = \tilde x$. Then there is a point $x' \in \{0,1\}^n$ satisfying: 1. (close to $\tilde x$) $\|x' - \tilde x\|_0 \le 2$; 2. (closer to $x^*$) $\|x' - x^*\|_0 \le \|\tilde x - x^*\|_0 - 1$; 3. (projection control) $\|\ell_1\text{-proj}(x') - x'\|_1 \le 1$. Moreover, if we have equality $\|\ell_1\text{-proj}(x') - x'\|_1 = 1$ in Item 3, then $\|x' - x^*\|_0 \le \|\tilde x - x^*\|_0 - 2$.

45 Proof sketch - II. $[x_2 := \mathrm{AltProj}^\infty(\tilde x_1)]$ $[\tilde x_2 := \mathrm{WalkSAT}(x_2)]$ $[x_3 := \mathrm{AltProj}^\infty(\tilde x_2)]$ $[\tilde x_3 := \mathrm{WalkSAT}(x_3)]$ $[x_4 := \mathrm{AltProj}^\infty(\tilde x_3)]$ $[\tilde x_4 := \mathrm{WalkSAT}(x_4)]$ ... Like last time, we target a point $x^* \in \mathrm{Proj}_x(P) \cap \{0,1\}^n$. Key question: what if we get closer to $x^*$ in the WalkSAT step, but then move far away in the AltProj step? Lemma: consider a point $x^* \in \mathrm{Proj}_x(P) \cap \{0,1\}^n$ and a point $\tilde x \in \{0,1\}^n$ not in $\mathrm{Proj}_x(P) \cap \{0,1\}^n$, and suppose $\mathrm{AltProj}(\tilde x) = \tilde x$. Then there is a point $x' \in \{0,1\}^n$ satisfying: 1. (close to $\tilde x$) $\|x' - \tilde x\|_0 \le 2$; 2. (closer to $x^*$) $\|x' - x^*\|_0 \le \|\tilde x - x^*\|_0 - 1$; 3. (projection control) $\|\ell_1\text{-proj}(x') - x'\|_1 \le 1$. Moreover, if we have equality $\|\ell_1\text{-proj}(x') - x'\|_1 = 1$ in Item 3, then $\|x' - x^*\|_0 \le \|\tilde x - x^*\|_0 - 2$. Corollary: let $x^*$ be a coordinate-wise maximal solution in $\{0,1\}^n \cap \mathrm{Proj}_x(P)$. Consider any point $\tilde x \in \{0,1\}^n \setminus \mathrm{Proj}_x(P)$ satisfying $\mathrm{AltProj}^\infty(\tilde x) = \tilde x$, and let $x' \in \{0,1\}^n$ be a point constructed in the Lemma above with respect to $\tilde x$ and $x^*$. Then $\|\mathrm{AltProj}^\infty(x') - x^*\|_0 \le \|\tilde x - x^*\|_0$.

46 1.4

47 Proposed randomization. All features (such as constraint propagation) that are part of the Feasibility Pump 2.0 code ([Fischetti, Salvagnin 09]) have been left unchanged. The only change is in the randomization step.

48 Proposed randomization. All features (such as constraint propagation) that are part of the Feasibility Pump 2.0 code ([Fischetti, Salvagnin 09]) have been left unchanged. The only change is in the randomization step. Old randomization: define the fractionality of the i-th variable as $|\bar x_i - \tilde x_i|$. Let $F$ be the number of variables with positive fractionality. Randomly generate an integer $TT$ (uniformly from $\{10, \dots, 30\}$). Flip the $\min\{F, TT\}$ variables with highest fractionality.

49 Proposed randomization. All features (such as constraint propagation) that are part of the Feasibility Pump 2.0 code ([Fischetti, Salvagnin 09]) have been left unchanged. The only change is in the randomization step. Old randomization: define the fractionality of the i-th variable as $|\bar x_i - \tilde x_i|$. Let $F$ be the number of variables with positive fractionality. Randomly generate an integer $TT$ (uniformly from $\{10, \dots, 30\}$). Flip the $\min\{F, TT\}$ variables with highest fractionality. New randomization: flip the $\min\{F, TT\}$ variables with highest fractionality. If $F < TT$, then let $S$ be the union of the supports of the constraints that are not satisfied by the current point $(\bar x, \bar y)$; select uniformly at random $\min\{|S|, TT - F\}$ indices from $S$, and flip the values in $\tilde x$ for all the selected indices.
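A hedged Python sketch of this combined rule. The pure-binary, <=-row data and the violation check at the rounded point are simplifications for illustration; the actual change lives inside the Feasibility Pump 2.0 code and works with the mixed-binary point $(\bar x, \bar y)$.

```python
import numpy as np

def combined_perturbation(x_bar, x_tilde, A, b, rng):
    """Fractionality-driven flips first; if fewer than TT variables are fractional, draw the
    remaining flips from the supports of the currently violated constraints (the new part)."""
    frac = np.abs(x_bar - np.round(x_bar))            # fractionality of each variable
    F = int(np.count_nonzero(frac > 1e-9))
    TT = int(rng.integers(10, 31))                    # uniform in {10, ..., 30}

    x_new = x_tilde.copy()
    flips = np.argsort(-frac)[:min(F, TT)]            # most fractional variables first
    x_new[flips] = 1 - x_new[flips]

    if F < TT:                                        # not enough fractional variables
        violated = np.flatnonzero(A @ x_new > b)      # rows violated by the rounded point
        if violated.size:
            S = np.unique(np.concatenate([np.flatnonzero(A[r]) for r in violated]))
            extra = rng.choice(S, size=min(S.size, TT - F), replace=False)
            x_new[extra] = 1 - x_new[extra]
    return x_new
```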

50 Computational experiments. Two classes of instances: 1. Two-stage stochastic models (randomly generated): $Ax + D_i y^i \le b^i$ for $i \in \{1, \dots, k\}$, with $x \in \{0,1\}^p$ and $y^i \in \{0,1\}^q$ for $i \in \{1, \dots, k\}$. 2. MIPLIB 2010. Two algorithms: 1. FP: Feasibility Pump. 2. FPWM: Feasibility Pump with the modified randomization above.

51 Results for stochastic instances. [Table: aggregated results by seed (10 seeds) on two-stage stochastic models, comparing FP and FPWM on # found, iterations, time (s), % gap, and % modified (for FPWM).] 150 instances, $k \in \{10, 20, 30, 40, 50\}$, $p = q \in \{10, 20\}$.

52 Results for MIPLIB 2010 instances. [Table: aggregated results by seed (10 seeds) on MIPLIB 2010, comparing FP and FPWM on # found, iterations, time (s), % gap, and % modified (for FPWM).]

53 Conclusions. First ever analysis of the running time of the Feasibility Pump (even if it is for a special class of instances). The suggested changes are trivial to implement and appear to dominate the Feasibility Pump almost consistently.

54 Conclusions. First ever analysis of the running time of the Feasibility Pump (even if it is for a special class of instances). The suggested changes are trivial to implement and appear to dominate the Feasibility Pump almost consistently. "Designing for sparse instances" helps!

55 Story 2. Joint work with: Qianyi Wang (Uber Inc.) and Marco Molinaro (PUC-Rio, Brazil).

56 Packing instances. $\max\; c^\top x$ s.t. $Ax \le b$, $x \in \mathbb{Z}^n_+$, where $A$ and $b$ are non-negative.

57 Packing instances. $\max\; c^\top x$ s.t. $Ax \le b$, $x \in \mathbb{Z}^n_+$, where $A$ and $b$ are non-negative. High-level sketch of the algorithm: the algorithm runs in two phases.

58 Packing instances. $\max\; c^\top x$ s.t. $Ax \le b$, $x \in \mathbb{Z}^n_+$, where $A$ and $b$ are non-negative. High-level sketch of the algorithm: the algorithm runs in two phases. 1. In the first phase, the sparse packing problem is partitioned into smaller parts and these small integer programs are solved.

59 Packing instances. $\max\; c^\top x$ s.t. $Ax \le b$, $x \in \mathbb{Z}^n_+$, where $A$ and $b$ are non-negative. High-level sketch of the algorithm: the algorithm runs in two phases. 1. In the first phase, the sparse packing problem is partitioned into smaller parts and these small integer programs are solved. 2. In the second phase, the optimal solutions of the smaller problems are patched together into a feasible solution for the original problem by exploiting the sparsity structure of the constraint matrix.

60 2.1 Key ideas

61 Key idea - I. Suppose the matrix A looks like this: 5 blocks of variables; unshaded boxes correspond to zeros in A. $A^1 x^1 + A^2 x^2 + A^3 x^3 + A^4 x^4 + A^5 x^5 \le b$, $x^i \in \mathbb{Z}^{n_i}_+$.

62 Key idea - I. Suppose we fix the variables in blocks {2, 3, 4, 5} to zero and find a non-zero feasible solution in the variables of block 1: $A^1 x^1 + A^2 \cdot 0 + A^3 \cdot 0 + A^4 \cdot 0 + A^5 \cdot 0 \le b$, $x^i \in \mathbb{Z}^{n_i}_+$. Let this solution be $(\tilde x^1, 0, 0, 0, 0)$.

63 Key idea - I. Similarly, we can fix the variables in blocks {1, 3, 4, 5} to zero and find a non-zero feasible solution in the variables of block 2: $A^1 \cdot 0 + A^2 x^2 + A^3 \cdot 0 + A^4 \cdot 0 + A^5 \cdot 0 \le b$, $x^i \in \mathbb{Z}^{n_i}_+$. Let this solution be $(0, \tilde x^2, 0, 0, 0)$.

64 Key idea - I. Is $(\tilde x^1, 0, 0, 0, 0) + (0, \tilde x^2, 0, 0, 0)$ guaranteed to be a feasible solution?

65 Key idea - I. Is $(\tilde x^1, 0, 0, 0, 0) + (0, \tilde x^2, 0, 0, 0)$ guaranteed to be a feasible solution? No! Some row of A has non-zero entries in both block 1 and block 2, so the two contributions can add up and violate that constraint.

66 Key idea - I. Is $(\tilde x^1, 0, 0, 0, 0) + (0, 0, \tilde x^3, 0, 0)$ guaranteed to be a feasible solution?

67 Key idea - I. Is $(\tilde x^1, 0, 0, 0, 0) + (0, 0, \tilde x^3, 0, 0)$ guaranteed to be a feasible solution? Yes! No row of A has non-zero entries in both block 1 and block 3, so each constraint is affected by at most one of the two solutions.

68 Key idea - I: a graph-theoretic perspective.

69 Key idea - I: a graph-theoretic perspective. {1, 2} is not a stable set in the graph!

70 Key idea - I: a graph-theoretic perspective. {1, 3} is a stable set in the graph!

71 Key idea - II: generalizing the notion of stable sets.

72 Key idea - II: generalizing the notion of stable sets.

73 Key idea - II: generalizing the notion of stable sets. {{2, 3}, {5}} is a general stable set in the graph!

74 2.2 Some definitions

75 Describing the sparsity of A. The matrix A with: 1. column partition $\mathcal{J} := \{J_1, \dots, J_6\}$; 2. unshaded boxes correspond to zeros in A; 3. shaded boxes may have non-zero entries.

76 Describing the sparsity of A. The matrix A with: 1. column partition $\mathcal{J} := \{J_1, \dots, J_6\}$; 2. unshaded boxes correspond to zeros in A; 3. shaded boxes may have non-zero entries. The corresponding interaction graph $G_{A,\mathcal{J}}$: 1. one node for every block of variables; 2. $(v_i, v_j) \in E$ if and only if there is a row in A with non-zero entries in both parts $J_i$ and $J_j$.
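A small Python sketch of this construction; the encoding of the partition as a list of column-index collections and the use of networkx are choices made here for illustration:

```python
import numpy as np
import networkx as nx

def interaction_graph(A, blocks, tol=1e-12):
    """Build the interaction graph G_{A,J}: one node per block of columns, and an edge (i, j)
    whenever some row of A has a non-zero entry in both block i and block j."""
    G = nx.Graph()
    G.add_nodes_from(range(len(blocks)))
    for row in np.abs(np.asarray(A)) > tol:            # boolean non-zero pattern, row by row
        touched = [i for i, J in enumerate(blocks) if row[list(J)].any()]
        for a in range(len(touched)):
            for c in range(a + 1, len(touched)):
                G.add_edge(touched[a], touched[c])
    return G
```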

77 Some graph-theoretic definitions I: general stable set. Definition (General stable set): let $G = (V, E)$ be a simple graph and let $\mathcal{V}$ be a collection of subsets of the vertices $V$. We call a collection of subsets of vertices $\mathcal{M} \subseteq 2^V$ a general stable set subordinate to $\mathcal{V}$ if the following hold: 1. every set in $\mathcal{M}$ is contained in a set in $\mathcal{V}$; 2. the sets in $\mathcal{M}$ are pairwise disjoint; 3. there are no edges of G with endpoints in distinct sets in $\mathcal{M}$.

78 Some graph-theoretic definitions I: general stable set. Definition (General stable set): let $G = (V, E)$ be a simple graph and let $\mathcal{V}$ be a collection of subsets of the vertices $V$. We call a collection of subsets of vertices $\mathcal{M} \subseteq 2^V$ a general stable set subordinate to $\mathcal{V}$ if the following hold: 1. every set in $\mathcal{M}$ is contained in a set in $\mathcal{V}$; 2. the sets in $\mathcal{M}$ are pairwise disjoint; 3. there are no edges of G with endpoints in distinct sets in $\mathcal{M}$. Example: the graph is the 5-cycle on $v_1, \dots, v_5$; with $\mathcal{V} = \{\{v_1, v_2\}, \{v_2, v_3\}, \{v_3, v_4\}, \{v_4, v_5\}, \{v_1, v_5\}\}$, the collection $\mathcal{M} = \{\{v_3\}, \{v_1, v_5\}\}$ is a general stable set subordinate to $\mathcal{V}$.
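The three conditions are easy to check mechanically. Below is a hedged Python sketch (graph and vertex sets represented with networkx and Python sets, a representation chosen here), followed by the slide's 5-cycle example:

```python
import networkx as nx

def is_general_stable_set(G, V_cal, M):
    """Check the definition: every set of M lies inside some set of V_cal, the sets of M are
    pairwise disjoint, and no edge of G joins two distinct sets of M."""
    M = [frozenset(S) for S in M]
    V_cal = [frozenset(W) for W in V_cal]
    contained = all(any(S <= W for W in V_cal) for S in M)
    disjoint = all(S.isdisjoint(T) for i, S in enumerate(M) for T in M[i + 1:])
    no_cross_edges = all(not ((u in S and v in T) or (u in T and v in S))
                         for u, v in G.edges()
                         for i, S in enumerate(M) for T in M[i + 1:])
    return contained and disjoint and no_cross_edges

C5 = nx.cycle_graph([1, 2, 3, 4, 5])                       # vertices stand in for v1..v5
V_cal = [{1, 2}, {2, 3}, {3, 4}, {4, 5}, {1, 5}]
print(is_general_stable_set(C5, V_cal, [{3}, {1, 5}]))     # True: the slide's example
print(is_general_stable_set(C5, V_cal, [{1}, {2}]))        # False: edge (v1, v2) crosses the sets
```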

79 Some graph-theoretic definitions II: general-chromatic number. Consider a simple graph $G = (V, E)$ and a collection $\mathcal{V}$ of subsets of vertices. General-chromatic number with respect to $\mathcal{V}$, denoted $\eta_{\mathcal{V}}(G)$: the smallest number of general stable sets $\mathcal{M}_1, \dots, \mathcal{M}_k$ subordinate to $\mathcal{V}$ that cover all vertices of the graph (that is, every vertex $v \in V$ belongs to a set in one of the $\mathcal{M}_i$'s).

80 Some graph-theoretic definitions II: general-chromatic number. Consider a simple graph $G = (V, E)$ and a collection $\mathcal{V}$ of subsets of vertices. General-chromatic number with respect to $\mathcal{V}$, denoted $\eta_{\mathcal{V}}(G)$: the smallest number of general stable sets $\mathcal{M}_1, \dots, \mathcal{M}_k$ subordinate to $\mathcal{V}$ that cover all vertices of the graph (that is, every vertex $v \in V$ belongs to a set in one of the $\mathcal{M}_i$'s). Fractional general-chromatic number with respect to $\mathcal{V}$, denoted $\bar\eta_{\mathcal{V}}(G)$: given a general stable set $\mathcal{M}$ subordinate to $\mathcal{V}$, let $\chi^{\mathcal{M}} \in \{0,1\}^V$ denote its incidence vector (for each vertex $v \in V$, $\chi^{\mathcal{M}}(v) = 1$ if $v$ belongs to a set in $\mathcal{M}$ and $\chi^{\mathcal{M}}(v) = 0$ otherwise). Then $\bar\eta_{\mathcal{V}}(G) = \min \sum_{\mathcal{M}} y_{\mathcal{M}}$ s.t. $\sum_{\mathcal{M}} y_{\mathcal{M}} \chi^{\mathcal{M}} \ge \mathbf{1}$ and $y_{\mathcal{M}} \ge 0$ for all $\mathcal{M}$, (2) where the summations range over all general stable sets subordinate to $\mathcal{V}$.

81 2.3 Algorithm

82 Algorithm. Input: interaction graph $G_{A,\mathcal{J}} = (V, E)$; a collection $\mathcal{V}$ of subsets of vertices. Let M be the collection of all general stable sets subordinate to $\mathcal{V}$.

83 Algorithm. Input: interaction graph $G_{A,\mathcal{J}} = (V, E)$; a collection $\mathcal{V}$ of subsets of vertices. Let M be the collection of all general stable sets subordinate to $\mathcal{V}$. Solve sub-IPs: for $W \subseteq U \in \mathcal{V}$, $w_W := \max\{c^\top x : Ax \le b,\ x_j = 0 \text{ for all variables not in } W,\ x_j \in \mathbb{Z}_+\}$; optimal solution $\tilde x^W$.

84 Algorithm. Input: interaction graph $G_{A,\mathcal{J}} = (V, E)$; a collection $\mathcal{V}$ of subsets of vertices. Let M be the collection of all general stable sets subordinate to $\mathcal{V}$. Solve sub-IPs: for $W \subseteq U \in \mathcal{V}$, $w_W := \max\{c^\top x : Ax \le b,\ x_j = 0 \text{ for all variables not in } W,\ x_j \in \mathbb{Z}_+\}$; optimal solution $\tilde x^W$. Patching integer program: $z^A := \max \sum_W w_W y_W$ s.t. $y_{W_1} + y_{W_2} \le 1$ if $W_1 \cap W_2 \ne \emptyset$; $y_{W_1} + y_{W_2} \le 1$ if $v_i \in W_i$ ($i = 1, 2$) and $(v_1, v_2) \in E$; $y_W \in \{0,1\}$ for all $W$. Optimal solution: $y^*$.

85 Algorithm. Input: interaction graph $G_{A,\mathcal{J}} = (V, E)$; a collection $\mathcal{V}$ of subsets of vertices. Let M be the collection of all general stable sets subordinate to $\mathcal{V}$. Solve sub-IPs: for $W \subseteq U \in \mathcal{V}$, $w_W := \max\{c^\top x : Ax \le b,\ x_j = 0 \text{ for all variables not in } W,\ x_j \in \mathbb{Z}_+\}$; optimal solution $\tilde x^W$. Patching integer program: $z^A := \max \sum_W w_W y_W$ s.t. $y_{W_1} + y_{W_2} \le 1$ if $W_1 \cap W_2 \ne \emptyset$; $y_{W_1} + y_{W_2} \le 1$ if $v_i \in W_i$ ($i = 1, 2$) and $(v_1, v_2) \in E$; $y_W \in \{0,1\}$ for all $W$. Optimal solution: $y^*$. Output: $x^A := \sum_W y^*_W\, \tilde x^W$.
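For very small instances the patching step can be illustrated by brute force. The sketch below assumes the subproblem values w_W are already computed and that the conflicting pairs (overlapping candidate sets, or sets joined by an edge of $G_{A,\mathcal{J}}$) are listed explicitly; it only makes the selection rule concrete and is not how the patching IP would be solved in practice.

```python
from itertools import combinations

def patch_bruteforce(w, conflicts):
    """Pick a conflict-free subcollection of candidate sets maximizing the sum of their
    subproblem values w_W. `w` maps a candidate id to w_W; `conflicts` is a set of frozensets
    {id1, id2} that cannot be selected together. Exponential in len(w): illustration only."""
    ids = list(w)
    best_val, best_sel = 0.0, ()
    for r in range(len(ids) + 1):
        for sel in combinations(ids, r):
            if any(frozenset(p) in conflicts for p in combinations(sel, 2)):
                continue                               # violates a patching constraint
            val = sum(w[i] for i in sel)
            if val > best_val:
                best_val, best_sel = val, sel
    return best_sel, best_val
```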

86 Performance guarantee. Theorem. Let $x^*$ be the optimal solution of the packing IP, and let $x^A$ be the solution produced by the algorithm, with column partition $\mathcal{J}$ (resulting in the packing interaction graph $G_{A,\mathcal{J}}$) and list $\mathcal{V}$. Then $c^\top x^A \ge \left(\frac{1}{\eta_{\mathcal{V}}(G_{A,\mathcal{J}})}\right) c^\top x^*$.

87 2.4

88 Two-stage instances. [Table: gaps for two-stage instances (columns: instance, features, # variables, Algo., CPLEX, GRASP) over the nv1-5s100/z*, nv1-5s150/z*, and nv1-5s200/z* families with z in {5, 10, 15, 20}.]

89 Three-stage instances. [Table: gaps for three-stage instances (columns: instance, features, # variables, Algo., CPLEX, GRASP) over the nv1-5-25s100/z*, nv1-5-25s150/z*, and nv1-5-25s200/z* families with z in {5, 10, 15, 20}, roughly 3,000 to 6,000 variables each.]

90 Random graph instances. [Table: relative gap of the various algorithms (columns: instance, features, # variables, Algo., CPLEX, GRASP) over the nv50pb{3,5,8,10}s400/z* families with z in {5, 10, 15, 20}, each with 20,000 variables.]

91 Random graph instances - II. [Table: relative gap of the various algorithms (columns: instance, features, # variables, Gap_A, Gap_IP, Gap_GS) over the nv100pb{3,5,8,10}s200/z* families with z in {5, 10, 15, 20}, each with 20,000 variables.]

92 Conclusion. Some comments: our approximation algorithm almost always obtains a significantly better solution than CPLEX and GRASP.

93 Conclusion. Some comments: our approximation algorithm almost always obtains a significantly better solution than CPLEX and GRASP. Our algorithm is very easily parallelizable: we may solve each of the sub-IPs separately. Dual bounds are also parallelizable: $z_{UB} = \max\; c^\top x$ s.t. $\sum_{j \in U} c_j x_j \le w_U$ for all $U \in \mathcal{V}$ (where $w_U$ is the optimal objective of the subproblem on $U$), $0 \le x \le 1$. [Huchette, Dey, Vielma 16]
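A hedged Python sketch of this dual bound, assuming the subproblem values w_U are already available (the data layout is a choice made here for illustration):

```python
import numpy as np
from scipy.optimize import linprog

def dual_bound(c, w_by_block):
    """Parallelizable dual bound from the slide: maximize c^T x over 0 <= x <= 1 subject to
    sum_{j in U} c_j x_j <= w_U for every U in V, where w_U is the (precomputed) optimal value
    of the sub-IP restricted to U. `w_by_block` maps tuples of variable indices to w_U."""
    c = np.asarray(c, dtype=float)
    n = c.size
    A_ub, b_ub = [], []
    for U, w_U in w_by_block.items():
        row = np.zeros(n)
        row[list(U)] = c[list(U)]              # left-hand side: sum_{j in U} c_j x_j
        A_ub.append(row)
        b_ub.append(w_U)
    res = linprog(-c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(0, 1)] * n, method="highs")
    return -res.fun                             # linprog minimizes, so negate for the max
```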


More information

Chapter 3, Operations Research (OR)

Chapter 3, Operations Research (OR) Chapter 3, Operations Research (OR) Kent Andersen February 7, 2007 1 Linear Programs (continued) In the last chapter, we introduced the general form of a linear program, which we denote (P) Minimize Z

More information

A COMPUTATIONAL COMPARISON OF SYMMETRY HANDLING METHODS FOR MIXED INTEGER PROGRAMS

A COMPUTATIONAL COMPARISON OF SYMMETRY HANDLING METHODS FOR MIXED INTEGER PROGRAMS A COMPUTATIONAL COMPARISON OF SYMMETRY HANDLING METHODS FOR MIXED INTEGER PROGRAMS MARC E. PFETSCH AND THOMAS REHN Abstract. The handling of symmetries in mixed integer programs in order to speed up the

More information

Asymptotic Polynomial-Time Approximation (APTAS) and Randomized Approximation Algorithms

Asymptotic Polynomial-Time Approximation (APTAS) and Randomized Approximation Algorithms Approximation Algorithms Asymptotic Polynomial-Time Approximation (APTAS) and Randomized Approximation Algorithms Jens Egeblad November 29th, 2006 Agenda First lesson (Asymptotic Approximation): Bin-Packing

More information

Mathematical Programming II Lecture 5 ORIE 6310 Spring 2014 February 7, 2014 Scribe: Weixuan Lin

Mathematical Programming II Lecture 5 ORIE 6310 Spring 2014 February 7, 2014 Scribe: Weixuan Lin Mathematical Programming II Lecture 5 ORIE 630 Spring 204 February 7, 204 Scribe: Weixuan Lin Consider the last general scheme: perturb the linear system. This is applicable for any LCP. w = dz 0 + Mz

More information

The Strength of Multi-Row Relaxations

The Strength of Multi-Row Relaxations The Strength of Multi-Row Relaxations Quentin Louveaux 1 Laurent Poirrier 1 Domenico Salvagnin 2 1 Université de Liège 2 Università degli studi di Padova August 2012 Motivations Cuts viewed as facets of

More information

Lecture #21. c T x Ax b. maximize subject to

Lecture #21. c T x Ax b. maximize subject to COMPSCI 330: Design and Analysis of Algorithms 11/11/2014 Lecture #21 Lecturer: Debmalya Panigrahi Scribe: Samuel Haney 1 Overview In this lecture, we discuss linear programming. We first show that the

More information

Stochastic Integer Programming

Stochastic Integer Programming IE 495 Lecture 20 Stochastic Integer Programming Prof. Jeff Linderoth April 14, 2003 April 14, 2002 Stochastic Programming Lecture 20 Slide 1 Outline Stochastic Integer Programming Integer LShaped Method

More information

Discrepancy Theory in Approximation Algorithms

Discrepancy Theory in Approximation Algorithms Discrepancy Theory in Approximation Algorithms Rajat Sen, Soumya Basu May 8, 2015 1 Introduction In this report we would like to motivate the use of discrepancy theory in algorithms. Discrepancy theory

More information

On the knapsack closure of 0-1 Integer Linear Programs

On the knapsack closure of 0-1 Integer Linear Programs On the knapsack closure of 0-1 Integer Linear Programs Matteo Fischetti 1 Dipartimento di Ingegneria dell Informazione University of Padova Padova, Italy Andrea Lodi 2 Dipartimento di Elettronica, Informatica

More information

Dual bounds: can t get any better than...

Dual bounds: can t get any better than... Bounds, relaxations and duality Given an optimization problem z max{c(x) x 2 }, how does one find z, or prove that a feasible solution x? is optimal or close to optimal? I Search for a lower and upper

More information

1 Primals and Duals: Zero Sum Games

1 Primals and Duals: Zero Sum Games CS 124 Section #11 Zero Sum Games; NP Completeness 4/15/17 1 Primals and Duals: Zero Sum Games We can represent various situations of conflict in life in terms of matrix games. For example, the game shown

More information

Orbital Shrinking: A New Tool for Hybrid MIP/CP Methods

Orbital Shrinking: A New Tool for Hybrid MIP/CP Methods Orbital Shrinking: A New Tool for Hybrid MIP/CP Methods Domenico Salvagnin DEI, University of Padova salvagni@dei.unipd.it Abstract. Orbital shrinking is a newly developed technique in the MIP community

More information

Valid Inequalities for Optimal Transmission Switching

Valid Inequalities for Optimal Transmission Switching Valid Inequalities for Optimal Transmission Switching Hyemin Jeon Jeff Linderoth Jim Luedtke Dept. of ISyE UW-Madison Burak Kocuk Santanu Dey Andy Sun Dept. of ISyE Georgia Tech 19th Combinatorial Optimization

More information

CS675: Convex and Combinatorial Optimization Fall 2016 Combinatorial Problems as Linear and Convex Programs. Instructor: Shaddin Dughmi

CS675: Convex and Combinatorial Optimization Fall 2016 Combinatorial Problems as Linear and Convex Programs. Instructor: Shaddin Dughmi CS675: Convex and Combinatorial Optimization Fall 2016 Combinatorial Problems as Linear and Convex Programs Instructor: Shaddin Dughmi Outline 1 Introduction 2 Shortest Path 3 Algorithms for Single-Source

More information

Aditya Bhaskara CS 5968/6968, Lecture 1: Introduction and Review 12 January 2016

Aditya Bhaskara CS 5968/6968, Lecture 1: Introduction and Review 12 January 2016 Lecture 1: Introduction and Review We begin with a short introduction to the course, and logistics. We then survey some basics about approximation algorithms and probability. We also introduce some of

More information

Branch-and-Bound. Leo Liberti. LIX, École Polytechnique, France. INF , Lecture p. 1

Branch-and-Bound. Leo Liberti. LIX, École Polytechnique, France. INF , Lecture p. 1 Branch-and-Bound Leo Liberti LIX, École Polytechnique, France INF431 2011, Lecture p. 1 Reminders INF431 2011, Lecture p. 2 Problems Decision problem: a question admitting a YES/NO answer Example HAMILTONIAN

More information

maxz = 3x 1 +4x 2 2x 1 +x 2 6 2x 1 +3x 2 9 x 1,x 2

maxz = 3x 1 +4x 2 2x 1 +x 2 6 2x 1 +3x 2 9 x 1,x 2 ex-5.-5. Foundations of Operations Research Prof. E. Amaldi 5. Branch-and-Bound Given the integer linear program maxz = x +x x +x 6 x +x 9 x,x integer solve it via the Branch-and-Bound method (solving

More information

Decision Procedures An Algorithmic Point of View

Decision Procedures An Algorithmic Point of View An Algorithmic Point of View ILP References: Integer Programming / Laurence Wolsey Deciding ILPs with Branch & Bound Intro. To mathematical programming / Hillier, Lieberman Daniel Kroening and Ofer Strichman

More information