Tutorial: Operations Research in Constraint Programming


1 Tutorial: Operations Research in Constraint Programming. John Hooker, Carnegie Mellon University. May 2009, revised June 2009. Slide 1

2 Why Integrate OR and CP? Complementary strengths. Computational advantages. Outline of the tutorial.

3 Complementary Strengths. CP: inference methods, modeling, exploits local structure. OR: relaxation methods, duality theory, exploits global structure. Let's bring them together!

4 Computational Advantage of Integrating CP and OR. Using CP + relaxation from MILP:
Focacci, Lodi, Milano (1999): lesson timetabling, 2 to 50 times faster than CP.
Refalo (1999): piecewise linear costs, 2 to 200 times faster than MILP.
Hooker & Osorio (1999): flow shop scheduling, etc., 4 to 150 times faster than MILP.
Thorsteinsson & Ottosson (2001): product configuration, 30 to 40 times faster than CP, MILP.

5 Computational Advantage of Integrating CP and MILP. Using CP + relaxation from MILP:
Sellmann & Fahle (2001): automatic recording, 1 to 10 times faster than CP, MILP.
Van Hoeve (2001): stable set problem, better than CP in less time.
Bollapragada, Ghattas & Hooker (2001): structural design (nonlinear), up to 600 times faster than MILP; 2 problems solved in under 6 min vs. over 20 hrs for MILP.
Beck & Refalo (2003): scheduling with earliness & tardiness costs, solved 67 of 90 instances, while CP solved only 12.

6 Computational Advantage of Integrating CP and MILP. Using CP-based branch and price:
Yunes, Moura & de Souza (1999): urban transit crew scheduling, optimal schedule for 210 trips vs. 120 for traditional branch and price.
Easton, Nemhauser & Trick (2002): traveling tournament scheduling, first to solve the 8-team instance.

7 Computational Advantage of Integrating CP and MILP. Using CP/MILP Benders methods:
Jain & Grossmann (2001): min-cost planning & scheduling, 20 to 1000 times faster than CP, MILP.
Thorsteinsson (2001): min-cost planning & scheduling, 10 times faster than Jain & Grossmann.
Timpe (2002): polypropylene batch scheduling at BASF, solved a previously insoluble problem in 10 min.

8 Computational Advantage of Integrating CP and MILP. Using CP/MILP Benders methods:
Benoist, Gaudin, Rottembourg (2002): call center scheduling, solved twice as many instances as traditional Benders.
Hooker (2004): min-cost, min-makespan planning & cumulative scheduling, times faster than CP, MILP.
Hooker (2005): min tardiness planning & cumulative scheduling, times faster than CP, MILP.

9 Software for Integrated Methods.
ECLiPSe: exchanges information between the ECLiPSe solver and Xpress-MP.
OPL Studio: combines CPLEX and ILOG CP Optimizer with a script language.
Mosel: combines Xpress-MP and Xpress-Kalis with low-level modeling.
BARON: global optimization with relaxation + domain reduction.
SIMPL: full integration with high-level modeling (prototype).
SCIP: combines MILP and CP-based propagation.

10 Outline of the Tutorial. Why Integrate OR and CP? A Glimpse at CP. Initial Example: Integrated Methods. CP Concepts. CP Filtering Algorithms. Linear Relaxation and CP. Mixed Integer/Linear Modeling. Cutting Planes. Lagrangean Relaxation and CP. Dynamic Programming in CP. CP-based Branch and Price. CP-based Benders Decomposition.

11 Detailed Outline. Why Integrate OR and CP? Complementary strengths; computational advantages; outline of the tutorial. A Glimpse at CP: early successes; advantages and disadvantages. Initial Example: Integrated Methods: freight transfer; bounds propagation; cutting planes; branch-infer-and-relax tree.

12 Detailed Outline. CP Concepts: consistency; hyperarc consistency; modeling examples. CP Filtering Algorithms: element; alldiff; disjunctive scheduling; cumulative scheduling. Linear Relaxation and CP: why relax?; algebraic analysis of LP; linear programming duality; LP-based domain filtering; example: single-vehicle routing; disjunctions of linear systems.

13 Detailed Outline. Mixed Integer/Linear Modeling: MILP representability; disjunctive modeling; knapsack modeling. Cutting Planes: 0-1 knapsack cuts; Gomory cuts; mixed integer rounding cuts; example: product configuration. Lagrangean Relaxation and CP: Lagrangean duality; properties of the Lagrangean dual; example: fast linear programming; domain filtering; example: continuous global optimization.

14 Detailed Outline. Dynamic Programming in CP: example: capital budgeting; domain filtering; recursive optimization. CP-based Branch and Price: basic idea; example: airline crew scheduling. CP-based Benders Decomposition: Benders decomposition in the abstract; classical Benders decomposition; example: machine scheduling.

15 Background Reading. This tutorial is based on: J. N. Hooker, Integrated Methods for Optimization, Springer (2007); contains 295 exercises. J. N. Hooker, Operations research methods in constraint programming, in F. Rossi, P. van Beek and T. Walsh, eds., Handbook of Constraint Programming, Elsevier (2006).

16 A Glimpse at Constraint Programming. Early successes. Advantages and disadvantages.

17 What is constraint programming? It is a relatively new technology developed in the computer science and artificial intelligence communities. It has found an important role in scheduling, logistics and supply chain management.

18 Early commercial successes. Circuit design (Siemens). Container port scheduling (Hong Kong and Singapore). Real-time control (Siemens, Xerox).

19 Applications. Job shop scheduling. Assembly line smoothing and balancing. Cellular frequency assignment. Nurse scheduling. Shift planning. Maintenance planning. Airline crew rostering and scheduling. Airport gate allocation and stand planning.

20 Applications. Production scheduling: chemicals, aviation, oil refining, steel, lumber, photographic plates, tires. Transport scheduling (food, nuclear fuel). Warehouse management. Course timetabling.

21 Advantages and Disadvantages: CP vs. Mathematical Programming. MP: numerical calculation; relaxation; atomistic modeling (linear inequalities); branching; independence of model and algorithm. CP: logic processing; inference (filtering, constraint propagation); high-level modeling (global constraints); branching; constraint-based processing.

22 "Programming" vs. "programming". In constraint programming: programming = a form of computer programming (constraint-based processing). In mathematical programming: programming = logistics planning (historically).

23 CP vs. MP. In mathematical programming, equations (constraints) describe the problem but don't tell how to solve it. In constraint programming, each constraint invokes a procedure that screens out unacceptable solutions, much as each line of a computer program invokes an operation.

24 Advantages of CP. Better at sequencing and scheduling, where MP methods have weak relaxations; adding messy constraints makes the problem easier, so the more constraints, the better. More powerful modeling language: global constraints lead to succinct models, and constraints convey problem structure to the solver. Better at highly constrained problems, though this is misleading: really better when constraints propagate well, or when constraints have few variables.

25 Disadvantages of CP. Weaker for continuous variables, due to lack of numerical techniques. May fail when constraints contain many variables, since such constraints don't propagate well. Often not good for finding optimal solutions, due to lack of relaxation technology. May not scale up (discrete combinatorial methods). Software is not robust (younger field).

26 Obvious solution: integrate CP and MP.

27 Trends. CP is better known in continental Europe and Asia; less known in North America, where it is seen as a threat to OR. CP/MP integration is growing: ECLiPSe, Mozart, OPL Studio, SIMPL, SCIP, BARON. Heuristic methods are increasingly important in CP (discrete combinatorial methods). MP/CP/heuristics may become a single technology.

28 Initial Example: Integrated Methods. Freight transfer. Bounds propagation. Cutting planes. Branch-infer-and-relax tree.

29 Example: Freight Transfer. Transport 42 tons of freight using 8 trucks, which come in 4 sizes:
Truck type 1: 3 available, capacity 7 tons, cost 90 per truck.
Truck type 2: 3 available, capacity 5 tons, cost 60 per truck.
Truck type 3: 3 available, capacity 4 tons, cost 50 per truck.
Truck type 4: 3 available, capacity 3 tons, cost 40 per truck.

30 The model, where xi = number of trucks of type i:
min 90x1 + 60x2 + 50x3 + 40x4
s.t. 7x1 + 5x2 + 4x3 + 3x4 >= 42 (knapsack covering constraint)
x1 + x2 + x3 + x4 <= 8 (knapsack packing constraint)
xi ∈ {0,1,2,3}

31 Bounds propagation.
min 90x1 + 60x2 + 50x3 + 40x4
s.t. 7x1 + 5x2 + 4x3 + 3x4 >= 42
x1 + x2 + x3 + x4 <= 8
xi ∈ {0,1,2,3}
From the covering constraint, x1 >= (42 - 5·3 - 4·3 - 3·3)/7 = 6/7, so x1 >= 1.

32 Bounds propagation. Reduced domains for the same model:
x1 ∈ {1,2,3}, x2, x3, x4 ∈ {0,1,2,3}

33 Bounds consistency. Let {Lj, ..., Uj} be the domain of xj. A constraint set is bounds consistent if for each j: xj = Lj in some feasible solution and xj = Uj in some feasible solution. Bounds consistency means we will not set xj to any infeasible values during branching. Bounds propagation achieves bounds consistency for a single inequality. For example, 7x1 + 5x2 + 4x3 + 3x4 >= 42 is bounds consistent when the domains are x1 ∈ {1,2,3} and x2, x3, x4 ∈ {0,1,2,3}. But it does not necessarily achieve it for a set of inequalities.
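Bounds propagation for a single covering inequality can be sketched in a few lines of Python. This is an illustrative sketch, not the tutorial's code; the function name and the (lo, hi) domain representation are mine. Each lower bound is raised by assuming all other variables sit at their upper bounds.

```python
from math import ceil

def propagate_bounds(a, a0, domains):
    """Bounds propagation for sum_i a_i*x_i >= a0 with a_i >= 0.

    domains: list of (lo, hi) integer bounds. Tightens each lower bound to
    x_j >= ceil((a0 - sum_{i != j} a_i*hi_i) / a_j).
    """
    new = []
    for j, (lo, hi) in enumerate(domains):
        rest = sum(a[i] * domains[i][1] for i in range(len(a)) if i != j)
        lo_j = max(lo, ceil((a0 - rest) / a[j]))
        new.append((lo_j, hi))
    return new

# Freight transfer: 7x1 + 5x2 + 4x3 + 3x4 >= 42, all domains {0,...,3}
doms = propagate_bounds([7, 5, 4, 3], 42, [(0, 3)] * 4)
print(doms)  # x1's lower bound rises to 1; the others stay at 0
```

The x1 bound comes out as ceil((42 - 36)/7) = 1, matching the slide's reduced domain x1 ∈ {1,2,3}.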

34 Bounds consistency. Bounds propagation may not achieve bounds consistency for a set of constraints. Consider the set of inequalities x1 + x2 >= 1, x1 - x2 >= 0, with domains x1, x2 ∈ {0,1} and solutions (x1,x2) = (1,0), (1,1). Bounds propagation has no effect on the domains, but the constraint set is not bounds consistent, because x1 = 0 in no feasible solution.

35 Cutting Planes. Begin with a continuous relaxation:
min 90x1 + 60x2 + 50x3 + 40x4
s.t. 7x1 + 5x2 + 4x3 + 3x4 >= 42
x1 + x2 + x3 + x4 <= 8
0 <= xi <= 3, x1 >= 1 (domains replaced with bounds)
This is a linear programming problem, which is easy to solve. Its optimal value provides a lower bound on the optimal value of the original problem.

36 Cutting planes (valid inequalities). For the same relaxation, we can create a tighter relaxation (larger minimum value) by adding cutting planes.

37 Cutting planes (valid inequalities). All feasible solutions of the original problem satisfy a cutting plane (i.e., it is valid), but a cutting plane may exclude ("cut off") solutions of the continuous relaxation. (Figure: a cutting plane separating part of the continuous relaxation from the feasible solutions.)

38 Cutting planes (valid inequalities). For the covering constraint 7x1 + 5x2 + 4x3 + 3x4 >= 42, {1,2} is a packing, because 7x1 + 5x2 alone cannot satisfy the inequality, even with x1 = x2 = 3.

39 Cutting planes (valid inequalities). Since {1,2} is a packing, 4x3 + 3x4 >= 42 - (7·3 + 5·3) = 6, which implies the knapsack cut x3 + x4 >= ⌈6 / max{4,3}⌉ = 2.

40 Cutting planes (valid inequalities). In general, let xi have domain [Li, Ui] and let a >= 0. A packing P for ax >= a0 satisfies
sum over i ∉ P of ai xi >= a0 - sum over i ∈ P of ai Ui
and generates the knapsack cut
sum over i ∉ P of xi >= ⌈(a0 - sum over i ∈ P of ai Ui) / max over i ∉ P of ai⌉

41 Cutting planes (valid inequalities). For the example, the maximal packings and their knapsack cuts are:
{1,2}: x3 + x4 >= 2
{1,3}: x2 + x4 >= 2
{1,4}: x2 + x3 >= 3
Knapsack cuts corresponding to nonmaximal packings can be nonredundant.
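The cut-generation rule above can be sketched in Python (my own helper, using 0-based indices where the slides use 1-based): given a packing P, the cut's right-hand side is the ceiling of the residual demand divided by the largest coefficient outside P.

```python
from math import ceil

def knapsack_cut(a, a0, U, P):
    """Knapsack cut from a packing P for sum_i a_i*x_i >= a0 (a_i >= 0).

    P is a packing if its variables alone cannot satisfy the inequality:
    sum_{i in P} a_i*U_i < a0.  The cut is
      sum_{i not in P} x_i >= ceil((a0 - sum_{i in P} a_i*U_i) / max_{i not in P} a_i).
    Returns (complement_indices, rhs).
    """
    assert sum(a[i] * U[i] for i in P) < a0, "P is not a packing"
    rest = [i for i in range(len(a)) if i not in P]
    slack = a0 - sum(a[i] * U[i] for i in P)
    return rest, ceil(slack / max(a[i] for i in rest))

a, a0, U = [7, 5, 4, 3], 42, [3, 3, 3, 3]
print(knapsack_cut(a, a0, U, {0, 1}))  # cut x3 + x4 >= 2
print(knapsack_cut(a, a0, U, {0, 2}))  # cut x2 + x4 >= 2
print(knapsack_cut(a, a0, U, {0, 3}))  # cut x2 + x3 >= 3
```

The three maximal packings reproduce exactly the three cuts listed on the slide.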

42 Continuous relaxation with cuts:
min 90x1 + 60x2 + 50x3 + 40x4
s.t. 7x1 + 5x2 + 4x3 + 3x4 >= 42
x1 + x2 + x3 + x4 <= 8
x3 + x4 >= 2, x2 + x4 >= 2, x2 + x3 >= 3 (knapsack cuts)
0 <= xi <= 3, x1 >= 1
The optimal value of this relaxation is a lower bound on the optimal value of the original problem.
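The relaxation with cuts can be solved with any off-the-shelf LP solver. As a sketch (SciPy is my choice here, not part of the tutorial), `scipy.optimize.linprog` wants all inequalities as A_ub x <= b_ub, so the >= rows are negated:

```python
from scipy.optimize import linprog

# min 90x1 + 60x2 + 50x3 + 40x4
c = [90, 60, 50, 40]
# A_ub x <= b_ub: capacity (>= 42, negated), truck count, three knapsack cuts
A = [[-7, -5, -4, -3],   # 7x1 + 5x2 + 4x3 + 3x4 >= 42
     [ 1,  1,  1,  1],   # x1 + x2 + x3 + x4 <= 8
     [ 0,  0, -1, -1],   # x3 + x4 >= 2
     [ 0, -1,  0, -1],   # x2 + x4 >= 2
     [ 0, -1, -1,  0]]   # x2 + x3 >= 3
b = [-42, 8, -2, -2, -3]
bounds = [(1, 3), (0, 3), (0, 3), (0, 3)]  # x1 >= 1 after propagation

res = linprog(c, A_ub=A, b_ub=b, bounds=bounds, method="highs")
print(res.fun)  # the lower bound at the root node
```

The optimal value is 523⅓, the root-node bound shown in the branch-infer-and-relax tree that follows.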

43 Branch-infer-and-relax tree. Propagate bounds and solve the relaxation of the original problem. At the root node: x1 ∈ {1,2,3}, x2, x3, x4 ∈ {0,1,2,3}; x = (2⅓, 3, 2⅔, 0), value = 523⅓.

44 Branch-infer-and-relax tree. Branch on a variable with a nonintegral value in the relaxation: here x1, creating the branches x1 ∈ {1,2} and x1 = 3.

45 Branch-infer-and-relax tree. In the x1 ∈ {1,2} branch, propagate bounds (x1 ∈ {1,2}, x2 ∈ {2,3}, x3, x4 ∈ {1,2,3}) and solve the relaxation. Since the relaxation is infeasible, backtrack.

46 Branch-infer-and-relax tree. In the x1 = 3 branch, propagate bounds and solve the relaxation: x = (3, 2.6, 2, 0), value = 526. Branch on the nonintegral variable x2, creating the branches x2 ∈ {0,1,2} and x2 = 3.

47 Branch-infer-and-relax tree. In the x2 ∈ {0,1,2} branch: x1 = 3, x2 ∈ {0,1,2}, x3 ∈ {1,2,3}, x4 ∈ {0,1,2,3}; x = (3, 2, 2¾, 0), value = 527½. Branch again, on x3, creating the branches x3 ∈ {1,2} and x3 = 3.

48 Branch-infer-and-relax tree. In the x3 ∈ {1,2} branch (x2 ∈ {1,2}, x3 ∈ {1,2}, x4 ∈ {1,2,3}), the solution of the relaxation is x = (3,2,2,1) with value 530. It is integral and therefore feasible in the original problem. This becomes the incumbent solution.

49 Branch-infer-and-relax tree. In the x3 = 3 branch (x2 ∈ {0,1,2}, x3 = 3, x4 ∈ {0,1,2}), the relaxation gives x = (3, 1½, 3, ½) with value 530. The solution is nonintegral, but we can backtrack because the value of the relaxation is no better than the incumbent solution.

50 Branch-infer-and-relax tree. In the x2 = 3 branch (x2 = 3, x3 ∈ {0,1,2}, x4 ∈ {0,1,2}), the relaxation gives x = (3,3,0,2) with value 530: another feasible solution. It is no better than the incumbent solution, which is optimal because the search has finished.

51 Two optimal solutions: x = (3,2,2,1) and x = (3,3,0,2).

52 Constraint Programming Concepts. Consistency. Generalized arc consistency. Modeling examples.

53 Consistency. A constraint set is consistent if every partial assignment to the variables that violates no constraint is feasible, i.e., can be extended to a feasible solution. Consistency is not the same as feasibility: consistency means that any infeasible partial assignment is explicitly ruled out by a constraint. Fully consistent constraint sets can be solved without backtracking.

54 Consistency. Consider the constraint set x1 + x2 >= 1, x1 - x2 >= 0, with xj ∈ {0,1}. It is not consistent, because x1 = 0 violates no constraint and yet is infeasible (no solution has x1 = 0). Adding the constraint x1 = 1 makes the set consistent.

55 Suppose the constraints x1 + x2 >= 1, x1 - x2 >= 0 with xj ∈ {0,1} occur among other constraints in a problem. Branching on x1 = 0 can produce a subtree with 2^99 nodes but no feasible solution. By adding the constraint x1 = 1, the left subtree is eliminated.

56 Generalized Arc Consistency (GAC), also known as hyperarc consistency. A constraint set is GAC if every value in every variable domain is part of some feasible solution; that is, the domains are reduced as much as possible. If all constraints are binary (contain 2 variables), GAC = arc consistency. Domain reduction is CP's biggest engine.

57-63 Graph coloring problem that can be solved by arc consistency maintenance alone: color the nodes with red, green, blue so that no two adjacent nodes have the same color. (Slides 57-63 step through the propagation on a graph figure.)

64 Modeling Examples with Global Constraints: the Traveling Salesman Problem. Let cij = distance from city i to city j. Find the shortest route that visits each of n cities exactly once.

65 Popular 0-1 model. Let xij = 1 if city i immediately precedes city j, 0 otherwise.
min sum over ij of cij xij
s.t. sum over i of xij = 1, all j
sum over j of xij = 1, all i
sum over i ∈ V, j ∈ W of xij >= 1, for every partition of {1, ..., n} into disjoint nonempty V, W (subtour elimination constraints)
xij ∈ {0,1}

66 A CP model. Let yk = the kth city visited. The model would be written in a specific constraint programming language, but would essentially say:
min sum over k of c(yk, yk+1) (variable indices)
s.t. alldiff(y1, ..., yn) (global constraint)
yk ∈ {1, ..., n}

67 An alternate CP model. Let yk = the city visited after city k.
min sum over k of c(k, yk)
s.t. circuit(y1, ..., yn) (Hamiltonian circuit constraint)
yk ∈ {1, ..., n}

68 Element constraint. The constraint c_y <= 5 can be implemented: element(y, (c1, ..., cn), z), z <= 5. The element constraint assigns z the yth value in the list. The constraint x_y <= 5 can be implemented: element(y, (x1, ..., xn), z), z <= 5. This is a slightly different constraint, since it in effect adds the constraint z = x_y.

69 Modeling example: lot sizing and scheduling. At most one product is manufactured on each day (figure: a sequence of days labeled A, B, A, ...). There are demands for each product on each day. Minimize setup + holding cost.

70 Integer programming model (Wolsey):
min sum over i,t of hi sit + sum over i,j,t of qij δijt
s.t. s(i,t-1) + xit = dit + sit, all i, t
zit >= yit - y(i,t-1), all i, t
zit <= yit, all i, t
zit <= 1 - y(i,t-1), all i, t
δijt >= y(i,t-1) + yjt - 1, all i, j, t
δijt <= y(i,t-1), all i, j, t
δijt <= yjt, all i, j, t
xit <= C yit, all i, t
sum over i of yit = 1, all t
yit, zit, δijt ∈ {0,1}; xit, sit >= 0
Many variables.

71 CP model. Minimize holding and setup costs:
min sum over t of q(y(t-1), yt) + sum over i,t of hi sit
s.t. s(i,t-1) + xit = dit + sit, all i, t (inventory balance)
0 <= xit <= C, sit >= 0, all i, t (production capacity)
(yt ≠ i) implies (xit = 0), all i, t

72 CP model, annotated. Here yt is the product manufactured in period t and xit is the production level of product i in period t; the setup cost q(y(t-1), yt) uses variable indices:
min sum over t of q(y(t-1), yt) + sum over i,t of hi sit
s.t. s(i,t-1) + xit = dit + sit, all i, t (inventory balance)
0 <= xit <= C, sit >= 0, all i, t (production capacity)
(yt ≠ i) implies (xit = 0), all i, t

73 Cumulative scheduling constraint. Used for resource-constrained scheduling: the total resources consumed by jobs at any one time must not exceed L.
cumulative((t1, ..., tn), (p1, ..., pn), (c1, ..., cn), L)
where t1, ..., tn are the job start times (variables), p1, ..., pn the job processing times, and c1, ..., cn the job resource requirements.

74 Cumulative scheduling constraint. Minimize makespan (no deadlines, all release times = 0), with L = 7 resources:
min z
s.t. cumulative((t1, ..., t5), (3,3,3,5,5), (3,3,3,2,2), 7)
z >= tj + pj, all j
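What the cumulative constraint enforces can be checked directly in Python. This is a feasibility checker I wrote for illustration (not a filtering algorithm): since resource usage only rises at job start times, it suffices to test those event points. The schedule below is one feasible solution of the example, with makespan 8.

```python
def cumulative_ok(starts, durations, demands, capacity):
    """Check a cumulative constraint: at every job start time (the only
    points where usage can increase), total demand of running jobs
    must not exceed the capacity."""
    for t in starts:
        usage = sum(c for s, p, c in zip(starts, durations, demands)
                    if s <= t < s + p)
        if usage > capacity:
            return False
    return True

starts = (0, 3, 5, 0, 0)          # one feasible schedule for the example
durations = (3, 3, 3, 5, 5)
demands = (3, 3, 3, 2, 2)
print(cumulative_ok(starts, durations, demands, 7))        # True
makespan = max(s + p for s, p in zip(starts, durations))
print(makespan)                                            # 8
```

Starting all five jobs at time 0 instead would demand 13 units at once and fail the check.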

75 Modeling example: ship loading. We will use ILOG's OPL Studio modeling language; the example is from the OPL manual. The problem: load 34 items on the ship in minimum time (min makespan). Each item requires a certain time and a certain number of workers, and a total of 8 workers is available.

76 Problem data: a table of labor requirement and duration for each item.

77 Precedence constraints (partial list): 1 → 2, ..., 30, 31, ...

78 Use the cumulative scheduling constraint:
min z
s.t. z >= t1 + 3, z >= t2 + 4, etc.
cumulative((t1, ..., t34), (3,4,...,2), (4,4,...,3), 8)
t2 >= t1 + 3, t4 >= t1 + 3, etc.

79 OPL model (identifier case restored, since OPL is case-sensitive):
int capacity = 8;
int nbTasks = 34;
range Tasks = 1..nbTasks;
int duration[Tasks] = [3,4,4,6,...,2];
int totalDuration = sum(t in Tasks) duration[t];
int demand[Tasks] = [4,4,3,4,...,3];
struct Precedences { int before; int after; };
{Precedences} setOfPrecedences = { <1,2>, <1,4>, ..., <33,34> };

80 OPL model, continued:
scheduleHorizon = totalDuration;
Activity a[t in Tasks](duration[t]);
DiscreteResource res(8);
Activity makespan(0);
minimize makespan.end
subject to {
forall(t in Tasks) a[t] precedes makespan;
forall(p in setOfPrecedences) a[p.before] precedes a[p.after];
forall(t in Tasks) a[t] requires(demand[t]) res;
};

81 Modeling example: production scheduling with intermediate storage. Batches flow from a manufacturing unit through storage tanks (capacities C1, C2, C3) to packing units.

82 Filling of a storage tank. Filling starts at time t and ends at t + b/r, where b is the batch size and r the manufacturing rate; packing starts at time u and ends at u + b/s, where s is the packing rate. The tank level peaks when filling ends, so the capacity constraint need only be enforced there.

83 The model:
min T (makespan)
s.t. T >= uj + bj/sj, all j
tj >= Rj, all j (job release times)
cumulative(t, v, e, m), with vi = ui + bi/si - ti (job durations) and e = (1, ..., 1) (m storage tanks)
cumulative(u, (b1/s1, ..., bn/sn), e, p) (p packing units)
bi (1 - si/ri) + si (ui - ti) <= Ci, all i (tank capacity)

84 Modeling example: employee scheduling. Schedule four nurses in 8-hour shifts. A nurse works at most one shift a day, at least 5 days a week. Same schedule every week. No shift staffed by more than two different nurses in a week. A nurse cannot work different shifts on two consecutive days. A nurse who works shift 2 or 3 must do so at least two days in a row.

85 Two ways to view the problem.
Assign nurses to shifts (Sun through Sat):
Shift 1: A B A A A A A
Shift 2: C C C B B B B
Shift 3: D D D D C C D
Assign shifts to nurses (0 = day off):
Nurse A: 1 0 1 1 1 1 1
Nurse B: 0 1 0 2 2 2 2
Nurse C: 2 2 2 0 3 3 0
Nurse D: 3 3 3 3 0 0 3

86 Use both formulations in the same model! First, assign nurses to shifts. Let w_sd = nurse assigned to shift s on day d. alldiff(w_1d, w_2d, w_3d), all d: the variables w_1d, w_2d, w_3d take different values; that is, schedule 3 different nurses on each day.

87 Use both formulations in the same model! First, assign nurses to shifts. Let w_sd = nurse assigned to shift s on day d. alldiff(w_1d, w_2d, w_3d), all d. cardinality(w, (A,B,C,D), (5,5,5,5), (6,6,6,6)): each of A, B, C, D occurs at least 5 and at most 6 times in the array w; that is, each nurse works at least 5 and at most 6 days a week.

88 Use both formulations in the same model! First, assign nurses to shifts. Let w_sd = nurse assigned to shift s on day d. alldiff(w_1d, w_2d, w_3d), all d. cardinality(w, (A,B,C,D), (5,5,5,5), (6,6,6,6)). nvalues((w_s,Sun, ..., w_s,Sat), 1, 2), all s: the variables w_s,Sun, ..., w_s,Sat take at least 1 and at most 2 different values; that is, at least 1 and at most 2 nurses work any given shift.

89 Remaining constraints are not easily expressed in this notation. So, assign shifts to nurses. Let y_id = shift assigned to nurse i on day d. alldiff(y_1d, y_2d, y_3d), all d: assign a different nurse to each shift on each day. This constraint is redundant of previous constraints, but redundant constraints speed solution.

90 Remaining constraints are not easily expressed in this notation. So, assign shifts to nurses. Let y_id = shift assigned to nurse i on day d. stretch((y_i,Sun, ..., y_i,Sat), (2,3), (2,2), (6,6), P), all i: every stretch of 2s has length between 2 and 6, and every stretch of 3s has length between 2 and 6. So a nurse who works shift 2 or 3 must do so at least two days in a row.

91 Remaining constraints are not easily expressed in this notation. So, assign shifts to nurses. Let y_id = shift assigned to nurse i on day d. In stretch((y_i,Sun, ..., y_i,Sat), (2,3), (2,2), (6,6), P), all i, the pattern set is P = {(s,0), (0,s) : s = 1,2,3}: whenever a stretch of a's immediately precedes a stretch of b's, (a,b) must be one of the pairs in P. So a nurse cannot switch shifts without taking at least one day off.

92 Now we must connect the w_sd variables to the y_id variables, using channeling constraints: y_{w_sd, d} = s, all s, d; w_{y_id, d} = i, all i, d. Channeling constraints increase propagation and make the problem easier to solve.
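The channeling between the two viewpoints can be demonstrated in Python on the schedule shown earlier. This is my own illustration: the w rows are taken from the slides' shift table, the y table is derived from them, and the assertion is exactly the channeling condition.

```python
# Shift table from the slides: w[s][d] = nurse on shift s, day d (Sun..Sat)
w = {1: "ABAAAAA", 2: "CCCBBBB", 3: "DDDDCCD"}
days = range(7)

# Derive y from w via the channeling: y[i][d] = shift of nurse i on day d
y = {i: [0] * 7 for i in "ABCD"}          # 0 = day off
for s, row in w.items():
    for d in days:
        y[row[d]][d] = s

# The channeling constraints hold by construction:
assert all(w[s][d] == next(i for i in "ABCD" if y[i][d] == s)
           for s in w for d in days)
print(y["A"])  # nurse A's weekly shifts, Sun..Sat
```

In a solver, stating both models plus these channeling constraints lets propagation in one viewpoint prune domains in the other.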

93 The complete model is:
alldiff(w_1d, w_2d, w_3d), all d
cardinality(w, (A,B,C,D), (5,5,5,5), (6,6,6,6))
nvalues((w_s,Sun, ..., w_s,Sat), 1, 2), all s
alldiff(y_1d, y_2d, y_3d), all d
stretch((y_i,Sun, ..., y_i,Sat), (2,3), (2,2), (6,6), P), all i
y_{w_sd, d} = s, all s, d
w_{y_id, d} = i, all i, d

94 CP Filtering Algorithms. Element. Alldiff. Disjunctive scheduling. Cumulative scheduling.

95 Filtering for element(y, (x1, ..., xn), z). Variable domains can be easily filtered to maintain GAC:
Dz becomes Dz ∩ (union over j ∈ Dy of Dxj)
Dy becomes Dy ∩ { j : Dxj ∩ Dz ≠ empty }
Dxj becomes Dxj ∩ Dz if Dy = {j}; otherwise Dxj is unchanged.

96 Filtering for element. Example: element(y, (x1, x2, x3, x4), z). The initial domains are:
Dz = {20,30,60,80,90}
Dy = {1,3,4}
Dx1 = {10,50}
Dx2 = {10,20}
Dx3 = {40,50,80,90}
Dx4 = {40,50,70}
The reduced domains are:
Dz = {80,90}
Dy = {3}
Dx1 = {10,50}
Dx2 = {10,20}
Dx3 = {80,90}
Dx4 = {40,50,70}
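The three reduction rules can be run to a fixed point in a short Python sketch (my own code, with domains as Python sets), reproducing the example's reduced domains:

```python
def filter_element(Dz, Dy, Dx):
    """GAC filtering for element(y, (x_1,...,x_n), z).  Dx maps index j
    to the domain of x_j; all domains are sets.  Iterates the reduction
    rules to a fixed point."""
    changed = True
    while changed:
        changed = False
        # y can take value j only if x_j's domain meets z's domain
        new_Dy = {j for j in Dy if Dx[j] & Dz}
        # z must lie in the union of the reachable x_j domains
        new_Dz = Dz & set().union(*(Dx[j] for j in new_Dy)) if new_Dy else set()
        changed |= (new_Dy != Dy) or (new_Dz != Dz)
        Dy, Dz = new_Dy, new_Dz
        if len(Dy) == 1:                 # then z = x_j, so filter x_j too
            j = next(iter(Dy))
            if Dx[j] - Dz:
                Dx[j] = Dx[j] & Dz
                changed = True
    return Dz, Dy, Dx

Dz = {20, 30, 60, 80, 90}
Dy = {1, 3, 4}
Dx = {1: {10, 50}, 2: {10, 20}, 3: {40, 50, 80, 90}, 4: {40, 50, 70}}
print(filter_element(Dz, Dy, Dx))
```

Indices 1 and 4 drop out of Dy because their x-domains miss Dz; Dz then shrinks to {80,90}, and since y is fixed to 3, Dx3 is intersected with Dz. Dx1, Dx2, Dx4 are untouched, as on the slide.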

97 Filtering for alldiff(y1, ..., yn). Domains can be filtered with an algorithm based on maximum cardinality bipartite matching and a theorem of Berge. It is a special case of optimality conditions for max flow.

98 Filtering for alldiff. Consider the domains: y1 ∈ {1}, y2 ∈ {2,3,5}, y3 ∈ {1,2,3,5}, y4 ∈ {1,5}, y5 ∈ {1,2,3,4,5,6}.

99-106 Indicate the domains with edges of a bipartite graph between the variables y1, ..., y5 and the values 1, ..., 6. Then:
Find a maximum cardinality bipartite matching.
Mark edges in alternating paths that start at an uncovered vertex.
Mark edges in alternating cycles.
Remove unmarked edges not in the matching.
(Slides 99-106 step through these operations on the graph.)

107 Filtering for alldiff. The domains have been filtered: y1 ∈ {1}, y2 ∈ {2,3,5} becomes {2,3}, y3 ∈ {1,2,3,5} becomes {2,3}, y4 ∈ {1,5} becomes {5}, y5 ∈ {1,2,3,4,5,6} becomes {4,6}. GAC achieved.
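The result of the matching-based algorithm can be verified by brute force: a value survives GAC filtering exactly when it appears in some all-different assignment. The sketch below (my own, exponential in general, fine for this 5-variable example) is a specification of what the polynomial matching algorithm computes, not a replacement for it.

```python
from itertools import product

def alldiff_gac(domains):
    """Brute-force GAC filtering for alldiff: keep a value only if it
    appears in some assignment of pairwise-distinct values.  Exponential
    in general; the matching-based algorithm does this in polynomial time."""
    solutions = [t for t in product(*domains) if len(set(t)) == len(t)]
    return [set(t[j] for t in solutions) for j in range(len(domains))]

domains = [{1}, {2, 3, 5}, {1, 2, 3, 5}, {1, 5}, {1, 2, 3, 4, 5, 6}]
print(alldiff_gac(domains))
```

The output matches the filtered domains on the slide: y1 forces 1, so y4 must be 5, so y2 and y3 share {2,3}, leaving {4,6} for y5.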

108 Disjunctive scheduling. Consider a disjunctive scheduling constraint: disjunctive((s1, ..., sn), (p1, ..., pn)), where s1, ..., sn are the start time variables.

109 Edge finding for disjunctive scheduling. Consider a disjunctive scheduling constraint: disjunctive((s1, ..., sn), (p1, ..., pn)), where p1, ..., pn are the processing times.

110 Edge finding for disjunctive scheduling. The variable domains are defined by time windows and processing times: s1 ∈ [0,10], p1 = 1; s2 ∈ [0,10], p2 = 3; s3 ∈ [2,7], p3 = 3; s5 ∈ [4,7], p5 = 2.

111 Edge finding for disjunctive scheduling. A feasible (min makespan) solution is shown within the time windows (figure).

112 Edge finding for disjunctive scheduling. But let's reduce two of the deadlines to 9.

113 Edge finding for disjunctive scheduling. But let's reduce two of the deadlines to 9. We will use edge finding to prove that there is no feasible schedule.

114-117 Edge finding for disjunctive scheduling. We can deduce that job 2 must precede jobs 3 and 5 (2 precedes {3,5}), because if job 2 is not first, there is not enough time for all 3 jobs within the time windows:
L{2,3,5} - E{3,5} < p{2,3,5}, i.e., 9 - 2 = 7 < 3 + 3 + 2,
where L{2,3,5} is the latest deadline, E{3,5} the earliest release time, and p{2,3,5} the total processing time.

118 Edge finding for disjunctive scheduling. Since job 2 must precede jobs 3 and 5, we can tighten the deadline of job 2 to the minimum of L{3} - p{3} = 7 - 3 = 4, L{5} - p{5} = 7 - 2 = 5, and L{3,5} - p{3,5} = 7 - 5 = 2, namely 2. Since the time window of job 2 is now too narrow, there is no feasible schedule.

119 Edge finding for disjunctive scheduling. In general, we can deduce that job k must precede all the jobs in set J if there is not enough time for all the jobs after the earliest release time of the jobs in J:
L(J ∪ {k}) - E(J) < p(J ∪ {k})
In the example, L{2,3,5} - E{3,5} < p{2,3,5}.

120 Edge finding for disjunctive scheduling — In general, we can deduce that job k must precede all the jobs in set J (k ≺ J) if

L_{J∪{k}} − E_J < p_{J∪{k}}

Now we can tighten the deadline for job k to:

min_{J'⊆J} { L_{J'} − p_{J'} }    (in the example, L_{3,5} − p_{3,5} = 2)

121 Edge finding for disjunctive scheduling — There is a symmetric rule: k ≻ J if there is not enough time for all the jobs before the latest deadline of the jobs in J:

L_J − E_{J∪{k}} < p_{J∪{k}}

Now we can tighten the release time of job k to:

max_{J'⊆J} { E_{J'} + p_{J'} }
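The deadline-tightening rule above can be sketched directly. This is a naive subset-enumeration version for illustration only (practical filters use the interval-based arguments on the next slides); the release times, deadlines, and processing times below are assumed values consistent with the example figures, not data stated explicitly on the slides.

```python
from itertools import combinations

def edge_finding_deadline(E, L, p, k, J):
    """If the edge-finding test proves k must precede every job in J
    (k precedes J), return job k's tightened deadline; else return L[k].
    E, L, p: dicts of release times, deadlines, processing times."""
    JK = set(J) | {k}
    # Test: L_{J∪{k}} − E_J < p_{J∪{k}}  implies  k ≺ J
    if max(L[j] for j in JK) - min(E[j] for j in J) < sum(p[j] for j in JK):
        # Tighten: L_k ← min over nonempty J' ⊆ J of  L_{J'} − p_{J'}
        best = L[k]
        for r in range(1, len(J) + 1):
            for Jp in combinations(J, r):
                best = min(best, max(L[j] for j in Jp) - sum(p[j] for j in Jp))
        return best
    return L[k]

# Assumed data matching the figures: window [2,9] of length 7, p's 3+3+2 = 8
E = {2: 0, 3: 2, 5: 2}
L = {2: 9, 3: 7, 5: 7}
p = {2: 3, 3: 3, 5: 2}
L2 = edge_finding_deadline(E, L, p, k=2, J={3, 5})
print(L2)                   # 2
print(L2 - E[2] < p[2])     # True: job 2's window is too narrow, infeasible
```

With the assumed data the deadline of job 2 drops from 9 to 2, reproducing the infeasibility proof of slide 118.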

122 Edge finding for disjunctive scheduling — Problem: how can we avoid enumerating all subsets J of jobs to find edges (testing L_{J∪{k}} − E_J < p_{J∪{k}}), and all subsets J' of J to tighten the bounds (computing min_{J'⊆J} { L_{J'} − p_{J'} })?

123 Edge finding for disjunctive scheduling — Key result: we only have to consider sets J whose time windows lie within some interval (e.g., J = {3,5}).

124 Edge finding for disjunctive scheduling — Key result: we only have to consider sets J whose time windows lie within some interval (e.g., J = {3,5}). Removing a job from those within an interval only weakens the test L_{J∪{k}} − E_J < p_{J∪{k}}. There are a polynomial number of intervals, defined by release times and deadlines.

125 Edge finding for disjunctive scheduling — Note: edge finding does not achieve bounds consistency, which is an NP-hard problem.

126 Edge finding for disjunctive scheduling — One O(n²) algorithm is based on the Jackson pre-emptive schedule (JPS). Using a different example, the JPS is: [figure]

127 Edge finding for disjunctive scheduling — One O(n²) algorithm is based on the Jackson pre-emptive schedule (JPS). For each job i:
– Scan the jobs k in J_i (the jobs unfinished at time E_i in the JPS) in decreasing order of L_k.
– Select the first k for which L_k − E_i < p_i + p_{J_ik}, where J_ik is the set of jobs j ≠ i in J_i with L_j ≤ L_k, and p_{J_ik} is the total remaining processing time in the JPS of the jobs in J_ik.
– Conclude that i ≻ J_ik, and update E_i to JPS(i,k), the latest completion time in the JPS of the jobs in J_ik.

128 Not-first/not-last rules — We can deduce that job 4 cannot precede jobs 1 and 2: ¬(4 ≺ {1,2}). Because if job 4 is first, there is too little time to complete the jobs before the later deadline of jobs 1 and 2:

L_{1,2} − E_4 < p_1 + p_2 + p_4    [Figure: window from E_4 to L_{1,2}; here 6 < 1 + 3 + 3.]

129 Not-first/not-last rules — Given ¬(4 ≺ {1,2}), we can tighten the release time of job 4 to the minimum of:

E_1 + p_1 = 3
E_2 + p_2 = 4

130 Not-first/not-last rules — In general, we can deduce that job k cannot precede all the jobs in J, ¬(k ≺ J), if there is too little time after the release time of job k to complete all the jobs before the latest deadline in J:

L_J − E_k < p_{J∪{k}}

Now we can update E_k to min_{j∈J} { E_j + p_j }.

131 Not-first/not-last rules — In general, ¬(k ≺ J) if L_J − E_k < p_{J∪{k}}; then we can update E_k to min_{j∈J} { E_j + p_j }. There is a symmetric not-last rule. The rules can be applied in polynomial time, although an efficient algorithm is quite complicated.
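The not-first rule above is a one-test filter; here is a minimal sketch. The job data are assumed values consistent with the example figures (6 < 1 + 3 + 3), not data stated explicitly on the slides.

```python
def not_first(E, L, p, k, J):
    """Not-first rule: if there is too little time after E_k to finish
    all of J ∪ {k} by the latest deadline in J, job k cannot be first;
    then E_k can be raised to min over j in J of (E_j + p_j)."""
    if max(L[j] for j in J) - E[k] < sum(p[j] for j in J) + p[k]:
        return max(E[k], min(E[j] + p[j] for j in J))
    return E[k]

# Assumed data: L_{1,2} − E_4 = 6 < p_1 + p_2 + p_4 = 1 + 3 + 3
E = {1: 2, 2: 1, 4: 0}
L = {1: 6, 2: 6}
p = {1: 1, 2: 3, 4: 3}
print(not_first(E, L, p, k=4, J={1, 2}))   # 3, as on slide 129
```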

132 Cumulative scheduling — Consider a cumulative scheduling constraint:

cumulative((s_1,s_2,s_3), (p_1,p_2,p_3), (c_1,c_2,c_3), C)

A feasible solution: [figure]

133 Edge finding for cumulative scheduling — We can deduce that job 3 must finish after the others finish: 3 ≻ {1,2}. Because the total energy required exceeds the area between the earliest release time and the later deadline of jobs 1 and 2:

e_3 + e_{1,2} > C (L_{1,2} − E_{1,2,3})

134–135 Edge finding for cumulative scheduling — We can deduce that job 3 must finish after the others finish, 3 ≻ {1,2}, because the total energy required exceeds the area available:

e_3 + e_{1,2} > C (L_{1,2} − E_{1,2,3})

Total energy required = 22; area available = C (L_{1,2} − E_{1,2,3}).

136–138 Edge finding for cumulative scheduling — Since 3 ≻ {1,2}, we can update the release time of job 3 to

E_{1,2} + [ e_{1,2} − (C − c_3)(L_{1,2} − E_{1,2}) ] / c_3

Energy available for jobs 1 and 2 if space is left for job 3 to start anytime = (C − c_3)(L_{1,2} − E_{1,2}) = 10; excess energy required by jobs 1 and 2 = 4. So move the release time of job 3 up 4/2 = 2 units beyond E_{1,2}.

139 Edge finding for cumulative scheduling — In general, if

e_{J∪{k}} > C (L_J − E_{J∪{k}})

then k ≻ J, and we update E_k to

max over J'⊆J with e_{J'} − (C − c_k)(L_{J'} − E_{J'}) > 0 of  E_{J'} + [ e_{J'} − (C − c_k)(L_{J'} − E_{J'}) ] / c_k

Symmetrically, if e_{J∪{k}} > C (L_{J∪{k}} − E_J) then k ≺ J, and we update L_k to

min over J'⊆J with e_{J'} − (C − c_k)(L_{J'} − E_{J'}) > 0 of  L_{J'} − [ e_{J'} − (C − c_k)(L_{J'} − E_{J'}) ] / c_k

140 Edge finding for cumulative scheduling — There is an O(n²) algorithm that finds all applications of the edge finding rules.
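The energy-based test and update can be sketched for a single pair (k, J), taking J' = J only (the full rule maximizes over subsets J'). The capacities and energies below are assumed values chosen to match the figures (total energy 22, area 20, excess 4, c_3 = 2); they are not stated explicitly on the slides.

```python
def cumulative_edge_find_release(E, L, e, c, C, k, J):
    """Cumulative edge-finding sketch for one pair (k, J).
    e[j] is job j's energy (c_j * p_j).  If the energy of J ∪ {k}
    exceeds the area C*(L_J − E_{J∪{k}}), job k must finish last,
    and E_k can be raised to E_J + (excess energy of J)/c_k."""
    EJk = min(E[j] for j in list(J) + [k])
    LJ = max(L[j] for j in J)
    if sum(e[j] for j in list(J) + [k]) > C * (LJ - EJk):
        EJ = min(E[j] for j in J)
        excess = sum(e[j] for j in J) - (C - c[k]) * (LJ - EJ)
        if excess > 0:
            return max(E[k], EJ + excess / c[k])
    return E[k]

# Assumed data: C = 4, c_3 = 2, e_{1,2} = 14, e_3 = 8, window [0, 5]
E = {1: 0, 2: 0, 3: 0}
L = {1: 5, 2: 5}
e = {1: 9, 2: 5, 3: 8}
c = {3: 2}
print(cumulative_edge_find_release(E, L, e, c, C=4, k=3, J={1, 2}))  # 2.0
```

With these numbers the test reads 22 > 4·5 and the release time of job 3 moves up by 4/2 = 2 units, as on slides 136–138.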

141 Other propagation rules for cumulative scheduling — Extended edge finding. Timetabling. Not-first/not-last rules. Energetic reasoning.

142 Linear Relaxation — Why Relax? Algebraic Analysis of LP. Linear Programming Duality. LP-Based Domain Filtering. Example: Single-Vehicle Routing. Disjunctions of Linear Systems.

143 Why Relax? Solving a relaxation of a problem can: Tighten variable bounds. Possibly solve the original problem. Guide the search in a promising direction. Filter domains using reduced costs or Lagrange multipliers. Prune the search tree using a bound on the optimal value. Provide a more global view, because a single OR relaxation can pool relaxations of several constraints.

144 Some OR models that can provide relaxations: Linear programming (LP). Mixed integer linear programming (MILP) — which can itself be relaxed as an LP, and whose LP relaxation can be strengthened with cutting planes. Lagrangean relaxation. Specialized relaxations, for particular problem classes and for global constraints.

145 Motivation — Linear programming is remarkably versatile for representing real-world problems. LP is by far the most widely used tool for relaxation. LP relaxations can be strengthened by cutting planes, based on polyhedral analysis. LP has an elegant and powerful duality theory, useful for domain filtering and much else. The LP problem is extremely well solved.

146 Algebraic Analysis of LP — An example:

min 4x_1 + 7x_2
s.t. 2x_1 + 3x_2 ≥ 6
     2x_1 + x_2 ≥ 4
     x_1, x_2 ≥ 0

Optimal solution x = (3,0). [Figure: constraint lines 2x_1 + x_2 ≥ 4 and 2x_1 + 3x_2 ≥ 6, with the objective contour 4x_1 + 7x_2 = 12 touching the feasible set at (3,0).]

147 Algebraic Analysis of LP — Rewrite

min 4x_1 + 7x_2
s.t. 2x_1 + 3x_2 ≥ 6
     2x_1 + x_2 ≥ 4
     x_1, x_2 ≥ 0

with surplus variables as

min 4x_1 + 7x_2
s.t. 2x_1 + 3x_2 − x_3 = 6
     2x_1 + x_2 − x_4 = 4
     x_1, x_2, x_3, x_4 ≥ 0

In general an LP has the form min { cx : Ax = b, x ≥ 0 }.

148 Algebraic analysis of LP — Write min { cx : Ax = b, x ≥ 0 } as

min c_B x_B + c_N x_N
s.t. B x_B + N x_N = b
     x_B, x_N ≥ 0

where A = [B N] is an m×n matrix, x_B are the basic variables, and x_N the nonbasic variables. B consists of any set of m linearly independent columns of A; these form a basis for the space spanned by the columns.

149 Algebraic analysis of LP — With A = [B N], solve the constraint equation B x_B + N x_N = b for x_B:

x_B = B⁻¹b − B⁻¹N x_N

All solutions can be obtained by setting x_N to some value. The solution is basic if x_N = 0. It is a basic feasible solution if x_N = 0 and x_B ≥ 0.

150 Example —

min 4x_1 + 7x_2
s.t. 2x_1 + 3x_2 − x_3 = 6
     2x_1 + x_2 − x_4 = 4
     x_1, x_2, x_3, x_4 ≥ 0

[Figure: the lines 2x_1 + x_2 = 4 and 2x_1 + 3x_2 = 6 in (x_1, x_2) space. Each choice of basic pair — {x_2,x_3}, {x_2,x_4}, {x_1,x_2}, {x_3,x_4}, {x_1,x_3}, {x_1,x_4} — corresponds to an intersection point; the basic feasible solutions are marked.]

151 Algebraic analysis of LP — Solving the constraint equation gives x_B = B⁻¹b − B⁻¹N x_N. Expressing the cost in terms of the nonbasic variables:

c_B B⁻¹b + (c_N − c_B B⁻¹N) x_N

where c_N − c_B B⁻¹N is the vector of reduced costs. Since x_N ≥ 0, the basic solution (x_B, 0) is optimal if the reduced costs are nonnegative.

152 Example — Consider the basic feasible solution with x_1, x_4 basic:

min 4x_1 + 7x_2
s.t. 2x_1 + 3x_2 − x_3 = 6
     2x_1 + x_2 − x_4 = 4
     x_1, x_2, x_3, x_4 ≥ 0

153–156 Example — Write the problem as min c_B x_B + c_N x_N subject to B x_B + N x_N = b, with x_B = (x_1, x_4) and x_N = (x_2, x_3):

c_B = [4 0], c_N = [7 0], B = [2 0; 2 −1], N = [3 −1; 1 0], b = (6, 4)

The basic solution is

x_B = B⁻¹b = [1/2 0; 1 −1] (6, 4) = (3, 2)

i.e. (x_1, x_4) = (3, 2) with x_2 = x_3 = 0. The reduced costs are

c_N − c_B B⁻¹N = [7 0] − [4 0][1/2 0; 1 −1] N = [7 0] − [6 −2] = [1 2] ≥ [0 0]

so the solution x = (3, 0) is optimal.
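The basic-solution and reduced-cost computations above can be reproduced numerically; a minimal sketch with numpy:

```python
import numpy as np

# LP in equality form: min 4x1 + 7x2
#   2x1 + 3x2 − x3 = 6,  2x1 + x2 − x4 = 4,  x ≥ 0
A = np.array([[2., 3., -1., 0.], [2., 1., 0., -1.]])
b = np.array([6., 4.])
cost = np.array([4., 7., 0., 0.])

basic, nonbasic = [0, 3], [1, 2]        # x1, x4 basic; x2, x3 nonbasic
B, N = A[:, basic], A[:, nonbasic]
xB = np.linalg.solve(B, b)              # basic solution: (x1, x4)
r = cost[nonbasic] - cost[basic] @ np.linalg.inv(B) @ N   # reduced costs
print(xB)   # [3. 2.]
print(r)    # [1. 2.] — nonnegative, so (x1, x2) = (3, 0) is optimal
```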

157 Linear Programming Duality — An LP can be viewed as an inference problem:

min { cx : Ax ≥ b, x ≥ 0 } = max { v : Ax ≥ b, x ≥ 0 implies cx ≥ v }

Dual problem: find the tightest lower bound on the objective function that is implied by the constraints.

158 An LP can be viewed as an inference problem: min { cx : Ax ≥ b, x ≥ 0 } = max { v : Ax ≥ b, x ≥ 0 implies cx ≥ v }. That is, some surrogate (nonnegative linear combination) of Ax ≥ b dominates cx ≥ v. From the Farkas Lemma: if Ax ≥ b, x ≥ 0 is feasible, then Ax ≥ b, x ≥ 0 implies cx ≥ v iff λAx ≥ λb dominates cx ≥ v for some λ ≥ 0, i.e. λA ≤ c and λb ≥ v.

159 An LP can be viewed as an inference problem:

min { cx : Ax ≥ b, x ≥ 0 } = max { v : Ax ≥ b, x ≥ 0 implies cx ≥ v } = max { λb : λA ≤ c, λ ≥ 0 }

The last problem is the classical LP dual.

160 This equality is called strong duality:

min { cx : Ax ≥ b, x ≥ 0 } = max { λb : λA ≤ c, λ ≥ 0 }    (if Ax ≥ b, x ≥ 0 is feasible)

The right-hand side is the classical LP dual. Note that the dual of the dual is the primal (i.e., the original LP).

161 Example — Primal:

min 4x_1 + 7x_2
s.t. 2x_1 + 3x_2 ≥ 6   (λ_1)
     2x_1 + x_2 ≥ 4   (λ_2)
     x_1, x_2 ≥ 0

Dual:

max 6λ_1 + 4λ_2
s.t. 2λ_1 + 2λ_2 ≤ 4   (x_1)
     3λ_1 + λ_2 ≤ 7   (x_2)
     λ_1, λ_2 ≥ 0

A dual solution is (λ_1, λ_2) = (2, 0). With these dual multipliers, the surrogate 2·(2x_1 + 3x_2 ≥ 6) + 0·(2x_1 + x_2 ≥ 4), i.e. 4x_1 + 6x_2 ≥ 12, dominates 4x_1 + 7x_2 ≥ 12 — the tightest bound on cost.
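The primal-dual pair above can be checked numerically; a minimal sketch using scipy, with the ≥ constraints negated into scipy's ≤ form:

```python
from scipy.optimize import linprog

# Primal: min 4x1 + 7x2  s.t. 2x1 + 3x2 >= 6, 2x1 + x2 >= 4, x >= 0
res = linprog(c=[4, 7], A_ub=[[-2, -3], [-2, -1]], b_ub=[-6, -4])

lam = (2, 0)  # proposed dual solution
# dual feasibility: 2λ1 + 2λ2 <= 4, 3λ1 + λ2 <= 7, λ >= 0
assert 2*lam[0] + 2*lam[1] <= 4 and 3*lam[0] + lam[1] <= 7

print(res.fun)               # 12.0, attained at x = (3, 0)
print(6*lam[0] + 4*lam[1])   # 12 — λb equals cx*, illustrating strong duality
```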

162 Weak Duality — If x* is feasible in the primal problem min { cx : Ax ≥ b, x ≥ 0 } and λ* is feasible in the dual problem max { λb : λA ≤ c, λ ≥ 0 }, then cx* ≥ λ*b. This is because

cx* ≥ λ*Ax* ≥ λ*b

where the first inequality holds since λ* is dual feasible and x* ≥ 0, and the second since x* is primal feasible and λ* ≥ 0.

163 Dual multipliers as marginal costs — Suppose we perturb the RHS of an LP (i.e., change the requirement levels): min { cx : Ax ≥ b + Δb, x ≥ 0 }. The dual of the perturbed LP has the same constraints as the original LP: max { λ(b + Δb) : λA ≤ c, λ ≥ 0 }. So an optimal solution λ* of the original dual is feasible in the perturbed dual.

164 Dual multipliers as marginal costs — Suppose we perturb the RHS of an LP: min { cx : Ax ≥ b + Δb, x ≥ 0 }. By weak duality, the optimal value of the perturbed LP is at least λ*(b + Δb) = λ*b + λ*Δb, where λ*b is the optimal value of the original LP, by strong duality. So λ_i* is a lower bound on the marginal cost of increasing the i-th requirement by one unit (Δb_i = 1). If λ_i* > 0, the i-th constraint must be tight (complementary slackness).

165 Dual of an LP in equality form — Primal:

min c_B x_B + c_N x_N
s.t. B x_B + N x_N = b   (λ)
     x_B, x_N ≥ 0

Dual:

max λb
s.t. λB ≤ c_B   (x_B)
     λN ≤ c_N   (x_N)
     λ unrestricted

166 Dual of an LP in equality form — Recall that the reduced cost vector is c_N − c_B B⁻¹N = c_N − λN, where λ = c_B B⁻¹. This λ solves the dual if (x_B, 0) solves the primal.

167 Dual of an LP in equality form — Check that λ = c_B B⁻¹ is dual feasible: λB = c_B B⁻¹B = c_B, and λN = c_B B⁻¹N ≤ c_N because the reduced cost c_N − c_B B⁻¹N is nonnegative at the optimal solution (x_B, 0).

168 Dual of an LP in equality form — In the example,

λ = c_B B⁻¹ = [4 0] [1/2 0; 1 −1] = [2 0]

which solves the dual, since (x_B, 0) solves the primal.

169 Dual of an LP in equality form — Note that the reduced cost of an individual variable x_j is r_j = c_j − λA_j, where A_j is column j of A.

170 LP-based Domain Filtering — Let min { cx : Ax ≥ b, x ≥ 0 } be an LP relaxation of a CP problem. One way to filter the domain of x_j is to minimize and maximize x_j subject to Ax ≥ b, x ≥ 0 — but this is time consuming. A faster method is to use dual multipliers to derive valid inequalities. A special case of this method uses reduced costs to bound or fix variables; reduced-cost variable fixing is a widely used technique in OR.

171 Suppose: min { cx : Ax ≥ b, x ≥ 0 } has optimal solution x*, optimal value v*, and optimal dual solution λ*; λ_i* > 0, which means the i-th constraint is tight (complementary slackness); the LP is a relaxation of a CP problem; and we have a feasible solution of the CP problem with value U, so that U is an upper bound on the optimal value.

172 Supposing min { cx : Ax ≥ b, x ≥ 0 } has optimal solution x*, optimal value v*, and optimal dual solution λ*: If x were to change to a value other than x*, the LHS of the i-th constraint A_i x ≥ b_i would change by some amount Δb_i. Since the constraint is tight, this would increase the optimal value as much as changing the constraint to A_i x ≥ b_i + Δb_i. So it would increase the optimal value by at least λ_i* Δb_i.

173 We have found: a change in x that changes A_i x by Δb_i increases the optimal value of the LP by at least λ_i* Δb_i. Since optimal value of the LP ≤ optimal value of the CP ≤ U, we have λ_i* Δb_i ≤ U − v*, or

Δb_i ≤ (U − v*) / λ_i*

174 We have found: λ_i* Δb_i ≤ U − v*. Since Δb_i = A_i x − A_i x* = A_i x − b_i, this implies the inequality

A_i x ≤ b_i + (U − v*) / λ_i*

which can be propagated.

175 Example —

min 4x_1 + 7x_2
s.t. 2x_1 + 3x_2 ≥ 6   (λ_1 = 2)
     2x_1 + x_2 ≥ 4   (λ_2 = 0)
     x_1, x_2 ≥ 0

with optimal value v* = 12. Suppose we have a feasible solution of the original CP with value U = 13. Since the first constraint is tight, we can propagate the inequality A_1 x ≤ b_1 + (U − v*)/λ_1*, or

2x_1 + 3x_2 ≤ 6 + (13 − 12)/2 = 6.5

176 Reduced-cost domain filtering — Suppose x_j* = 0, which means the constraint x_j ≥ 0 is tight. The inequality A_i x ≤ b_i + (U − v*)/λ_i* becomes

x_j ≤ (U − v*) / r_j

The dual multiplier for x_j ≥ 0 is the reduced cost r_j of x_j, because increasing x_j (currently 0) by 1 increases the optimal cost by r_j. Similar reasoning can bound a variable below when it is at its upper bound.

177 Example — For the same LP (λ_1 = 2, λ_2 = 0, v* = 12), suppose we have a feasible solution of the original CP with value U = 13. Since x_2* = 0 and r_2 = 1, we have

x_2 ≤ (U − v*) / r_2 = (13 − 12)/1 = 1

If x_2 is required to be integer and we seek a solution better than U, we can fix it to zero. This is reduced-cost variable fixing.
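Both filters above reduce to one-line formulas; a trivial sketch with the numbers from the two examples:

```python
def dual_filter_rhs(b_i, lam_i, v_star, U):
    """Valid inequality A_i x <= b_i + (U - v*)/λ_i* from a tight constraint."""
    return b_i + (U - v_star) / lam_i

def reduced_cost_bound(r_j, v_star, U):
    """Bound x_j <= (U - v*)/r_j for a nonbasic x_j at its lower bound 0."""
    return (U - v_star) / r_j

# Slide data: v* = 12, incumbent U = 13, λ1* = 2, r2 = 1
print(dual_filter_rhs(6, 2, 12, 13))    # 6.5 → propagate 2x1 + 3x2 <= 6.5
print(reduced_cost_bound(1, 12, 13))    # 1.0 → x2 <= 1
```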

178 Example: Single-Vehicle Routing — A vehicle must make several stops and return home, perhaps subject to time windows. The objective is to find the order of stops that minimizes travel time. This is also known as the traveling salesman problem (with time windows). [Figure: stops i and j with travel time c_ij.]

179 Assignment Relaxation —

min Σ_ij c_ij x_ij
s.t. Σ_j x_ij = Σ_j x_ji = 1, all i
     x_ij ∈ {0,1}

where x_ij = 1 if stop i immediately precedes stop j. Each stop i is preceded and followed by exactly one stop.

180 Assignment Relaxation —

min Σ_ij c_ij x_ij
s.t. Σ_j x_ij = Σ_j x_ji = 1, all i
     0 ≤ x_ij ≤ 1, all i, j

Because this problem is totally unimodular, it can be solved as an LP. The relaxation provides a very weak lower bound on the optimal value, but reduced-cost variable fixing can be very useful in a CP context.
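The assignment relaxation is cheap to solve exactly; a minimal sketch with scipy's assignment solver, on an illustrative (assumed) travel-time matrix. A huge diagonal cost forbids a stop preceding itself:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Assumed symmetric travel times among 4 stops; diagonal forbidden
c = np.array([[9e9,   2,   9,   7],
              [  2, 9e9,   4,   3],
              [  9,   4, 9e9,   8],
              [  7,   3,   8, 9e9]])
rows, cols = linear_sum_assignment(c)   # solves the assignment relaxation
bound = c[rows, cols].sum()
print(bound)   # 20.0 — the solution uses subtours 0↔1 and 2↔3
```

Here the bound 20 comes from two subtours, while the best single tour (0-1-2-3-0) costs 21 — illustrating why the relaxation's bound is weak.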

181 Disjunctions of linear systems — Disjunctions of linear systems often occur naturally in problems and can be given a convex hull relaxation. A disjunction of linear systems represents a union of polyhedra:

min cx
s.t. ⋁_k (A^k x ≥ b^k)

182 Relaxing a disjunction of linear systems — The disjunction min { cx : ⋁_k (A^k x ≥ b^k) } represents a union of polyhedra. We want a convex hull relaxation (the tightest linear relaxation).

183 Relaxing a disjunction of linear systems — The closure of the convex hull of min { cx : ⋁_k (A^k x ≥ b^k) } is described by

min cx
s.t. A^k x^k ≥ b^k y_k, all k
     x = Σ_k x^k
     Σ_k y_k = 1
     0 ≤ y_k ≤ 1

184–185 Why? To derive the convex hull relaxation of a disjunction, write each solution x as a convex combination of points x̄^k in the polyhedra:

min cx
s.t. A^k x̄^k ≥ b^k, all k
     x = Σ_k y_k x̄^k
     Σ_k y_k = 1
     0 ≤ y_k ≤ 1

Then apply the change of variable x^k = y_k x̄^k:

min cx
s.t. A^k x^k ≥ b^k y_k, all k
     x = Σ_k x^k
     Σ_k y_k = 1
     0 ≤ y_k ≤ 1

[Figure: points x̄^1, x̄^2 in two polyhedra; the convex hull relaxation is the tightest linear relaxation.]

186 Mixed Integer/Linear Modeling — MILP Representability. Disjunctive Modeling. Knapsack Modeling.

187 Motivation — A mixed integer/linear programming (MILP) problem has the form

min cx + dy
s.t. Ax + By ≥ b
     x, y ≥ 0
     y integer

We can relax a CP problem by modeling some constraints with an MILP. If desired, we can then relax the MILP by dropping the integrality constraint, to obtain an LP. The LP relaxation can be strengthened with cutting planes. The first step is to learn how to write MILP models.

188 MILP Representability — A subset S of ℝⁿ is MILP representable if it is the projection onto x of some MILP constraint set of the form

Ax + Bu + Dy ≥ b
x ∈ ℝⁿ, u ∈ ℝᵐ, y ∈ {0,1}^k

189 MILP Representability — A subset S of ℝⁿ is MILP representable if it is the projection onto x of some MILP constraint set Ax + Bu + Dy ≥ b, x ∈ ℝⁿ, u ∈ ℝᵐ, y ∈ {0,1}^k. Theorem: S is MILP representable if and only if S is the union of finitely many polyhedra having the same recession cone. [Figure: a polyhedron and its recession cone.]

190–194 Example: Fixed charge function — Minimize a fixed charge function:

min x_2
s.t. x_2 ≥ 0 if x_1 = 0
     x_2 ≥ f + c x_1 if x_1 > 0
     x_1 ≥ 0

[Figure: the feasible set is the union of two polyhedra: P_1, the vertical half-line x_1 = 0, x_2 ≥ 0, and P_2, the region on or above the line x_2 = f + c x_1 for x_1 ≥ 0. The two polyhedra have different recession cones.]

195 Example — Add an upper bound on x_1:

min x_2
s.t. x_2 ≥ 0 if x_1 = 0
     x_2 ≥ f + c x_1 if x_1 > 0
     0 ≤ x_1 ≤ M

Now the polyhedra have the same recession cone. [Figure: P_1, P_2 with common vertical recession cone.]

196 Modeling a union of polyhedra — Start with a disjunction of linear systems to represent the union of polyhedra; the kth polyhedron is { x : A^k x ≥ b^k }:

min cx
s.t. ⋁_k (A^k x ≥ b^k)

Introduce a 0-1 variable y_k that is 1 when x is in polyhedron k, and disaggregate x to create an x^k for each k:

min cx
s.t. A^k x^k ≥ b^k y_k, all k
     x = Σ_k x^k
     Σ_k y_k = 1
     y_k ∈ {0,1}

197 Example — Start with a disjunction of linear systems to represent the union of polyhedra:

min x_2
s.t. (x_1 = 0, x_2 ≥ 0) ∨ (0 ≤ x_1 ≤ M, x_2 ≥ f + c x_1)

[Figure: polyhedra P_1 and P_2 as before.]

198 Example — Introduce a 0-1 variable y_k that is 1 when x is in polyhedron k, and disaggregate x into x = x^1 + x^2:

min x_2
s.t. x_1^1 = 0, x_2^1 ≥ 0
     0 ≤ x_1^2 ≤ M y_2, x_2^2 − c x_1^2 ≥ f y_2
     y_1 + y_2 = 1, y_k ∈ {0,1}
     x = x^1 + x^2

199 Example — To simplify: replace x_1^2 with x_1, replace x_2^2 with x_2, and replace y_2 with y. This yields

min x_2
s.t. 0 ≤ x_1 ≤ M y
     x_2 ≥ f y + c x_1
     y ∈ {0,1}

or

min f y + c x_1
s.t. 0 ≤ x_1 ≤ M y
     y ∈ {0,1}

where M is the big-M bound.
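The big-M model above comes with a well-known caveat that can be seen numerically: in the continuous relaxation (0 ≤ y ≤ 1), the cheapest y for a given x_1 > 0 is y = x_1/M, so the fixed charge nearly vanishes as M grows. A tiny sketch with assumed values f = 100, c = 2:

```python
# LP relaxation of  min f*y + c*x1,  0 <= x1 <= M*y,  0 <= y <= 1:
# the optimal y is x1/M, so the relaxed cost of producing x1 is
# c*x1 + f*x1/M — far below the true cost f + c*x1 when M is large.
f, c, x1 = 100.0, 2.0, 5.0
relaxed = []
for M in (10.0, 100.0, 1000.0):
    y = x1 / M
    relaxed.append(c * x1 + f * y)
print(relaxed)   # [60.0, 15.0, 10.5] — vs. true cost f + c*x1 = 110.0
```

This is why M should be chosen as small as the data allow.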

200 Disjunctive Modeling — Disjunctions often occur naturally in problems and can be given an MILP model. Recall that a disjunction of linear systems min { cx : ⋁_k (A^k x ≥ b^k) }, representing polyhedra with the same recession cone, has the MILP model

min cx
s.t. A^k x^k ≥ b^k y_k, all k
     x = Σ_k x^k
     Σ_k y_k = 1
     y_k ∈ {0,1}

201 Example: Uncapacitated facility location — There are m possible factory locations and n markets. Locate factories to serve markets so as to minimize total fixed cost and transport cost. There is no limit on the production capacity of each factory. [Figure: location i with fixed cost f_i serving market j at transport cost c_ij.]

202 Uncapacitated facility location — Disjunctive model:

min Σ_i z_i + Σ_ij c_ij x_ij
s.t. (x_ij = 0 for all j, z_i = 0) ∨ (0 ≤ x_ij ≤ 1 for all j, z_i = f_i), all i
     Σ_i x_ij = 1, all j

where x_ij is the fraction of market j's demand satisfied from location i. The first disjunct says there is no factory at location i; the second says there is a factory at location i.

203 Uncapacitated facility location — MILP formulation of the disjunctive model:

min Σ_i f_i y_i + Σ_ij c_ij x_ij
s.t. 0 ≤ x_ij ≤ y_i, all i, j
     Σ_i x_ij = 1, all j
     y_i ∈ {0,1}

204 Uncapacitated facility location — MILP formulation:

min Σ_i f_i y_i + Σ_ij c_ij x_ij
s.t. 0 ≤ x_ij ≤ y_i, all i, j
     y_i ∈ {0,1}

Beginner's model:

min Σ_i f_i y_i + Σ_ij c_ij x_ij
s.t. Σ_j x_ij ≤ n y_i, all i
     y_i ∈ {0,1}

where n y_i is the maximum output from location i, based on the capacitated location model. It has a weaker continuous relaxation (obtained by replacing y_i ∈ {0,1} with 0 ≤ y_i ≤ 1). This beginner's mistake can be avoided by starting with the disjunctive formulation.

205 Knapsack Modeling — Knapsack models consist of knapsack covering and knapsack packing constraints. The freight transfer model presented earlier is an example. We will consider a similar example that combines disjunctive and knapsack modeling. Most OR professionals are unlikely to write a model as good as the one presented here.

206 Note on tightness of knapsack models — The continuous relaxation of a knapsack model is not in general a convex hull relaxation. A disjunctive formulation would provide a convex hull relaxation, but there are exponentially many disjuncts. Knapsack cuts can significantly tighten the relaxation.

207 Example: Package transport — Each package j has size a_j. Each truck i has capacity Q_i and costs c_i to operate. Disjunctive model:

min Σ_i c_i y_i
s.t. Σ_i Q_i y_i ≥ Σ_j a_j
     Σ_i x_ij = 1, all j
     (y_i = 1, Σ_j a_j x_ij ≤ Q_i, 0 ≤ x_ij ≤ 1 for all j) ∨ (y_i = 0, x_ij = 0 for all j), all i
     x_ij, y_i ∈ {0,1}

where y_i = 1 if truck i is used and x_ij = 1 if truck i carries package j. The capacity constraints are knapsack constraints; the second disjunct covers the case that truck i is not used.

208 Example: Package transport — MILP model:

min Σ_i c_i y_i
s.t. Σ_i Q_i y_i ≥ Σ_j a_j
     Σ_i x_ij = 1, all j
     Σ_j a_j x_ij ≤ Q_i y_i, all i
     x_ij ≤ y_i, all i, j
     x_ij, y_i ∈ {0,1}

209 Example: Package transport — In the MILP model, the constraint Σ_i Q_i y_i ≥ Σ_j a_j is a modeling trick that is unobvious without the disjunctive approach. Most OR professionals would omit this constraint, since it is the sum over i of the capacity constraints Σ_j a_j x_ij ≤ Q_i y_i. But it generates very effective knapsack cuts.

210 Cutting Planes — 0-1 Knapsack Cuts. Gomory Cuts. Mixed Integer Rounding Cuts. Example: Product Configuration.

211 To review — A cutting plane (cut, valid inequality) for an MILP model: is valid — it is satisfied by all feasible solutions of the model — and cuts off solutions of the continuous relaxation, which makes the relaxation tighter. [Figure: feasible solutions, continuous relaxation, and a cutting plane.]

212 Motivation — Cutting planes (cuts) tighten the continuous relaxation of an MILP model. Knapsack cuts are generated for individual knapsack constraints; we saw general integer knapsack cuts earlier, and 0-1 knapsack cuts and lifting techniques are well studied and widely used. Rounding cuts are generated for the entire MILP and are widely used: Gomory cuts for problems with integer variables only, and mixed integer rounding cuts for any MILP.

213 0-1 Knapsack Cuts — 0-1 knapsack cuts are designed for knapsack constraints with 0-1 variables. The analysis is different from that of general knapsack constraints, to exploit the special structure of 0-1 inequalities.

214 0-1 Knapsack Cuts — Consider a 0-1 knapsack packing constraint ax ≤ a_0. (Knapsack covering constraints are similarly analyzed.) Index set J is a cover if Σ_{j∈J} a_j > a_0. The cover inequality

Σ_{j∈J} x_j ≤ |J| − 1

is a 0-1 knapsack cut for ax ≤ a_0. Only minimal covers need be considered.

215 Example — J = {1,2,3,4} is a cover for

6x_1 + 5x_2 + 5x_3 + 5x_4 + 8x_5 + 3x_6 ≤ 17

since 6 + 5 + 5 + 5 = 21 > 17. This gives rise to the cover inequality x_1 + x_2 + x_3 + x_4 ≤ 3. Only minimal covers need be considered.

216 Sequential lifting — A cover inequality can often be strengthened by lifting it into a higher dimensional space, that is, by adding variables. Sequential lifting adds one variable at a time. Sequence-independent lifting adds several variables at once.

217 Sequential lifting — To lift a cover inequality Σ_{j∈J} x_j ≤ |J| − 1, add a term to the left-hand side:

Σ_{j∈J} x_j + π_k x_k ≤ |J| − 1

where π_k is the largest coefficient for which the inequality is still valid. So

π_k = |J| − 1 − max { Σ_{j∈J} x_j : Σ_{j∈J} a_j x_j ≤ a_0 − a_k, x_j ∈ {0,1} for j ∈ J }

This can be done repeatedly (by dynamic programming).

218 Example — Given 6x_1 + 5x_2 + 5x_3 + 5x_4 + 8x_5 + 3x_6 ≤ 17, to lift x_1 + x_2 + x_3 + x_4 ≤ 3, add the term π_5 x_5, where

π_5 = 3 − max { x_1 + x_2 + x_3 + x_4 : 6x_1 + 5x_2 + 5x_3 + 5x_4 ≤ 17 − 8, x_j ∈ {0,1} } = 3 − 1 = 2

This yields x_1 + x_2 + x_3 + x_4 + 2x_5 ≤ 3. Further lifting (adding x_6) leaves the cut unchanged. But if the variables are added in the order x_6, x_5, the result is different: x_1 + x_2 + x_3 + x_4 + x_5 + x_6 ≤ 3.
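For a small instance the lifting coefficient can be computed by brute-force enumeration rather than dynamic programming; a sketch for the example above:

```python
from itertools import product

a = [6, 5, 5, 5, 8, 3]   # knapsack coefficients, a·x <= 17
a0 = 17

def max_cover_lhs(fixed):
    """Max of x1+x2+x3+x4 over 0-1 x with a·x <= a0 and some coords fixed.
    `fixed` maps a 0-based index to its forced value."""
    best = 0
    for x in product((0, 1), repeat=6):
        if all(x[i] == v for i, v in fixed.items()):
            if sum(ai * xi for ai, xi in zip(a, x)) <= a0:
                best = max(best, x[0] + x[1] + x[2] + x[3])
    return best

pi5 = 3 - max_cover_lhs({4: 1})   # force x5 = 1 (index 4)
print(pi5)                        # 2 → lifted cut x1+x2+x3+x4+2*x5 <= 3
```

The enumeration confirms π_5 = 2: with x_5 = 1 only 9 units of capacity remain, so at most one of the cover items fits.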

219 Sequence-independent lifting — Sequence-independent lifting usually yields a weaker cut than sequential lifting, but it adds all the variables at once and is much faster. It is commonly used in commercial MILP solvers.

220 Sequence-independent lifting — To lift a cover inequality Σ_{j∈J} x_j ≤ |J| − 1, add terms to the left-hand side:

Σ_{j∈J} x_j + Σ_{j∉J} ρ(a_j) x_j ≤ |J| − 1

where, with J = {1,…,p}, λ = Σ_{j∈J} a_j − a_0, A_j = Σ_{k=1}^{j} a_k, and A_0 = 0:

ρ(u) = j                           if A_j ≤ u ≤ A_{j+1} − λ and j ∈ {0,…,p−1}
ρ(u) = j + (u − A_{j+1} + λ)/λ     if A_{j+1} − λ < u < A_{j+1} and j ∈ {0,…,p−1}
ρ(u) = p + (u − A_p)/λ             if A_p ≤ u

221 Example — Given 6x_1 + 5x_2 + 5x_3 + 5x_4 + 8x_5 + 3x_6 ≤ 17, to lift x_1 + x_2 + x_3 + x_4 ≤ 3, add the terms ρ(8) x_5 + ρ(3) x_6, where ρ(u) is the lifting function of the previous slide. This yields the lifted cut

x_1 + x_2 + x_3 + x_4 + (5/4) x_5 + (1/4) x_6 ≤ 3

222 Gomory Cuts — When an integer programming problem has a nonintegral solution, we can generate at least one Gomory cut to cut off that solution. This is a special case of a separating cut, because it separates the current solution of the relaxation from the feasible set. Gomory cuts are widely used and very effective in MILP solvers. [Figure: a separating cut between the solution of the continuous relaxation and the feasible solutions.]

223 Gomory cuts — Given an integer programming problem

min cx
s.t. Ax = b
     x ≥ 0 and integral

let (x_B, 0) be an optimal solution of the continuous relaxation, where

x_B = b̂ − N̂ x_N,  b̂ = B⁻¹b,  N̂ = B⁻¹N

Then if x_i is nonintegral in this solution, the following Gomory cut is violated by (x_B, 0):

x_i + ⌊N̂_i⌋ x_N ≤ ⌊b̂_i⌋

224 Example —

min 2x_1 + 3x_2
s.t. x_1 + 3x_2 ≥ 3
     4x_1 + 3x_2 ≥ 6
     x_1, x_2 ≥ 0 and integral

or, in equality form,

min 2x_1 + 3x_2
s.t. x_1 + 3x_2 − x_3 = 3
     4x_1 + 3x_2 − x_4 = 6
     x_j ≥ 0 and integral

The optimal solution of the continuous relaxation has x_B = (x_1, x_2) = (1, 2/3), with

N̂ = [1/3 −1/3; −4/9 1/9],  b̂ = (1, 2/3)

225 Example — Since b̂_2 = 2/3 is fractional, the Gomory cut x_i + ⌊N̂_i⌋ x_N ≤ ⌊b̂_i⌋ for row i = 2 is

x_2 + ⌊−4/9⌋ x_3 + ⌊1/9⌋ x_4 ≤ ⌊2/3⌋,  i.e.  x_2 − x_3 ≤ 0

In (x_1, x_2) space (using x_3 = x_1 + 3x_2 − 3) this is x_1 + 2x_2 ≥ 3.
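The tableau quantities and the cut coefficients above can be reproduced with a few lines of numpy:

```python
import numpy as np

# min 2x1 + 3x2  s.t.  x1 + 3x2 − x3 = 3,  4x1 + 3x2 − x4 = 6,  x >= 0 integral
A = np.array([[1., 3., -1., 0.], [4., 3., 0., -1.]])
b = np.array([3., 6.])
B = A[:, [0, 1]]                       # x1, x2 basic at the LP optimum
Nhat = np.linalg.solve(B, A[:, [2, 3]])   # B^{-1} N
bhat = np.linalg.solve(B, b)              # ≈ [1, 2/3]: x2 is fractional
# Gomory cut from row 2:  x2 + floor(Nhat_2)·x_N <= floor(bhat_2)
print(np.floor(Nhat[1]))   # coefficients [-1, 0] → cut x2 − x3 <= 0
print(np.floor(bhat[1]))   # right-hand side 0
```

Substituting x_3 = x_1 + 3x_2 − 3 recovers the cut x_1 + 2x_2 ≥ 3 in the original variables.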

226 Example — [Figure: the feasible region of the continuous relaxation, with b̂_2 = 2/3 fractional; the Gomory cut x_1 + 2x_2 ≥ 3; and a second Gomory cut obtained after re-solving the LP with the previous cut added.]

227 Mixed Integer Rounding Cuts — Mixed integer rounding (MIR) cuts can be generated for solutions of any relaxed MILP in which one or more integer variables has a fractional value. Like Gomory cuts, they are separating cuts. MIR cuts are widely used in commercial solvers.

228 MIR cuts Given an MILP problem min cx + dy subject to Ax + Dy = b, x, y ≥ 0 and y integral, in an optimal solution of the continuous relaxation let J = { j : y_j is nonbasic }, K = { j : x_j is nonbasic }, and let N̂ be formed from the nonbasic columns of [A D]. Then if y_i is nonintegral in this solution, the following MIR cut is violated by the solution of the relaxation: y_i + Σ_{j∈J1} ⌈N̂_ij⌉ y_j + Σ_{j∈J2} ( ⌊N̂_ij⌋ + frac(N̂_ij)/frac(b̂_i) ) y_j + (1/frac(b̂_i)) Σ_{j∈K} N̂_ij⁺ x_j ≥ ⌈b̂_i⌉, where J1 = { j ∈ J : frac(N̂_ij) ≥ frac(b̂_i) }, J2 = J \ J1, N̂_ij⁺ = max(N̂_ij, 0), and frac(·) denotes the fractional part. May 2009 Slide 228

229 Example 3x1 + 4x2 − 6y1 − 4y2 = 1, x1 + 2x2 − y1 − y2 = 3, x, y ≥ 0, y_j integer. Take the basic solution (x1, y1) = (17/3, 8/3), for which (over the nonbasic columns x2, y2) N̂ = [ 8/3 −2/3 ; 2/3 1/3 ] and b̂ = (17/3, 8/3). For i = y1 we have J = {2}, K = {2}, and since frac(N̂_{i,y2}) = 1/3 < frac(b̂_i) = 2/3, J1 = ∅ and J2 = {2}. The MIR cut is y1 + ( ⌊1/3⌋ + (1/3)/(2/3) ) y2 + ( (2/3)/(2/3) ) x2 ≥ ⌈8/3⌉, or y1 + (1/2) y2 + x2 ≥ 3. May 2009 Slide 229
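The MIR coefficients can be computed the same way. The following sketch (ours; the coefficient rules are as stated above, and the function name is our own) reproduces the cut y1 + (1/2)y2 + x2 ≥ 3:

```python
from fractions import Fraction
from math import floor, ceil

def frac(a):
    """Fractional part of a rational number."""
    return a - floor(a)

def mir_cut(int_row, cont_row, rhs):
    """MIR coefficients for a row  y_i + sum_J N[j]*y_j + sum_K N[j]*x_j = rhs
    with fractional rhs.  Integer nonbasics get ceil(N[j]) when
    frac(N[j]) >= frac(rhs), else floor(N[j]) + frac(N[j])/frac(rhs);
    continuous nonbasics get max(N[j], 0)/frac(rhs); the RHS is ceil(rhs)."""
    f0 = frac(rhs)
    y_coefs = [ceil(c) if frac(c) >= f0 else floor(c) + frac(c) / f0
               for c in int_row]
    x_coefs = [max(c, 0) / f0 for c in cont_row]
    return y_coefs, x_coefs, ceil(rhs)

# Row for basic variable y1 in the example:  y1 + (1/3)*y2 + (2/3)*x2 = 8/3
yc, xc, r = mir_cut([Fraction(1, 3)], [Fraction(2, 3)], Fraction(8, 3))
print(yc, xc, r)   # the cut  y1 + (1/2)*y2 + x2 >= 3
```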

230 Example: Product Configuration This example illustrates: Combination of propagation and relaxation. Processing of variable indices. Continuous relaxation of element constraint. May 2009 Slide 230

231 The problem Choose what type of each component to install in a personal computer, and how many of each. [Figure: a personal computer built from several memory units, disk drives, and power supplies] May 2009 Slide 231

232 Model of the problem min Σ_j c_j v_j subject to v_j = Σ_i q_i A_{ij t_i} for all j, and L_j ≤ v_j ≤ U_j for all j. Here c_j is the unit cost of producing attribute j; v_j is the amount of attribute j produced (< 0 if consumed): memory, heat, power, weight, etc.; A_{ijt} is the amount of attribute j produced by type t of component i; t_i is a variable index; and q_i is the quantity of component i installed. May 2009 Slide 232

233 To solve it: Branch on domains of t i and q i. Propagate element constraints and bounds on v j. Variable index is converted to specially structured element constraint. Valid knapsack cuts are derived and propagated. Use linear continuous relaxations. Special purpose MILP relaxation for element. May 2009 Slide 233

234 Propagation min Σ_j c_j v_j subject to v_j = Σ_i q_i A_{ij t_i} for all j, L_j ≤ v_j ≤ U_j for all j. The bounds L_j ≤ v_j ≤ U_j are propagated in the usual way. May 2009 Slide 234

235 Propagation The constraint v_j = Σ_i q_i A_{ij t_i} is rewritten as v_j = Σ_i z_ij with element(t_i, (q_i A_{ij1}, …, q_i A_{ijn}), z_ij) for all i, j, while the bounds L_j ≤ v_j ≤ U_j are propagated in the usual way. May 2009 Slide 235

236 Propagation v_j = Σ_i z_ij, element(t_i, (q_i A_{ij1}, …, q_i A_{ijn}), z_ij), all i, j. This can be propagated by (a) using specialized filters for element constraints of this form. May 2009 Slide 236

237 Propagation v_j = Σ_i z_ij, element(t_i, (q_i A_{ij1}, …, q_i A_{ijn}), z_ij), all i, j. This is propagated by (a) using specialized filters for element constraints of this form, (b) adding knapsack cuts for the valid inequalities Σ_i max_{k∈D_{t_i}} {A_ijk} q_i ≥ v̲_j and Σ_i min_{k∈D_{t_i}} {A_ijk} q_i ≤ v̄_j for all j, where [v̲_j, v̄_j] is the current domain of v_j, and (c) propagating the knapsack cuts. May 2009 Slide 237

238 Relaxation min Σ_j c_j v_j subject to v_j = Σ_i q_i A_{ij t_i} for all j, L_j ≤ v_j ≤ U_j for all j. The bounds are relaxed as v̲_j ≤ v_j ≤ v̄_j, using the current domain of v_j. May 2009 Slide 238

239 Relaxation The constraint v_j = Σ_i q_i A_{ij t_i}, rewritten as v_j = Σ_i z_ij with element constraints, is relaxed by relaxing each element constraint and adding the knapsack cuts; the bounds are relaxed as v̲_j ≤ v_j ≤ v̄_j. May 2009 Slide 239

240 Relaxation v_j = Σ_i z_ij, element(t_i, (q_i A_{ij1}, …, q_i A_{ijn}), z_ij), all i, j. This is relaxed by replacing each element constraint with a disjunctive convex hull relaxation: z_ij = Σ_{k∈D_{t_i}} A_{ijk} q_ik, q_i = Σ_{k∈D_{t_i}} q_ik, with all q_ik ≥ 0. May 2009 Slide 240

241 Relaxation So the following LP relaxation is solved at each node of the search tree to obtain a lower bound: min Σ_j c_j v_j subject to v_j = Σ_i Σ_{k∈D_{t_i}} A_{ijk} q_ik for all j; q_i = Σ_{k∈D_{t_i}} q_ik for all i; v̲_j ≤ v_j ≤ v̄_j for all j; q̲_i ≤ q_i ≤ q̄_i for all i; knapsack cuts for Σ_i max_{k∈D_{t_i}} {A_ijk} q_i ≥ v̲_j and Σ_i min_{k∈D_{t_i}} {A_ijk} q_i ≤ v̄_j for all j; q_ik ≥ 0 for all i, k. May 2009 Slide 241

242 Computational Results [Chart: solution time in seconds for CPLEX, CLP, and the hybrid method on product configuration instances of sizes 16x20, 20x24, and 20x30] May 2009 Slide 242

243 Lagrangean Relaxation Lagrangean Duality Properties of the Lagrangean Dual Example: Fast Linear Programming Domain Filtering Example: Continuous Global Optimization May 2009 Slide 243

244 Motivation Lagrangean relaxation can provide better bounds than LP relaxation. The Lagrangean dual generalizes LP duality. It provides domain filtering analogous to that based on LP duality. - This is a key technique in continuous global optimization. Lagrangean relaxation gets rid of troublesome constraints by dualizing them. - That is, moving them into the objective function. - The Lagrangean relaxation may decouple. May 2009 Slide 244

245 Lagrangean Duality Consider an inequality-constrained problem min f(x) subject to g(x) ≥ 0 (the hard constraints) and x ∈ S (the easy constraints). The object is to get rid of (dualize) the hard constraints by moving them into the objective function. May 2009 Slide 245

246 Lagrangean Duality The problem min f(x) subject to g(x) ≥ 0, x ∈ S is related to an inference problem: max v subject to the requirement that g(x) ≥ 0 implies f(x) ≥ v for x ∈ S. This is the Lagrangean dual problem: find the tightest lower bound on the objective function that is implied by the constraints. May 2009 Slide 246

247 Primal min f(x) subject to g(x) ≥ 0, x ∈ S. Dual max v subject to: g(x) ≥ 0 implies f(x) ≥ v for x ∈ S. Let us say that the surrogate λg(x) ≥ 0, for some λ ≥ 0, dominates f(x) ≥ v on S iff λg(x) ≤ f(x) − v for all x ∈ S; that is, v ≤ f(x) − λg(x) for all x ∈ S. May 2009 Slide 247

248 Primal min f(x) subject to g(x) ≥ 0, x ∈ S. Dual max v subject to: g(x) ≥ 0 implies f(x) ≥ v for x ∈ S. The surrogate λg(x) ≥ 0 (λ ≥ 0) dominates f(x) ≥ v iff λg(x) ≤ f(x) − v for all x ∈ S; that is, v ≤ f(x) − λg(x) for all x ∈ S, or v ≤ min_{x∈S} { f(x) − λg(x) }. May 2009 Slide 248

249 Primal min f(x) subject to g(x) ≥ 0, x ∈ S. Since λg(x) ≥ 0 (for some λ ≥ 0) dominates f(x) ≥ v iff v ≤ min_{x∈S} { f(x) − λg(x) }, the dual becomes: max v subject to v ≤ min_{x∈S} { f(x) − λg(x) } for some λ ≥ 0. May 2009 Slide 249

250 Now we have Primal min f(x) subject to g(x) ≥ 0 (these constraints are dualized), x ∈ S. Dual max v subject to v ≤ min_{x∈S} { f(x) − λg(x) } for some λ ≥ 0, or max_{λ≥0} θ(λ) where θ(λ) = min_{x∈S} { f(x) − λg(x) } is the Lagrangean relaxation and λ is the vector of Lagrange multipliers. The Lagrangean dual can be viewed as the problem of finding the Lagrangean relaxation that gives the tightest bound. May 2009 Slide 250

251 Example min 3x1 + 4x2 subject to −x1 + 3x2 ≥ 0, 2x1 + x2 − 5 ≥ 0, x1, x2 ∈ {0,1,2,3}. The Lagrangean relaxation is θ(λ1, λ2) = min_{xj∈{0,…,3}} { 3x1 + 4x2 − λ1(−x1 + 3x2) − λ2(2x1 + x2 − 5) } = min_{xj∈{0,…,3}} { (3 + λ1 − 2λ2)x1 + (4 − 3λ1 − λ2)x2 + 5λ2 }. The Lagrangean relaxation is easy to solve for any given λ1, λ2: x1 = 0 if 3 + λ1 − 2λ2 ≥ 0 and x1 = 3 otherwise; x2 = 0 if 4 − 3λ1 − λ2 ≥ 0 and x2 = 3 otherwise. The optimal solution of the original problem is (2,1). May 2009 Slide 251
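Since S has only 16 points, θ can be evaluated by brute force. The following check (our own script, not from the slides) confirms the dual value 9 2/7 reported on the next slide:

```python
from fractions import Fraction as F

def theta(l1, l2):
    """theta(l1, l2) = min over x1, x2 in {0,...,3} of
       3*x1 + 4*x2 - l1*(-x1 + 3*x2) - l2*(2*x1 + x2 - 5)."""
    return min(3*x1 + 4*x2 - l1*(-x1 + 3*x2) - l2*(2*x1 + x2 - 5)
               for x1 in range(4) for x2 in range(4))

print(theta(F(5, 7), F(13, 7)))   # 65/7, i.e. 9 2/7
# Weak duality: every theta value is a lower bound on the optimal value 10.
assert all(theta(F(a, 3), F(b, 3)) <= 10 for a in range(10) for b in range(10))
```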

252 Example θ(λ1, λ2) is piecewise linear and concave. [Figure: contours of θ in the (λ1, λ2) plane, with levels 0, 5, 7.5 and maximum 9 2/7] Solution of the Lagrangean dual: (λ1, λ2) = (5/7, 13/7), θ(λ) = 9 2/7. The optimal solution of the original problem is (2,1) with value 10. Note the duality gap between 10 and 9 2/7 (no strong duality). May 2009 Slide 252

253 Example Note: in this example, the Lagrangean dual provides the same bound (9 2/7) as the continuous relaxation of the IP. This is because the Lagrangean relaxation can be solved as an LP: the minimum of the linear function (3 + λ1 − 2λ2)x1 + (4 − 3λ1 − λ2)x2 + 5λ2 over xj ∈ {0,…,3} is the same as its minimum over xj ∈ [0,3]. Lagrangean duality is useful when the Lagrangean relaxation is tighter than an LP but nonetheless easy to solve. May 2009 Slide 253

254 Properties of the Lagrangean dual Weak duality: for any feasible x* and any λ* ≥ 0, f(x*) ≥ θ(λ*). In particular, min { f(x) : g(x) ≥ 0, x ∈ S } ≥ max { θ(λ) : λ ≥ 0 }. Concavity: θ(λ) is concave. It can therefore be maximized by local search methods. Complementary slackness: if x* and λ* are optimal, and there is no duality gap, then λ*g(x*) = 0. May 2009 Slide 254

255 Solving the Lagrangean dual Let λ^k be the kth iterate, and let λ^{k+1} = λ^k + α_k ξ^k, where ξ^k is a subgradient of θ(λ) at λ = λ^k. If x^k solves the Lagrangean relaxation for λ = λ^k, then ξ^k = −g(x^k). This is because θ(λ) = f(x^k) − λg(x^k) at λ = λ^k. The stepsize α_k must be adjusted so that the sequence converges, but not before reaching a maximum. May 2009 Slide 255
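Applied to the example of slide 251, the subgradient iteration can be sketched as follows (a toy implementation of ours, with step size 1/k; signs follow the convention θ(λ) = min { f(x) − λg(x) } for constraints g(x) ≥ 0):

```python
from fractions import Fraction as F

def solve_relaxation(l1, l2):
    """Minimize (3+l1-2*l2)*x1 + (4-3*l1-l2)*x2 + 5*l2 over x in {0,...,3}^2."""
    c1, c2 = 3 + l1 - 2*l2, 4 - 3*l1 - l2
    x = (0 if c1 >= 0 else 3, 0 if c2 >= 0 else 3)
    return x, c1*x[0] + c2*x[1] + 5*l2

l1 = l2 = F(0)
best = F(0)
for k in range(1, 201):
    (x1, x2), val = solve_relaxation(l1, l2)
    best = max(best, val)                 # best Lagrangean bound so far
    g1, g2 = -x1 + 3*x2, 2*x1 + x2 - 5    # constraint functions, g(x) >= 0
    step = F(1, k)                        # diminishing step size
    l1 = max(F(0), l1 - step*g1)          # move along subgradient -g(x^k),
    l2 = max(F(0), l2 - step*g2)          # projected onto lambda >= 0
print(best)   # a lower bound approaching the dual optimum 65/7 from below
```

Each iterate supplies a valid lower bound, so the best bound found never exceeds 65/7 (weak duality) and improves monotonically.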

256 Example: Fast Linear Programming In CP contexts, it is best to process each node of the search tree very rapidly. Lagrangean relaxation may allow very fast calculation of a lower bound on the optimal value of the LP relaxation at each node. The idea is to solve the Lagrangean dual at the root node (which is an LP) and use the same Lagrange multipliers to get an LP bound at other nodes. May 2009 Slide 256

257 At the root node, solve min cx subject to Ax ≥ b (dualized, with multipliers λ) and Dx ≥ d, x ≥ 0 (special structure, e.g. variable bounds). The (partial) LP dual solution λ* solves the Lagrangean dual in which θ(λ) = min { cx − λ(Ax − b) : Dx ≥ d, x ≥ 0 }. May 2009 Slide 257

258 At the root node, solve min cx subject to Ax ≥ b (dualized, with multipliers λ) and Dx ≥ d, x ≥ 0 (special structure). The (partial) LP dual solution λ* solves the Lagrangean dual in which θ(λ) = min { cx − λ(Ax − b) : Dx ≥ d, x ≥ 0 }. At another node, the LP is min cx subject to Ax ≥ b (λ), Dx ≥ d, Hx ≥ h (branching constraints, etc.), x ≥ 0. Here θ(λ*) is still a lower bound on the optimal value of the LP and can be quickly calculated by solving a specially structured LP. May 2009 Slide 258

259 Domain Filtering Suppose min f(x) subject to g(x) ≥ 0, x ∈ S has optimal solution x*, optimal value v*, and optimal Lagrangean dual solution λ*; and λ_i* > 0, which means the i-th constraint is tight (complementary slackness); and the problem is a relaxation of a CP problem; and we have a feasible solution of the CP problem with value U, so that U is an upper bound on the optimal value. May 2009 Slide 259

260 Supposing min f(x) subject to g(x) ≥ 0, x ∈ S has optimal solution x*, optimal value v*, and optimal Lagrangean dual solution λ*: if x were to change to a value other than x*, the LHS of the i-th constraint g_i(x) ≥ 0 would change by some amount Δ_i. Since the constraint is tight, this would increase the optimal value as much as changing the constraint to g_i(x) − Δ_i ≥ 0 would. So it would increase the optimal value at least λ_i* Δ_i. (It is easily shown that Lagrange multipliers are marginal costs. Dual multipliers for LP are a special case of Lagrange multipliers.) May 2009 Slide 260

261 Supposing min f(x) subject to g(x) ≥ 0, x ∈ S has optimal solution x*, optimal value v*, and optimal Lagrangean dual solution λ*: we have found that a change in x that changes g_i(x) by Δ_i increases the optimal value at least λ_i* Δ_i. Since the optimal value of this problem ≤ the optimal value of the CP ≤ U, we have λ_i* Δ_i ≤ U − v*, or Δ_i ≤ (U − v*)/λ_i*. May 2009 Slide 261

262 Supposing min f(x) subject to g(x) ≥ 0, x ∈ S has optimal solution x*, optimal value v*, and optimal Lagrangean dual solution λ*: we have found that a change in x that changes g_i(x) by Δ_i increases the optimal value at least λ_i* Δ_i. Since the optimal value of this problem ≤ the optimal value of the CP ≤ U, we have λ_i* Δ_i ≤ U − v*, or Δ_i ≤ (U − v*)/λ_i*. Since Δ_i = g_i(x) − g_i(x*) = g_i(x), this implies the inequality g_i(x) ≤ (U − v*)/λ_i*, which can be propagated. May 2009 Slide 262
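To make the filter concrete, take the integer programming example of slide 251, whose Lagrangean dual value is 65/7 with multiplier λ2* = 13/7, and suppose the feasible point (2,1) supplies the incumbent value U = 10 (a worked illustration of ours, not from the slides):

```python
from fractions import Fraction as F

U, v_star, lam2 = 10, F(65, 7), F(13, 7)   # incumbent, dual bound, multiplier
bound = (U - v_star) / lam2
assert bound == F(5, 13)                    # so 2*x1 + x2 - 5 <= 5/13

# Propagate g2(x) = 2*x1 + x2 - 5 <= 5/13 on domains x1, x2 in {0,1,2,3},
# minimizing out the other variable (its smallest value is 0):
dom1 = [x1 for x1 in range(4) if 2*x1 + 0 - 5 <= bound]
dom2 = [x2 for x2 in range(4) if 2*0 + x2 - 5 <= bound]
print(dom1, dom2)   # [0, 1, 2] [0, 1, 2, 3] -- the value x1 = 3 is filtered out
```

The filter is valid because f(x) ≥ θ(λ*) + λ*g(x) for every x ∈ S, so any solution with value at most U must satisfy the derived bound on g2.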

263 Example: Continuous Global Optimization Some of the best continuous global solvers (e.g., BARON) combine OR-style relaxation with CP-style interval arithmetic and domain filtering. These methods can be combined with domain filtering based on Lagrange multipliers. May 2009 Slide 263

264 Continuous Global Optimization max x1 + x2 subject to 4x1x2 = 1, 2x1 + x2 ≤ 2, x1 ∈ [0,1], x2 ∈ [0,2]. [Figure: the feasible set in (x1, x2) space, showing the global optimum and a local optimum] May 2009 Slide 264

265 To solve it: Search: split interval domains of x 1, x 2. Each node of search tree is a problem restriction. Propagation: Interval propagation, domain filtering. Use Lagrange multipliers to infer valid inequality for propagation. Reduced-cost variable fixing is a special case. Relaxation: Use function factorization to obtain linear continuous relaxation. May 2009 Slide 265

266 Interval propagation Propagate the intervals [0,1], [0,2] through the constraints to obtain [1/8, 7/8], [1/4, 7/4]. May 2009 Slide 266
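This propagation pass can be reproduced with exact rational interval arithmetic (a simplified sketch of ours; a real solver would iterate to a fixed point, which tightens these intervals further, and would guard against division by zero):

```python
from fractions import Fraction as F

lo1, hi1 = F(0), F(1)      # domain of x1
lo2, hi2 = F(0), F(2)      # domain of x2

# Propagate 4*x1*x2 = 1, i.e. x1 = 1/(4*x2) and x2 = 1/(4*x1):
lo1 = max(lo1, 1 / (4 * hi2))    # x2 <= 2  implies  x1 >= 1/8
lo2 = max(lo2, 1 / (4 * hi1))    # x1 <= 1  implies  x2 >= 1/4

# Propagate 2*x1 + x2 <= 2:
hi1 = min(hi1, (2 - lo2) / 2)    # x2 >= 1/4  implies  x1 <= 7/8
hi2 = min(hi2, 2 - 2 * lo1)      # x1 >= 1/8  implies  x2 <= 7/4

print([lo1, hi1], [lo2, hi2])    # x1 in [1/8, 7/8], x2 in [1/4, 7/4]
```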

267 Relaxation (function factorization) Factor complex functions into elementary functions that have known linear relaxations. Write 4x1x2 = 1 as 4y = 1 where y = x1x2. This factors 4x1x2 into the linear function 4y and the bilinear function x1x2. The linear function 4y is its own linear relaxation. May 2009 Slide 267

268 Relaxation (function factorization) Factor complex functions into elementary functions that have known linear relaxations. Write 4x1x2 = 1 as 4y = 1 where y = x1x2. The linear function 4y is its own linear relaxation. The bilinear function y = x1x2, where x_j has domain [L_j, U_j], has the relaxation: L1x2 + L2x1 − L1L2 ≤ y, U1x2 + U2x1 − U1U2 ≤ y, y ≤ U1x2 + L2x1 − U1L2, y ≤ L1x2 + U2x1 − L1U2. May 2009 Slide 268
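A quick numerical check (ours) that these four inequalities, the McCormick envelope, really do sandwich y = x1·x2 over the box [0,1] × [0,2]:

```python
def mccormick_bounds(x1, x2, L1, U1, L2, U2):
    """Lower and upper McCormick envelopes of y = x1*x2 on [L1,U1] x [L2,U2]."""
    lo = max(L1*x2 + L2*x1 - L1*L2, U1*x2 + U2*x1 - U1*U2)
    hi = min(U1*x2 + L2*x1 - U1*L2, L1*x2 + U2*x1 - L1*U2)
    return lo, hi

# Validity check on a grid over the box [0,1] x [0,2]:
for i in range(11):
    for j in range(11):
        x1, x2 = i / 10, 2 * j / 10
        lo, hi = mccormick_bounds(x1, x2, 0.0, 1.0, 0.0, 2.0)
        assert lo <= x1 * x2 + 1e-12 and x1 * x2 <= hi + 1e-12
print("McCormick envelope sandwiches x1*x2 on the grid")
```

The envelope is exact at the corners of the box, e.g. at (x1, x2) = (1, 2) both bounds equal 2.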

269 Relaxation (function factorization) The linear relaxation becomes: max x1 + x2 subject to 4y = 1, 2x1 + x2 ≤ 2, the four McCormick inequalities relating y, x1, x2, and L_j ≤ x_j ≤ U_j, j = 1, 2. May 2009 Slide 269

270 Relaxation (function factorization) Solve the linear relaxation. May 2009 Slide 270

271 Relaxation (function factorization) Solve the linear relaxation. Since its solution is infeasible, split an interval and branch: x2 ∈ [1, 1.75] or x2 ∈ [0.25, 1]. May 2009 Slide 271

[Figure: the two branches x2 ∈ [1, 1.75] and x2 ∈ [0.25, 1] in (x1, x2) space] May 2009 Slide 272

In the branch x2 ∈ [0.25, 1], the solution of the relaxation is feasible, with value 1.25. This becomes the incumbent solution. May 2009 Slide 273

In the branch x2 ∈ [0.25, 1], the relaxation solution is feasible with value 1.25 and becomes the incumbent. In the branch x2 ∈ [1, 1.75], the solution of the relaxation is not quite feasible. Here we also use Lagrange multipliers for domain filtering. May 2009 Slide 274

275 Relaxation (function factorization) max x1 + x2 subject to 4y = 1, 2x1 + x2 ≤ 2, the McCormick inequalities, and L_j ≤ x_j ≤ U_j, j = 1, 2. The Lagrange multiplier associated with 2x1 + x2 ≤ 2 in the solution of the relaxation is λ2 = 1.1. May 2009 Slide 275

276 Relaxation (function factorization) The Lagrange multiplier associated with 2x1 + x2 ≤ 2 in the solution of the relaxation is λ2 = 1.1. This yields a valid inequality for propagation: 2x1 + x2 ≥ 2 − (value of relaxation − value of incumbent solution)/1.1. May 2009 Slide 276

277 Dynamic Programming in CP Example: Capital Budgeting Domain Filtering Recursive Optimization May 2009 Slide 277

278 Motivation Dynamic programming (DP) is a highly versatile technique that can exploit recursive structure in a problem. Domain filtering is straightforward for problems modeled as a DP. DP is also important in designing filters for some global constraints, such as the stretch constraint (employee scheduling). Nonserial DP is related to bucket elimination in CP and exploits the structure of the primal graph. DP modeling is the art of keeping the state space small while maintaining a Markovian property. We will examine only one simple example of serial DP. May 2009 Slide 278

279 Example: Capital Budgeting We wish to build power plants with a total cost of at most 12 million Euros. There are three types of plants, costing 4, 2 or 3 million Euros each. We must build one or two of each type. The problem has a simple knapsack packing model: 4x1 + 2x2 + 3x3 ≤ 12, x_j ∈ {1, 2}, where x_j is the number of plants of type j. May 2009 Slide 279

280 Example: Capital Budgeting 4x1 + 2x2 + 3x3 ≤ 12, x_j ∈ {1, 2}. In general the recursion for ax ≤ b is f_k(s_k) = max_{x_k ∈ D_{x_k}} f_{k+1}(s_k + a_k x_k), where the state s_k is the sum of the first k − 1 terms of ax, and f_k(s_k) = 1 if there is a path from state s_k to a feasible solution, 0 otherwise. For example, f_3(8) = max { f_4(8 + 3·1), f_4(8 + 3·2) } = max { f_4(11), f_4(14) } = max {1, 0} = 1. May 2009 Slide 280

281 Example: Capital Budgeting 4x1 + 2x2 + 3x3 ≤ 12, x_j ∈ {1, 2}. In general the recursion for ax ≤ b is f_k(s_k) = max_{x_k ∈ D_{x_k}} f_{k+1}(s_k + a_k x_k), with boundary condition f_{n+1}(s_{n+1}) = 1 if s_{n+1} ≤ b, 0 otherwise. [Figure: the state transition graph, labeled with f_k(s_k) for each state s_k] May 2009 Slide 281

282 Example: Capital Budgeting 4x1 + 2x2 + 3x3 ≤ 12, x_j ∈ {1, 2}. The problem is feasible. Each path to a state with f = 1 is a feasible solution: Path 1: x = (1,2,1); Path 2: x = (1,1,2); Path 3: x = (1,1,1). The possible costs are 9, 11, 12. May 2009 Slide 282
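The feasibility recursion and the path enumeration can be reproduced with a few lines (our own sketch):

```python
from functools import lru_cache

a, b = [4, 2, 3], 12              # 4*x1 + 2*x2 + 3*x3 <= 12, each xj in {1, 2}

@lru_cache(maxsize=None)
def f(k, s):
    """1 if state s at stage k can be completed to a feasible solution."""
    if k == len(a):
        return 1 if s <= b else 0
    return max(f(k + 1, s + a[k] * x) for x in (1, 2))

print(f(0, 0))                    # 1 -- the problem is feasible

# The feasible paths and their costs:
sols = [(x1, x2, x3)
        for x1 in (1, 2) for x2 in (1, 2) for x3 in (1, 2)
        if 4*x1 + 2*x2 + 3*x3 <= b]
print(sols)                                                # [(1, 1, 1), (1, 1, 2), (1, 2, 1)]
print(sorted(4*x1 + 2*x2 + 3*x3 for x1, x2, x3 in sols))   # [9, 11, 12]
```

Note that x1 = 1 on every feasible path, which is exactly the domain filtering D_x1 = {1} obtained on the next slide.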

283 Domain Filtering 4x1 + 2x2 + 3x3 ≤ 12, x_j ∈ {1, 2}. To filter domains, observe which values of x_k occur on feasible paths: D_{x1} = {1}, D_{x2} = {1, 2}, D_{x3} = {1, 2}. May 2009 Slide 283

284 Recursive Optimization max 15x1 + 10x2 + 12x3 (maximize revenue) subject to 4x1 + 2x2 + 3x3 ≤ 12, x_j ∈ {1, 2}. The recursion includes arc values: f_k(s_k) = max_{x_k ∈ D_{x_k}} { c_k x_k + f_{k+1}(s_k + a_k x_k) }, where f_k(s_k) is the value on a maximum-value path from s_k to the final stage (the value to go). For example, f_3(8) = max { 12·1 + f_4(8 + 3·1), 12·2 + f_4(8 + 3·2) } = max { 12 + f_4(11), 24 + f_4(14) } = max { 12 + 0, 24 − ∞ } = 12. May 2009 Slide 284

285 Recursive optimization max 15x1 + 10x2 + 12x3 subject to 4x1 + 2x2 + 3x3 ≤ 12, x_j ∈ {1, 2}. The recursion includes arc values: f_k(s_k) = max_{x_k ∈ D_{x_k}} { c_k x_k + f_{k+1}(s_k + a_k x_k) }, with boundary condition f_{n+1}(s_{n+1}) = 0 if s_{n+1} ≤ b, −∞ otherwise. [Figure: the state transition graph, labeled with f_k(s_k) for each state s_k] May 2009 Slide 285

286 Recursive optimization max 15x1 + 10x2 + 12x3 subject to 4x1 + 2x2 + 3x3 ≤ 12, x_j ∈ {1, 2}. The maximum revenue is 49. The optimal path is easy to retrace: (x1, x2, x3) = (1, 1, 2). May 2009 Slide 286
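The value recursion and the retracing step can be sketched directly in code (our own illustration; infeasible terminal states get value −∞):

```python
from functools import lru_cache

c, a, b = [15, 10, 12], [4, 2, 3], 12   # max 15x1+10x2+12x3, 4x1+2x2+3x3 <= 12

NEG_INF = float("-inf")

@lru_cache(maxsize=None)
def f(k, s):
    """Max value-to-go from stage k with partial sum s (value recursion)."""
    if k == len(a):
        return 0 if s <= b else NEG_INF
    return max(c[k] * x + f(k + 1, s + a[k] * x) for x in (1, 2))

print(f(0, 0))   # 49

# Retrace the optimal path greedily using the value-to-go table:
path, s = [], 0
for k in range(len(a)):
    x = max((1, 2), key=lambda x: c[k] * x + f(k + 1, s + a[k] * x))
    path.append(x)
    s += a[k] * x
print(path)      # [1, 1, 2]
```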

287 CP-based Branch and Price Basic Idea Example: Airline Crew Scheduling May 2009 Slide 287

288 Motivation Branch and price allows solution of integer programming problems with a huge number of variables. The problem is solved by a branch-and-relax method. The difference lies in how the LP relaxation is solved. Variables are added to the LP relaxation only as needed. Variables are priced to find which ones should be added. CP is useful for solving the pricing problem, particularly when constraints are complex. CP-based branch and price has been successfully applied to airline crew scheduling, transit scheduling, and other transportation-related problems. May 2009 Slide 288

289 Basic Idea Suppose the LP relaxation of an integer programming problem has a huge number of variables: min cx subject to Ax = b, x ≥ 0. We solve a restricted master problem, which has a small subset J of the variables: min Σ_{j∈J} c_j x_j subject to Σ_{j∈J} A_j x_j = b (with dual multipliers λ), x_j ≥ 0 for j ∈ J, where A_j is column j of A. Adding x_k to the problem would improve the solution if x_k has a negative reduced cost: r_k = c_k − λA_k < 0. May 2009 Slide 289

290 Basic Idea Adding x_k to the problem would improve the solution if x_k has a negative reduced cost: r_k = c_k − λA_k < 0. Computing the reduced cost of x_k is known as pricing x_k. So we solve the pricing problem: min_y { c_y − λy : y is a column of A }, where c_y is the cost of column y. If the solution y* satisfies c_{y*} − λy* < 0, then we can add column y* to the restricted master problem. May 2009 Slide 290

291 Basic Idea The pricing problem min_y { c_y − λy : y is a column of A } need not be solved to optimality, so long as we find a column with negative reduced cost. However, when we can no longer find an improving column, we solve the pricing problem to optimality to make sure we have the optimal solution of the LP. If we can state constraints that the columns of A must satisfy, CP may be a good way to solve the pricing problem. May 2009 Slide 291

292 Example: Airline Crew Scheduling We want to assign crew members to flights to minimize cost while covering the flights and observing complex work rules. [Table: start and finish times of flights 1-6] A roster is the sequence of flights assigned to a single crew member. The gap between two consecutive flights in a roster must be from 2 to 3 hours. Total flight time for a roster must be between 6 and 10 hours. For example, flight 1 cannot immediately precede flight 6, and flight 4 cannot immediately precede flight 5. The possible rosters are: (1,3,5), (1,4,6), (2,3,5), (2,4,6). May 2009 Slide 292

293 Airline Crew Scheduling There are 2 crew members, and the possible rosters are: (1,3,5), (1,4,6), (2,3,5), (2,4,6). The LP relaxation minimizes total cost, where c_ik is the cost of assigning crew member i to roster k and x_ik = 1 if we assign crew member i to roster k, 0 otherwise. Each crew member is assigned to exactly 1 roster, and each flight is assigned at least 1 crew member. May 2009 Slide 293

294-300 Airline Crew Scheduling (continued) Slides 294 through 299 highlight, in the same LP relaxation, the columns (rosters) that cover flights 1 through 6, respectively. In a real problem, there can be millions of rosters. May 2009 Slides 294-300

301 Airline Crew Scheduling We start by solving the problem with a subset of the columns, obtaining an optimal dual solution with multipliers u1, u2 for the crew member constraints and v1, …, v6 for the flight constraints. May 2009 Slide 301

302 Airline Crew Scheduling The dual variables are u1, u2 (one per crew member constraint) and v1, …, v6 (one per flight constraint). May 2009 Slide 302

303 Airline Crew Scheduling The reduced cost of an excluded roster k for crew member i is c_ik − u_i − Σ_{j ∈ roster k} v_j. We will formulate the pricing problem as a shortest path problem. May 2009 Slide 303
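The pricing computation itself is a one-liner per roster. In the sketch below (ours), the dual values are hypothetical placeholders, since the slide's actual numbers are in a figure that did not survive transcription, as is the roster cost of 5:

```python
# Hypothetical dual values: u[i] for crew members, v[j] for flights.
u = {1: 3.0, 2: 2.0}
v = {1: 1.0, 2: 1.0, 3: 2.0, 4: 1.5, 5: 1.0, 6: 0.5}

rosters = [(1, 3, 5), (1, 4, 6), (2, 3, 5), (2, 4, 6)]

def reduced_cost(c_ik, i, roster):
    """r_ik = c_ik - u_i - sum of v_j over flights j in roster k."""
    return c_ik - u[i] - sum(v[j] for j in roster)

# Price all (crew member, roster) pairs, assuming a cost of 5 per roster:
for i in u:
    for k, roster in enumerate(rosters, start=1):
        r = reduced_cost(5.0, i, roster)
        if r < 0:
            print(f"add x_{i}{k} (reduced cost {r})")
```

In branch and price this enumeration is replaced by the shortest path model of the following slides, since real instances have far too many rosters to price explicitly.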

304 Pricing problem [Figure: a directed graph from source to sink for each of the 2 crew members, with a node for each flight] May 2009 Slide 304

305 Pricing problem Each s-t path in a crew member's graph corresponds to a roster, provided the flight time is within bounds. May 2009 Slide 305

306 Pricing problem An arc such as (1, 3) is labeled with the cost of flight 3 if it immediately follows flight 1, offset by the dual multiplier for flight 1. May 2009 Slide 306

307 Pricing problem The arc from home to flight 1 is labeled with the cost of transferring from home to flight 1, offset by the dual multiplier for crew member 1. In one of the two graphs this dual multiplier is omitted to break symmetry. May 2009 Slide 307

308 Pricing problem The length of a path is the reduced cost of the corresponding roster. May 2009 Slide 308

309 Pricing problem [Figure: the two graphs with arc lengths computed from the dual solution of the LP relaxation] May 2009 Slide 309

310 Pricing problem Solution of the shortest path problems: for crew member 1, the shortest path has negative reduced cost, so we add x_12 to the problem; for crew member 2, the shortest path also has negative reduced cost, so we add x_23 to the problem. After x_12 and x_23 are added to the problem, no remaining variable has negative reduced cost. May 2009 Slide 310

311 Pricing problem The shortest path problem cannot be solved by traditional shortest path algorithms, due to the bounds on total duration of flights. It can be solved by CP, using Path and Setsum global constraints: Path(X_i, z_i, G) for all i, where X_i ⊆ flights is the set of flights assigned to crew member i, z_i is the path length, and G is the graph; T_min ≤ Σ_{j∈X_i} (f_j − s_j) ≤ T_max for all i, where f_j − s_j is the duration of flight j; and z_i < 0 for all i. May 2009 Slide 311

312 CP-based Benders Decomposition Benders Decomposition in the Abstract Classical Benders Decomposition Example: Machine Scheduling May 2009 Slide 312

Integrating CP and Mathematical Programming

Integrating CP and Mathematical Programming Integrating CP and Mathematical Programming John Hooker Carnegie Mellon University June 2011 June 2011 Slide 1 Why Integrate CP and MP? Complementary strengths Computational advantages Outline of the Tutorial

More information

Operations Research Methods in Constraint Programming

Operations Research Methods in Constraint Programming Operations Research Methods in Constraint Programming John Hooker Carnegie Mellon University Prepared for Lloret de Mar, Spain June 2007 2007 Slide 1 CP and OR Have Complementary Strengths CP: Inference

More information

Combining Optimization and Constraint Programming

Combining Optimization and Constraint Programming Combining Optimization and Constraint Programming John Hooker Carnegie Mellon University GE Research Center 7 September 2007 GE 7 Sep 07 Slide 1 Optimization and Constraint Programming Optimization: focus

More information

A Framework for Integrating Optimization and Constraint Programming

A Framework for Integrating Optimization and Constraint Programming A Framework for Integrating Optimization and Constraint Programming John Hooker Carnegie Mellon University SARA 2007 SARA 07 Slide Underlying theme Model vs. solution method. How can we match the model

More information

Recent Developments in Integrated Methods for Optimization

Recent Developments in Integrated Methods for Optimization Recent Developments in Integrated Methods for Optimization John Hooker Carnegie Mellon University August 2012 Premise Constraint programming and mathematical programming can work together. There is underlying

More information

How to Relax. CP 2008 Slide 1. John Hooker Carnegie Mellon University September 2008

How to Relax. CP 2008 Slide 1. John Hooker Carnegie Mellon University September 2008 How to Relax Slide 1 John Hooker Carnegie Mellon University September 2008 Two ways to relax Relax your mind and body. Relax your problem formulations. Slide 2 Relaxing a problem Feasible set of original

More information

Integrating Solution Methods. through Duality. US-Mexico Workshop on Optimization. Oaxaca, January John Hooker

Integrating Solution Methods. through Duality. US-Mexico Workshop on Optimization. Oaxaca, January John Hooker Integrating Solution Methods through Duality John Hooker US-Meico Workshop on Optimization Oaaca, January 2011 Zapotec Duality mask Late classical period Represents life/death, day/night, heat/cold See

More information

Projection in Logic, CP, and Optimization

Projection in Logic, CP, and Optimization Projection in Logic, CP, and Optimization John Hooker Carnegie Mellon University Workshop on Logic and Search Melbourne, 2017 Projection as a Unifying Concept Projection is a fundamental concept in logic,

More information

An Integrated Approach to Truss Structure Design

An Integrated Approach to Truss Structure Design Slide 1 An Integrated Approach to Truss Structure Design J. N. Hooker Tallys Yunes CPAIOR Workshop on Hybrid Methods for Nonlinear Combinatorial Problems Bologna, June 2010 How to Solve Nonlinear Combinatorial

More information

Decision Diagrams for Sequencing and Scheduling

Decision Diagrams for Sequencing and Scheduling Decision Diagrams for Sequencing and Scheduling Willem-Jan van Hoeve Tepper School of Business Carnegie Mellon University www.andrew.cmu.edu/user/vanhoeve/mdd/ Plan What can MDDs do for Combinatorial Optimization?

More information

Alternative Methods for Obtaining. Optimization Bounds. AFOSR Program Review, April Carnegie Mellon University. Grant FA

Alternative Methods for Obtaining. Optimization Bounds. AFOSR Program Review, April Carnegie Mellon University. Grant FA Alternative Methods for Obtaining Optimization Bounds J. N. Hooker Carnegie Mellon University AFOSR Program Review, April 2012 Grant FA9550-11-1-0180 Integrating OR and CP/AI Early support by AFOSR First

More information

Single-Facility Scheduling by Logic-Based Benders Decomposition

Single-Facility Scheduling by Logic-Based Benders Decomposition Single-Facility Scheduling by Logic-Based Benders Decomposition Elvin Coban J. N. Hooker Tepper School of Business Carnegie Mellon University ecoban@andrew.cmu.edu john@hooker.tepper.cmu.edu September

More information

Lessons from MIP Search. John Hooker Carnegie Mellon University November 2009

Lessons from MIP Search. John Hooker Carnegie Mellon University November 2009 Lessons from MIP Search John Hooker Carnegie Mellon University November 2009 Outline MIP search The main ideas Duality and nogoods From MIP to AI (and back) Binary decision diagrams From MIP to constraint

More information

Scheduling Home Hospice Care with Logic-Based Benders Decomposition

John Hooker, Carnegie Mellon University. Joint work with Aliza Heching and Ryo Kimura. Compassionate Care Hospice / CMU / Lehigh University. October

An Integrated Method for Planning and Scheduling to Minimize Tardiness

J. N. Hooker, Carnegie Mellon University, john@hooker.tepper.cmu.edu. Abstract: We combine mixed integer linear programming (MILP) and constraint

Decision Diagrams: Tutorial

John Hooker, Carnegie Mellon University. CP Summer School, Cork, Ireland, June 2016. Decision diagrams have been used in computer science and AI for decades: logic circuit design, product configuration

Logic, Optimization and Data Analytics

John Hooker, Carnegie Mellon University. United Technologies Research Center, Cork, Ireland, August 2015. Thesis: logic and optimization have an underlying unity. Ideas

A Scheme for Integrated Optimization

John Hooker, ZIB, November 2009. Outline: overview of integrated methods; a taste of the underlying theory; examples, with results from SIMPL; proposal for SCIP/SIMPL

Decision Diagrams for Discrete Optimization

Willem Jan van Hoeve, Tepper School of Business, Carnegie Mellon University, www.andrew.cmu.edu/user/vanhoeve/mdd/. Acknowledgments: David Bergman, Andre Cire, Samid

Logic-Based Benders Decomposition

J. N. Hooker, Graduate School of Industrial Administration, Carnegie Mellon University, Pittsburgh, PA 15213, USA; G. Ottosson, Department of Information Technology, Computing
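The Benders entries above all follow the same loop: a master problem proposes values for some variables and a subproblem either certifies them or returns a cut. A minimal self-contained sketch of that loop (all data, the deadline, and the function names here are invented for illustration; real implementations use an MILP master and a CP subproblem):

```python
from itertools import product

# Toy logic-based Benders decomposition: assign jobs to machines at
# minimum cost, subject to a scheduling subproblem (each machine's total
# processing time must fit a deadline). All data invented.
times = [4, 3, 5, 4]                      # processing time of each job
cost = [[1, 3], [2, 2], [3, 1], [2, 3]]   # cost[j][m] = cost of job j on machine m
DEADLINE = 10

def master(cuts):
    """Cheapest assignment satisfying all nogood cuts (brute-force MILP stand-in)."""
    best = None
    for assign in product(range(2), repeat=len(times)):
        if any(all(assign[j] == m for j in jobs) for jobs, m in cuts):
            continue  # violates a Benders nogood cut
        c = sum(cost[j][assign[j]] for j in range(len(times)))
        if best is None or c < best[0]:
            best = (c, assign)
    return best

def subproblem(assign):
    """Check the schedule; return a nogood cut for an overloaded machine, else None."""
    for m in range(2):
        jobs = tuple(j for j in range(len(times)) if assign[j] == m)
        if sum(times[j] for j in jobs) > DEADLINE:
            return (jobs, m)  # cut: not all of these jobs on machine m
    return None

cuts = []
while True:
    obj, assign = master(cuts)
    cut = subproblem(assign)
    if cut is None:
        break
    cuts.append(cut)
print(obj, assign)
```

Here the subproblem is a trivial capacity check; in the papers listed above it is a full CP scheduling problem, and the Benders cuts are correspondingly stronger.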

Projection, Inference, and Consistency

John Hooker, Carnegie Mellon University. IJCAI 2016, New York City. A high-level presentation; don't worry about the details. Projection as a unifying concept: projection

The Separation Problem for Binary Decision Diagrams

J. N. Hooker, joint work with André Ciré, Carnegie Mellon University. ISAIM 2014. The separation problem in optimization: given a relaxation of an optimization

Integer Programming ISE 418. Lecture 8. Dr. Ted Ralphs

ISE 418, Lecture 8, Dr. Ted Ralphs. Reading for this lecture: Wolsey Chapter 2; Nemhauser and Wolsey Sections II.3.1, II.3.6, II.4.1, II.4.2, II.5.4. Duality for mixed-integer

DMP204 SCHEDULING, TIMETABLING AND ROUTING. Lecture 12: Single Machine Models, Column Generation

Outline: 1. Lagrangian relaxation; 2. Dantzig-Wolfe decomposition; delayed column

Operations Research as Empirical Modeling

John Hooker, Carnegie Mellon University, June 2009. OR as modeling. Current practice: research in OR is primarily about algorithms. Computational

Scheduling with Constraint Programming. Job Shop Cumulative Job Shop

CP vs MIP: task sequencing. We need to sequence a set of tasks on a machine; each task i has a specific fixed processing time p_i; each

Chapter 3: Discrete Optimization Integer Programming

Edoardo Amaldi, DEIB, Politecnico di Milano, edoardo.amaldi@polimi.it. Website: http://home.deib.polimi.it/amaldi/opt-16-17.shtml. Academic year 2016-17.

An Integrated Solver for Optimization Problems

Working Paper WPS-MAS-08-01, Department of Management Science, School of Business Administration, University of Miami, Coral Gables, FL 33124-8237. Created December 2005; last update July 2008.

Planning and Scheduling by Logic-Based Benders Decomposition

J. N. Hooker, Carnegie Mellon University, john@hooker.tepper.cmu.edu. July 2004, revised October 2005. Abstract: We combine mixed integer linear

Consistency for 0-1 Programming

Danial Davarnia (Iowa State University, davarnia@iastate.edu) and J. N. Hooker (Carnegie Mellon University, jh38@andrew.cmu.edu). arXiv:1812.02215v1 [cs.cc], 5 Dec 2018.

DM87 SCHEDULING, TIMETABLING AND ROUTING. Lecture 17: Workforce Scheduling

Marco Chiarandini. Outline: 1. Workforce scheduling; 2. Crew scheduling and rostering.

Resource Constrained Project Scheduling: Linear and Integer Programming (1)

DM204, 2010, SCHEDULING, TIMETABLING AND ROUTING, Lecture 3. Marco Chiarandini, Department of Mathematics & Computer Science, University of Southern

Chapter 3: Discrete Optimization Integer Programming

Edoardo Amaldi, DEIB, Politecnico di Milano, edoardo.amaldi@polimi.it. Website: http://home.deib.polimi.it/amaldi/ott-13-14.shtml. A.A. 2013-14.

Integer Programming (IP) and Mixed Integer Programming (MIP)

In many problems the decision variables must have integer values. Example: assign people, machines, and vehicles to activities in integer quantities. If this is
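To make the definition above concrete, a toy integer program can be solved by plain enumeration (the instance is invented for the sketch; real solvers use branch and bound or cutting planes instead):

```python
from itertools import product

# A tiny integer program solved by brute-force enumeration, just to
# illustrate the definition: maximize 3x + 2y subject to x + y <= 4,
# x + 3y <= 6, with x, y nonnegative integers. (Data invented.)
best_val, best_sol = None, None
for x, y in product(range(5), repeat=2):      # x, y in {0, ..., 4}
    if x + y <= 4 and x + 3 * y <= 6:         # feasibility check
        val = 3 * x + 2 * y
        if best_val is None or val > best_val:
            best_val, best_sol = val, (x, y)
print(best_val, best_sol)
```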

Stochastic Decision Diagrams

John Hooker. CORS/INFORMS, Montréal, June 2015. Objective: relaxed decision diagrams provide a general-purpose method for discrete optimization. When the problem has a dynamic programming

Combinatorial optimization problems

Heuristic Algorithms. Giovanni Righini, University of Milan, Department of Computer Science (Crema). In general, an optimization problem can be formulated as:

Consistency as Projection

John Hooker, Carnegie Mellon University. INFORMS 2015, Philadelphia, USA. Reconceive consistency in constraint programming as a form of projection. For example,

MVE165/MMG630, Applied Optimization Lecture 6 Integer linear programming: models and applications; complexity. Ann-Brith Strömberg

Ann-Brith Strömberg, 2011-04-01. Modelling with integer variables (Ch. 13.1). Variables: linear programming (LP) uses continuous

Introduction to optimization and operations research

David Pisinger, Fall 2002. 1. Smoked ham (Chvatal 1.6, adapted from Greene et al. (1957)): a meat packing plant produces 480 hams, 400 pork bellies, and

In the original knapsack problem, the value of the contents of the knapsack is maximized subject to a single capacity constraint, for example weight.

In the multi-dimensional knapsack problem, additional
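The single-constraint knapsack mentioned above has a classic dynamic-programming solution; a short sketch with invented data (a multi-dimensional variant would carry one DP dimension per capacity constraint, which is why it is much harder):

```python
def knapsack(values, weights, capacity):
    """0/1 knapsack optimum by dynamic programming over capacities."""
    best = [0] * (capacity + 1)
    for v, w in zip(values, weights):
        # iterate capacities downward so each item is used at most once
        for c in range(capacity, w - 1, -1):
            best[c] = max(best[c], best[c - w] + v)
    return best[capacity]

print(knapsack([10, 13, 8, 7], [3, 4, 2, 2], 6))
```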

Discrete (and Continuous) Optimization WI4 131

Kees Roos, Technische Universiteit Delft, Faculteit Electrotechniek, Wiskunde en Informatica, Afdeling Informatie, Systemen en Algoritmiek. E-mail: C.Roos@ewi.tudelft.nl

DMP204 SCHEDULING, TIMETABLING AND ROUTING. Lecture 3: Scheduling and Mixed Integer Programming

Marco Chiarandini. Outline: 1. Scheduling: CPM/PERT, resource constrained project scheduling model; 2. Mathematical programming.

Section Notes 9. IP: Cutting Planes. Applied Math 121. Week of April 12, 2010

Goals for the week: understand what a strong formulation is; be familiar with the cutting planes algorithm and the types of cuts

Travelling Salesman Problem

Fabio Furini, November 10th, 2014. Outline: 1. Traveling salesman problem, separation; 2. (Asymmetric) traveling salesman

The core of solving constraint problems using Constraint Programming (CP), with emphasis on:

What is it about? Theory: the core of solving constraint problems using Constraint Programming (CP), with emphasis on modeling and solving (local consistency and propagation; backtracking search + heuristics).

Duality in Optimization and Constraint Satisfaction

J. N. Hooker, Carnegie Mellon University, Pittsburgh, USA, john@hooker.tepper.cmu.edu. Abstract: We show that various duals that occur in optimization and

Integer Programming, Constraint Programming, and their Combination

Alexander Bockmayr, Freie Universität Berlin & DFG Research Center Matheon. Eindhoven, 27 January 2006. Discrete optimization: general framework

Projection, Consistency, and George Boole

John Hooker, Carnegie Mellon University. CP 2015, Cork, Ireland. Projection as a unifying concept: projection underlies both optimization and logical inference. Optimization

3.7 Cutting plane methods

Generic ILP problem min{c^T x : x ∈ X = {x ∈ Z^n_+ : Ax ≥ b}} with an m × n matrix A and a vector b of rationals. According to Meyer's theorem, there exists an ideal formulation: conv(X
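A concrete illustration of the cutting-plane idea from the snippet above: a Chvátal-Gomory rounding cut derived from a made-up inequality system, with the multipliers chosen by hand for the sketch:

```python
from math import floor

# Chvátal-Gomory rounding on an invented system Ax <= b, x >= 0 integer:
#   2x + 2y <= 3.  Multiplying by u = 1/2 gives x + y <= 1.5; since the
# left side is integral at every integer point, we may round down the
# coefficients and the right-hand side to get the valid cut x + y <= 1.
A = [[2, 2]]
b = [3]
u = [0.5]  # nonnegative multipliers

coeffs = [floor(sum(u[i] * A[i][j] for i in range(len(A)))) for j in range(2)]
rhs = floor(sum(u[i] * b[i] for i in range(len(b))))
print(coeffs, rhs)  # the cut is: coeffs . x <= rhs
```

The cut x + y <= 1 chops off the fractional LP solutions with x + y up to 1.5 while keeping every integer feasible point.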

Introduction to Mathematical Programming IE406. Lecture 21. Dr. Ted Ralphs

IE406, Lecture 21, Dr. Ted Ralphs. Reading for this lecture: Bertsimas Sections 10.2, 10.3, 11.1, 11.2. Branch and bound: branch

Production Management (Gestion de la production)

Book: Linear Programming, Vasek Chvatal, McGill University, W.H. Freeman and Company, New York, USA. Contents: 1. Integer Linear Programming; 1.1 Definitions and notations

Linear Programming Redux

Jim Bremer, May 12, 2008. The purpose of these notes is to review the basics of linear programming and the simplex method in a clear, concise, and comprehensive way. The book contains

Decision Procedures: An Algorithmic Point of View

Daniel Kroening and Ofer Strichman. ILP references: Integer Programming / Laurence Wolsey; Introduction to Mathematical Programming / Hillier, Lieberman. Deciding ILPs with branch & bound.

Computational Integer Programming. Lecture 2: Modeling and Formulation. Dr. Ted Ralphs

Dr. Ted Ralphs. Reading for this lecture: N&W Sections I.1.1-I.1.6; Wolsey Chapter 1; CCZ Chapter 2.

Optimization Bounds from Binary Decision Diagrams

J. N. Hooker, joint work with David Bergman, André Ciré, Willem van Hoeve, Carnegie Mellon University. ICS 203. Binary decision diagrams: BDDs historically

1 The linear algebra of linear programs (March 15 and 22, 2015)

Many optimization problems can be formulated as linear programs. The main features of a linear program are the following: variables are real

Bounds on the Traveling Salesman Problem

Sean Zachary Roberson, Texas A&M University, MATH 613, Graph Theory. A common routing problem is as follows: given a collection of stops (for example, towns, stations,

A Principled Approach to Mixed Integer/Linear Problem Formulation

J. N. Hooker, September 9, 2008. Abstract: We view mixed integer/linear problem formulation as a process of identifying disjunctive and knapsack

Solving Mixed-Integer Nonlinear Programs

(with SCIP.) Ambros M. Gleixner, Zuse Institute Berlin, MATHEON, Berlin Mathematical School. 5th Porto Meeting on Mathematics for Industry, April 10-11, 2014, Porto.

BBM402-Lecture 20: LP Duality

Lecturer: Lale Özkahya. Resources for the presentation: https://courses.engr.illinois.edu/cs473/fa2016/lectures.html. An easy LP, which is compact form for max cx subject to
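The weak-duality fact behind LP duality entries like the one above can be checked numerically; the primal/dual pair below is invented for the sketch:

```python
# Weak LP duality, checked numerically on a small invented pair:
#   primal: max 3*x1 + 2*x2  s.t.  x1 + x2 <= 4,  x1 <= 3,  x >= 0
#   dual:   min 4*y1 + 3*y2  s.t.  y1 + y2 >= 3,  y1 >= 2,  y >= 0
# Every feasible primal value is at most every feasible dual value.
c, b = [3, 2], [4, 3]
A = [[1, 1], [1, 0]]

def primal_feasible(x):
    return all(xj >= 0 for xj in x) and \
        all(sum(A[i][j] * x[j] for j in range(2)) <= b[i] for i in range(2))

def dual_feasible(y):
    # column j of A dotted with y must cover c[j]
    return all(yi >= 0 for yi in y) and \
        all(sum(A[i][j] * y[i] for i in range(2)) >= c[j] for j in range(2))

x, y = [3, 1], [2, 1]
assert primal_feasible(x) and dual_feasible(y)
primal_val = sum(cj * xj for cj, xj in zip(c, x))
dual_val = sum(bi * yi for bi, yi in zip(b, y))
print(primal_val, dual_val)
```

Here the two values coincide (both 11), so by weak duality this particular x and y are in fact optimal for their respective problems.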

4/12/2011. Chapter 8. NP and Computational Intractability. Directed Hamiltonian Cycle. Traveling Salesman Problem.

Claim: G has a Hamiltonian cycle iff G' does. Pf: Suppose G has a directed Hamiltonian cycle Γ. Then G' has an undirected Hamiltonian

Lecture 8: Column Generation

(3 units.) Outline: cutting stock problem; classical IP formulation; set covering formulation; column generation; a dual perspective; vehicle routing problem.

3.10 Lagrangian relaxation

Consider a generic ILP problem min{c^T x : Ax ≥ b, Dx ≥ d, x ∈ Z^n} with integer coefficients. Suppose Dx ≥ d are the complicating constraints. Often the linear relaxation and the
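A rough numerical sketch of the Lagrangian relaxation just described, on an invented two-variable ILP: the complicating constraint is dualized into the objective and the multiplier is adjusted by a textbook subgradient step (here the "easy" constraints are just box bounds, so the inner minimization is done by enumeration):

```python
from itertools import product

# Lagrangian relaxation on a tiny invented ILP:
#   min x1 + 2*x2   s.t.  x1 + x2 >= 3  (complicating),  0 <= x1, x2 <= 2 integer.
# Dualize the complicating constraint with multiplier lam >= 0 and push up
# the resulting lower bound L(lam) by a subgradient method.
c = [1, 2]
box = list(product(range(3), repeat=2))   # the "easy" constraints: 0..2 integer

def lagrangian(lam):
    """L(lam) = min over the box of c.x + lam*(3 - x1 - x2), with a minimizer."""
    return min((c[0] * x1 + c[1] * x2 + lam * (3 - x1 - x2), (x1, x2))
               for x1, x2 in box)

lam, best = 0.0, float("-inf")
for k in range(1, 101):
    val, (x1, x2) = lagrangian(lam)
    best = max(best, val)                  # best lower bound found so far
    g = 3 - x1 - x2                        # subgradient of L at lam
    lam = max(0.0, lam + g / k)            # diminishing step, projected to lam >= 0
print(round(best, 2))
```

For this instance the integer optimum is 4 (take x = (2, 1)), and the dual bound climbs toward 4 as the multiplier oscillates around its optimal value 2.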

5 Integer Linear Programming (ILP). E. Amaldi, Foundations of Operations Research, Politecnico di Milano

Definition: An Integer Linear Programming problem is an optimization problem of the form (ILP) min

Section Notes 8. Integer Programming II. Applied Math 121. Week of April 5, 2010

Goals for the week: understand IP relaxations; be able to determine the relative strength of formulations; expand your knowledge of big-M's and logical constraints; understand the branch

Benders decomposition [7, 17] uses a problem-solving strategy that can be generalized to a larger context. It assigns some of the variables trial valu

Logic-Based Benders Decomposition. J. N. Hooker, Graduate School of Industrial Administration, Carnegie Mellon University, Pittsburgh, PA 15213, USA; G. Ottosson, Department of Information Technology, Computing

3.4 Relaxations and bounds

Consider a generic discrete optimization problem z = min{c(x) : x ∈ X} with an optimal solution x* ∈ X. In general, the algorithms generate not only a decreasing sequence of upper
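The bounding role of relaxations described above can be seen on a tiny invented knapsack: in a maximization problem the fractional (LP) relaxation value always dominates the integer optimum:

```python
from itertools import product

# Relaxations give bounds: for a small invented maximization knapsack,
# the LP (fractional) relaxation value is an upper bound on the integer optimum.
values, weights, cap = [10, 13, 8], [3, 4, 2], 4

# integer optimum by enumeration
z_int = max(sum(v * x for v, x in zip(values, sel))
            for sel in product((0, 1), repeat=3)
            if sum(w * x for w, x in zip(weights, sel)) <= cap)

# LP relaxation by the greedy fractional rule (valid for a single knapsack)
items = sorted(zip(values, weights), key=lambda vw: vw[0] / vw[1], reverse=True)
room, z_lp = cap, 0.0
for v, w in items:
    take = min(1.0, room / w)   # take the whole item, or a fraction of it
    z_lp += take * v
    room -= take * w
print(z_int, z_lp)
```

Here the integer optimum is 13 while the relaxation value is 14.67, a strict gap: exactly the kind of bound a branch-and-bound search uses for pruning.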

21. Set cover and TSP

CS/ECE/ISyE 524, Introduction to Optimization, Spring 2017-18. Set covering; cutting problems and column generation; traveling salesman problem. Laurent Lessard (www.laurentlessard.com)
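As a small illustration of the set-covering topic above, the standard greedy heuristic on an invented instance: repeatedly pick the set covering the most still-uncovered elements, which gives the classical ln(n)-factor approximation guarantee:

```python
# Greedy heuristic for set covering (instance invented for the sketch).
universe = set(range(6))
sets = {"a": {0, 1, 2}, "b": {2, 3}, "c": {3, 4, 5}, "d": {0, 5}}

uncovered, chosen = set(universe), []
while uncovered:
    # pick the set with the largest gain in newly covered elements
    name = max(sets, key=lambda s: len(sets[s] & uncovered))
    chosen.append(name)
    uncovered -= sets[name]
print(chosen)
```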

Optimization Methods in Logic

John Hooker, Carnegie Mellon University. February 2003, revised December 2008. 1. Numerical semantics for logic: optimization can make at least two contributions to Boolean logic.

Introduction into Vehicle Routing Problems and other basic mixed-integer problems

Martin Branda, Charles University in Prague, Faculty of Mathematics and Physics, Department of Probability and Mathematical

CHAPTER 3: INTEGER PROGRAMMING

Overview: To this point, we have considered optimization problems with continuous design variables. That is, the design variables can take any value within a continuous feasible

Postoptimality Analysis Using Multivalued Decision Diagrams

Tarik Hadzic, Cork Constraint Computation Centre; J. N. Hooker, Carnegie Mellon University. London School of Economics, June 2008.

Graph Coloring Inequalities from All-different Systems

David Bergman and J. N. Hooker. Constraints (manuscript). Abstract: We explore the idea of

An Integer Cutting-Plane Procedure for the Dantzig-Wolfe Decomposition: Theory

Troels Martin Range. Discussion Papers on Business and Economics No. 10/2006. Further information: Department of Business

Extended Formulations, Lagrangian Relaxation, & Column Generation: tackling large scale applications

François Vanderbeck, University of Bordeaux, INRIA Bordeaux-Sud-Ouest. Part: Defining Extended Formulations

Single Machine Problems Polynomial Cases

DM204, 2011, SCHEDULING, TIMETABLING AND ROUTING, Lecture 2. Marco Chiarandini, Department of Mathematics & Computer Science, University of Southern Denmark. Outline

The Dual Simplex Algorithm

Primal optimal (dual feasible) and primal feasible (dual optimal) bases; the dual simplex tableau, dual optimality and the dual pivot rules; classical applications of linear

LP-Duality (Approximation Algorithms by V. Vazirani, Chapter 12)

Well-characterized problems, min-max relations, approximate certificates; LP problems in the standard form, primal and dual linear programs

Branch-and-Bound. Leo Liberti. LIX, École Polytechnique, France. INF431, 2011

Reminders: a decision problem is a question admitting a YES/NO answer. Example: HAMILTONIAN
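The branch-and-bound scheme sketched in the entry above can be illustrated on a small invented 0/1 knapsack, pruning any node whose LP-style fractional bound cannot beat the incumbent:

```python
# A minimal branch-and-bound sketch for a 0/1 knapsack (data invented):
# depth-first branching on items, pruning with the fractional (LP) bound.
# Items are pre-sorted by value/weight so the greedy fractional bound is valid.
items = sorted([(10, 3), (13, 4), (8, 2), (7, 2)],
               key=lambda vw: vw[0] / vw[1], reverse=True)
CAP = 6

def bound(i, room, val):
    """Upper bound: fill remaining room greedily, allowing one fractional item."""
    for v, w in items[i:]:
        if w <= room:
            room -= w
            val += v
        else:
            return val + v * room / w
    return val

best = 0
def branch(i, room, val):
    global best
    best = max(best, val)                 # update the incumbent
    if i == len(items) or bound(i, room, val) <= best:
        return  # prune: the relaxation cannot beat the incumbent
    v, w = items[i]
    if w <= room:
        branch(i + 1, room - w, val + v)  # branch: take item i
    branch(i + 1, room, val)              # branch: skip item i

branch(0, CAP, 0)
print(best)
```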

Introduction to optimization

Geir Dahl, CMA, Dept. of Mathematics and Dept. of Informatics, University of Oslo. The plan: 1. The basic concepts; 2. Some useful tools (linear programming = linear optimization)

Advanced Linear Programming: The Exercises

The answers are sometimes not written out completely. 1.5 a) min c^T x + d^T y subject to Ax + By ≥ b, y = |x| (1). First reformulation, using z, the smallest number satisfying |x| ≤ z

MVE165/MMG631 Linear and integer optimization with applications Lecture 8 Discrete optimization: theory and algorithms

Ann-Brith Strömberg, 2017-04-07. Lecture 8: Linear and integer optimization with applications

Integer program reformulation for robust branch-and-cut-and-price

Marcus Poggi de Aragão, Informática, PUC-Rio; Eduardo Uchoa, Engenharia de Produção, Universidade Federal Fluminense. Outline of the talk: robust

INTEGER PROGRAMMING. In many problems the decision variables must have integer values.

Example: assign people, machines, and vehicles to activities in integer quantities. If this is the

Lectures 6, 7 and part of 8

Uriel Feige, April 26, May 3, May 10, 2015. 1. Linear programming duality. 1.1 The diet problem revisited: recall the diet problem from Lecture 1. There are n foods, m nutrients,

Decomposition-based Methods for Large-scale Discrete Optimization

Matthew V. Galati and Ted K. Ralphs, Department of Industrial and Systems Engineering, Lehigh University, Bethlehem, PA, USA; Départment de Mathématiques

Integer Linear Programming (ILP)

Zdeněk Hanzálek, Přemysl Šůcha, hanzalek@fel.cvut.cz. CTU in Prague, March 8, 2017. Table of contents

Introduction. Very efficient solution procedure: simplex method.

LINEAR PROGRAMMING. Introduction: development of linear programming was among the most important scientific advances of the mid-20th century. Most common type of applications: allocate limited resources to competing

A First Look at Picking Dual Variables for Maximizing Reduced Cost Fixing

Omid Sanei Bajgiran, Andre A. Cire, and Louis-Martin Rousseau. TSpace Research Repository, tspace.library.utoronto.ca. Version: post-print/accepted.

A Global Constraint for Parallelizing the Execution of Task Sets in Non-Preemptive Scheduling

Michael Marte, Institut für Informatik, Universität München, Oettingenstr. 67, 80538 München. 1. Introduction:

Totally unimodular matrices. Introduction to integer programming III: Network Flow, Interval Scheduling, and Vehicle Routing Problems

Martin Branda, Charles University in Prague, Faculty of Mathematics and

A Search-Infer-and-Relax Framework for Integrating Solution Methods

Carnegie Mellon University Research Showcase @ CMU, Tepper School of Business, 2005. John N. Hooker, Carnegie Mellon University, john@hooker.tepper.cmu.edu

Reconnect 04 Introduction to Integer Programming

Cynthia Phillips, Sandia National Laboratories. (Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company.) Integer programming

COT 6936: Topics in Algorithms! Giri Narasimhan. ECS 254A / EC 2443; Phone: x3748

giri@cs.fiu.edu, https://moodle.cis.fiu.edu/v2.1/course/view.php?id=612. Gaussian elimination: solving a system of simultaneous

Lecture 9: Dantzig-Wolfe Decomposition

(3 units.) Outline: Dantzig-Wolfe decomposition; column generation algorithm; relation to Lagrangian dual; branch-and-price method; generalized assignment problem and multi-commodity

Technische Universität München, Zentrum Mathematik, Lehrstuhl für Angewandte Geometrie und Diskrete Mathematik. Combinatorial Optimization (MA 4502)

Dr. Michael Ritter. Problem Sheet 1, Homework Problems. Exercise

CS Algorithms and Complexity

CS 50 - Algorithms and Complexity: Linear Programming, the Simplex Method, and Hard Problems. Sean Anderson, 2/15/18, Portland State University. Table of contents: 1. The Simplex Method; 2. The Graph Problem

Integer Linear Programming Modeling

DM554/DM545, Lecture 9: Integer Linear Programming. Marco Chiarandini, Department of Mathematics & Computer Science, University of Southern Denmark. Outline: 1. Assignment problem; 2. Knapsack problem

CO759: Algorithmic Game Theory, Spring 2015

Instructor: Chaitanya Swamy. Assignment 1, due by Jun 25, 2015. You may use anything proved in class directly. I will maintain a FAQ about the assignment on the

Introduction to integer programming III: Network Flow, Interval Scheduling, and Vehicle Routing Problems

Martin Branda, Charles University in Prague, Faculty of Mathematics and Physics, Department of Probability