Operations Research Methods in Constraint Programming


1 Operations Research Methods in Constraint Programming. John Hooker, Carnegie Mellon University. Prepared for Lloret de Mar, Spain, June 2007. Slide 1

2 CP and OR Have Complementary Strengths. CP: inference methods, modeling, exploits local structure. OR: relaxation methods, duality theory, exploits global structure. Let's bring them together! 2007 Slide 2

3 How much background in OR is required by this tutorial? None! 2007 Slide 3

4 Background Reading This tutorial is based on: J. N. Hooker, Integrated Methods for Optimization, Springer (2007). J. N. Hooker, Operations research methods in constraint programming, in F. Rossi, P. van Beek and T. Walsh, eds., Handbook of Constraint Programming, Elsevier (2006), pp Slide 4

5 1. Introduction: Why OR in CP? 1.1 Advantages of Using OR Methods in CP 1.2 Schemes for Integrating OR into CP 1.3 Outline of the Tutorial 2007 Slide 5

6 1.1 Advantages of using OR methods in CP OR methods can exploit structure missed by CP. Many models have been analyzed in 50+ years of research. OR is good at problem relaxations. Linear programming, Lagrangean. OR relaxations can exploit global structure. Global relaxations rather than propagation. OR has developed a powerful duality theory. Linear programming duality, Lagrangean duality, etc. OR and CP have the same mission. Prescriptive modeling Slide 6

7 Computational Advantage of Integrating CP and OR. Using CP + relaxation from MILP. Focacci, Lodi, Milano (1999): lesson timetabling, 2 to 50 times faster than CP. Refalo (1999): piecewise linear costs, 2 to 200 times faster than MILP. Hooker & Osorio (1999): flow shop scheduling, etc., 4 to 150 times faster than MILP. Thorsteinsson & Ottosson (2001): product configuration, 30 to 40 times faster than CP, MILP. 2007 Slide 7

8 Computational Advantage of Integrating CP and MILP. Using CP + relaxation from MILP. Sellmann & Fahle (2001): automatic recording, 1 to 10 times faster than CP, MILP. Van Hoeve (2001): stable set problem, better than CP in less time. Bollapragada, Ghattas & Hooker (2001): structural design (nonlinear), up to 600 times faster than MILP; 2 problems: <6 min vs. >20 hrs for MILP. Beck & Refalo (2003): scheduling with earliness & tardiness costs, solved 67 of 90 instances, while CP solved only 12. 2007 Slide 8

9 Computational Advantage of Integrating CP and MILP. Using CP-based branch and price. Yunes, Moura & de Souza (1999): urban transit crew scheduling, optimal schedule for 210 trips vs. 120 for traditional branch and price. Easton, Nemhauser & Trick (2002): traveling tournament scheduling, first to solve the 8-team instance. 2007 Slide 9

10 Computational Advantage of Integrating CP and MILP. Using CP/MILP Benders methods. Jain & Grossmann (2001): min-cost planning & scheduling, 20 to 1000 times faster than CP, MILP. Thorsteinsson (2001): min-cost planning & scheduling, 10 times faster than Jain & Grossmann. Timpe (2002): polypropylene batch scheduling at BASF, solved a previously insoluble problem in 10 min. 2007 Slide 10

11 Computational Advantage of Integrating CP and MILP. Using CP/MILP Benders methods. Benoist, Gaudin, Rottembourg (2002): call center scheduling, solved twice as many instances as traditional Benders. Hooker (2004): min-cost, min-makespan planning & cumulative scheduling, … times faster than CP, MILP. Hooker (2005): min tardiness planning & cumulative scheduling, … times faster than CP, MILP. 2007 Slide 11

12 1.2 Schemes for integrating OR into CP. Problem relaxation in the form of an OR model. Solution of the relaxation can help reduce domains and guide the search. Exploit recursive structure. Dynamic programming, and nonserial DP (bucket elimination) Branch and price to combine CP and MILP. CP replaces traditional LP-based column generation. Generalized Benders methods. CP generates constraints (nogoods) that guide the search Slide 12

13 1.3 Outline of the Tutorial 1. Introduction: Why OR in CP? 2. Relaxation 3. Linear Programming 4. Mixed Integer Linear Modeling 5. Cutting Planes 6. Lagrangean Relaxation 7. Dynamic Programming 8. Branch and Price 9. Benders Decomposition Relax the CP problem with an OR-based model Exploit recursive structure Decompose problem into CP and OR parts 2007 Slide 13

14 Detailed Outline 1. Introduction: Why OR? 1.1 Advantages of Using OR Methods in CP 1.2 Schemes for Integrating OR into CP 1.3 Outline of the Tutorial 2. Relaxation 2.1 Why Relax? 2.2 Example: Freight Transfer 3. Linear Programming 3.1 Algebraic Analysis 3.2 Linear Programming Duality 3.3 LP-Based Domain Filtering 3.4 Example: Single-Vehicle Routing 2007 Slide 14

15 Detailed Outline 4. Mixed Integer/Linear Modeling 4.1 MILP Representability 4.2 Disjunctive Modeling 4.3 Knapsack Modeling 5. Cutting Planes 5.1 0-1 Knapsack Cuts 5.2 Gomory Cuts 5.3 Mixed Integer Rounding Cuts 5.4 Example: Product Configuration 6. Lagrangean Relaxation 6.1 Lagrangean Duality 6.2 Properties of the Lagrangean Dual 6.3 Example: Fast Linear Programming 6.4 Domain Filtering 6.5 Example: Continuous Global Optimization 2007 Slide 15

16 Detailed Outline 7. Dynamic Programming 7.1 Example: Capital Budgeting 7.2 Domain Filtering 7.3 Recursive Optimization 8. Branch and Price 8.1 Basic Idea 8.2 Example: Airline Crew Scheduling 9. Benders Decomposition 9.1 Benders Decomposition in the Abstract 9.2 Classical Benders Decomposition 9.3 Example: Machine Scheduling 2007 Slide 16

17 What we don't cover. Network models used in filtering algorithms for global constraints: generally covered in a discussion of global constraints. Heuristic and local search methods: a long history in OR. Stochastic optimization: stochastic programming, Markov decision models. 2007 Slide 17

18 2. Relaxation 2.1 Why Relax? 2.2 Example: Freight Transfer 2007 Slide 18

19 2.1 Why Relax? Solving a relaxation of a CP problem can: Tighten variable bounds. Possibly solve original problem. Guide the search in a promising direction. Filter domains using reduced costs or Lagrange multipliers. Prune the search tree using a bound on the optimal value. Provide a more global view, because a single OR relaxation can pool relaxations of several constraints Slide 19

20 Some OR models that can provide relaxations: Linear programming (LP). Mixed integer linear programming (MILP) Can itself be relaxed as an LP. LP relaxation can be strengthened with cutting planes. Lagrangean relaxation. Specialized relaxations. For particular problem classes. For global constraints Slide 20

21 2.2 Example: Freight Transfer. Transport 42 tons of freight using 8 trucks, which come in 4 sizes. Truck type: 1, 2, 3, 4. Number available: 3 of each type. Capacity (tons): 7, 5, 4, 3. Cost per truck: 90, 60, 50, 40. 2007 Slide 21

22 Model, where xi is the number of trucks of type i: min 90x1 + 60x2 + 50x3 + 40x4 subject to 7x1 + 5x2 + 4x3 + 3x4 ≥ 42 (knapsack covering constraint), x1 + x2 + x3 + x4 ≤ 8 (knapsack packing constraint), xi ∈ {0,1,2,3}. 2007 Slide 22

23 Bounds propagation: min 90x1 + 60x2 + 50x3 + 40x4 subject to 7x1 + 5x2 + 4x3 + 3x4 ≥ 42, x1 + x2 + x3 + x4 ≤ 8, xi ∈ {0,1,2,3}. From the covering constraint, x1 ≥ (42 − 5·3 − 4·3 − 3·3)/7 = 6/7, so x1 ≥ 1. 2007 Slide 23

24 Bounds propagation, reduced domain: min 90x1 + 60x2 + 50x3 + 40x4 subject to 7x1 + 5x2 + 4x3 + 3x4 ≥ 42, x1 + x2 + x3 + x4 ≤ 8, x1 ∈ {1,2,3}, x2, x3, x4 ∈ {0,1,2,3}. 2007 Slide 24
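The bounds propagation step above can be sketched in a few lines of Python. This is an illustrative sketch (not from the tutorial), using the freight transfer data; the function name is an invention for this example.

```python
from math import ceil

def propagate_covering(a, lo, hi, rhs):
    """Tighten lower bounds for a covering constraint a·x >= rhs with lo <= x <= hi:
    x_i >= ceil((rhs - sum_{j != i} a_j * hi_j) / a_i) when the slack is positive."""
    new_lo = list(lo)
    n = len(a)
    for i in range(n):
        slack = rhs - sum(a[j] * hi[j] for j in range(n) if j != i)
        if slack > 0:
            new_lo[i] = max(new_lo[i], ceil(slack / a[i]))
    return new_lo

# Freight transfer example: 7x1 + 5x2 + 4x3 + 3x4 >= 42, each x_i in {0,...,3}
print(propagate_covering([7, 5, 4, 3], [0] * 4, [3] * 4, 42))  # -> [1, 0, 0, 0]
```

The slack for x1 is 42 − (5 + 4 + 3)·3 = 6, giving the bound x1 ≥ ⌈6/7⌉ = 1, exactly as on the slide; no other bound improves.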

25 Continuous relaxation: min 90x1 + 60x2 + 50x3 + 40x4 subject to 7x1 + 5x2 + 4x3 + 3x4 ≥ 42, x1 + x2 + x3 + x4 ≤ 8, 1 ≤ x1 ≤ 3, 0 ≤ xi ≤ 3 for i = 2, 3, 4. Replace domains with bounds. This is a linear programming problem, which is easy to solve. Its optimal value provides a lower bound on the optimal value of the original problem. 2007 Slide 25

26 Cutting planes (valid inequalities): min 90x1 + 60x2 + 50x3 + 40x4 subject to 7x1 + 5x2 + 4x3 + 3x4 ≥ 42, x1 + x2 + x3 + x4 ≤ 8, 1 ≤ x1 ≤ 3, 0 ≤ xi ≤ 3. We can create a tighter relaxation (larger minimum value) with the addition of cutting planes. 2007 Slide 26

27 Cutting planes (valid inequalities). All feasible solutions of the original problem satisfy a cutting plane (i.e., it is valid). But a cutting plane may exclude ("cut off") solutions of the continuous relaxation. [Figure: the feasible solutions, the continuous relaxation, and a cutting plane separating part of the relaxation from the feasible solutions.] 2007 Slide 27

28 Cutting planes (valid inequalities): min 90x1 + 60x2 + 50x3 + 40x4 subject to 7x1 + 5x2 + 4x3 + 3x4 ≥ 42, x1 + x2 + x3 + x4 ≤ 8, 1 ≤ x1 ≤ 3, 0 ≤ xi ≤ 3. {1,2} is a packing, because 7x1 + 5x2 alone cannot satisfy the covering inequality, even with x1 = x2 = 3. 2007 Slide 28

29 Cutting planes (valid inequalities). Since {1,2} is a packing, 4x3 + 3x4 ≥ 42 − (7·3 + 5·3) = 6, which implies the knapsack cut x3 + x4 ≥ ⌈6 / max{4,3}⌉ = 2. 2007 Slide 29

30 Cutting planes (valid inequalities). Let xi have domain [Li, Ui] and let a ≥ 0. In general, a packing P for ax ≥ a0 satisfies Σ_{i∉P} ai xi ≥ a0 − Σ_{i∈P} ai Ui and generates the knapsack cut Σ_{i∉P} xi ≥ ⌈(a0 − Σ_{i∈P} ai Ui) / max_{i∉P}{ai}⌉. 2007 Slide 30

31 Cutting planes (valid inequalities): min 90x1 + 60x2 + 50x3 + 40x4 subject to 7x1 + 5x2 + 4x3 + 3x4 ≥ 42, x1 + x2 + x3 + x4 ≤ 8, 1 ≤ x1 ≤ 3, 0 ≤ xi ≤ 3. The maximal packings {1,2}, {1,3}, {1,4} give the knapsack cuts x3 + x4 ≥ 2, x2 + x4 ≥ 2, x2 + x3 ≥ 3. Knapsack cuts corresponding to nonmaximal packings can be nonredundant. 2007 Slide 31
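The general knapsack-cut formula above is mechanical enough to code directly. A minimal sketch (not from the tutorial; 0-based indices, so packing {1,2} is `{0, 1}`), which returns the variables in the cut and the cut's right-hand side:

```python
from math import ceil

def knapsack_cut(a, U, a0, P):
    """Knapsack cut generated by packing P for the covering constraint a·x >= a0
    with 0 <= x_i <= U_i: sum of x_i over i not in P >= rhs (the returned value)."""
    outside = [i for i in range(len(a)) if i not in P]
    residual = a0 - sum(a[i] * U[i] for i in P)
    return outside, ceil(residual / max(a[i] for i in outside))

a, U = [7, 5, 4, 3], [3, 3, 3, 3]
for P in [{0, 1}, {0, 2}, {0, 3}]:     # packings {1,2}, {1,3}, {1,4} in 0-based form
    print(knapsack_cut(a, U, 42, P))
# -> ([2, 3], 2)  i.e. x3 + x4 >= 2
# -> ([1, 3], 2)  i.e. x2 + x4 >= 2
# -> ([1, 2], 3)  i.e. x2 + x3 >= 3
```

The three outputs reproduce the cuts listed on the slide for the maximal packings.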

32 Continuous relaxation with cuts: min 90x1 + 60x2 + 50x3 + 40x4 subject to 7x1 + 5x2 + 4x3 + 3x4 ≥ 42, x1 + x2 + x3 + x4 ≤ 8, 1 ≤ x1 ≤ 3, 0 ≤ xi ≤ 3, plus the knapsack cuts x3 + x4 ≥ 2, x2 + x4 ≥ 2, x2 + x3 ≥ 3. The optimal value of this relaxation is a lower bound on the optimal value of the original problem. 2007 Slide 32

33 Branch-infer-and-relax tree. Propagate bounds and solve the relaxation of the original problem. Root node: x1 ∈ {1,2,3}, x2, x3, x4 ∈ {0,1,2,3}; relaxation solution x = (2⅓, 3, 2⅔, 0), value = 523⅓. 2007 Slide 33

34 Branch-infer-and-relax tree. Branch on a variable with nonintegral value in the relaxation: at the root (x = (2⅓, 3, 2⅔, 0), value 523⅓), branch on x1 ∈ {1,2} versus x1 = 3. 2007 Slide 34

35 Branch-infer-and-relax tree. Propagate bounds and solve the relaxation. In the branch x1 ∈ {1,2}, propagation gives x1 ∈ {1,2}, x2 ∈ {2,3}, x3 ∈ {1,2,3}, x4 ∈ {1,2,3}, and the relaxation is infeasible, so backtrack. 2007 Slide 35

36 Branch-infer-and-relax tree. Propagate bounds and solve the relaxation. In the branch x1 = 3, the relaxation gives x = (3, 2.6, 2, 0), value = 526. Branch on the nonintegral variable: x2 ∈ {0,1,2} versus x2 = 3. 2007 Slide 36

37 Branch-infer-and-relax tree. Branch again. In the branch x2 ∈ {0,1,2}: x1 = 3, x2 ∈ {0,1,2}, x3 ∈ {1,2,3}, x4 ∈ {0,1,2,3}; relaxation solution x = (3, 2, 2¾, 0), value = 527½. Branch on x3 ∈ {1,2} versus x3 = 3. 2007 Slide 37

38 Branch-infer-and-relax tree. In the branch x3 ∈ {1,2}: x1 = 3, x2 ∈ {1,2}, x3 ∈ {1,2}, x4 ∈ {1,2,3}; relaxation solution x = (3, 2, 2, 1), value = 530. The solution of the relaxation is integral and therefore feasible in the original problem. This becomes the incumbent solution. 2007 Slide 38

39 Branch-infer-and-relax tree. In the branch x3 = 3: x1 = 3, x2 ∈ {0,1,2}, x3 = 3, x4 ∈ {0,1,2}; relaxation solution x = (3, 1½, 3, ½), value = 530. The solution is nonintegral, but we can backtrack because the value of the relaxation is no better than the incumbent solution. 2007 Slide 39

40 Branch-infer-and-relax tree. In the branch x2 = 3: x1 = 3, x2 = 3, x3 ∈ {0,1,2}, x4 ∈ {0,1,2}; relaxation solution x = (3, 3, 0, 2), value = 530. Another feasible solution is found. It is no better than the incumbent solution, which is optimal because the search has finished. 2007 Slide 40

41 Two optimal solutions x = (3,2,2,1) x = (3,3,0,2) 2007 Slide 41
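Since the instance is tiny (four variables with four values each), the claim that these are exactly the two optima can be checked by exhaustive enumeration. A sketch, not part of the tutorial:

```python
from itertools import product

# Exhaustive check of the freight transfer example: 7x1+5x2+4x3+3x4 >= 42,
# x1+x2+x3+x4 <= 8, each x_i in {0,...,3}, cost 90x1+60x2+50x3+40x4.
best_cost, best_sols = None, []
for x in product(range(4), repeat=4):
    if 7*x[0] + 5*x[1] + 4*x[2] + 3*x[3] >= 42 and sum(x) <= 8:
        cost = 90*x[0] + 60*x[1] + 50*x[2] + 40*x[3]
        if best_cost is None or cost < best_cost:
            best_cost, best_sols = cost, [x]
        elif cost == best_cost:
            best_sols.append(x)

print(best_cost, best_sols)  # 530 [(3, 2, 2, 1), (3, 3, 0, 2)]
```

The enumeration confirms the optimal value 530 found by the branch-infer-and-relax tree and that (3,2,2,1) and (3,3,0,2) are the only optimal solutions.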

42 3. Linear Programming 3.1 Algebraic Analysis 3.2 Linear Programming Duality 3.3 LP-Based Domain Filtering 3.4 Example: Single-Vehicle Routing 2007 Slide 42

43 Motivation Linear programming is remarkably versatile for representing real-world problems. LP is by far the most widely used tool for relaxation. LP relaxations can be strengthened by cutting planes. - Based on polyhedral analysis. LP has an elegant and powerful duality theory. - Useful for domain filtering, and much else. The LP problem is extremely well solved Slide 43

44 3.1 Algebraic Analysis. An example: min 4x1 + 7x2 subject to 2x1 + 3x2 ≥ 6, 2x1 + x2 ≥ 4, x1, x2 ≥ 0. [Figure: the constraints 2x1 + 3x2 ≥ 6 and 2x1 + x2 ≥ 4, the objective contour 4x1 + 7x2 = 12, and the optimal solution x = (3,0).] 2007 Slide 44

45 Algebraic Analysis of LP. Rewrite min 4x1 + 7x2 subject to 2x1 + 3x2 ≥ 6, 2x1 + x2 ≥ 4, x1, x2 ≥ 0, using surplus variables, as min 4x1 + 7x2 subject to 2x1 + 3x2 − x3 = 6, 2x1 + x2 − x4 = 4, x1, x2, x3, x4 ≥ 0. In general an LP has the form min cx subject to Ax = b, x ≥ 0. 2007 Slide 45

46 Algebraic analysis of LP. Write min cx, Ax = b, x ≥ 0 as min cB xB + cN xN subject to B xB + N xN = b, xB, xN ≥ 0, where A = [B N] is an m×n matrix. Here xB are the basic variables and xN the nonbasic variables; B consists of any set of m linearly independent columns of A, which form a basis for the space spanned by the columns. 2007 Slide 46

47 Algebraic analysis of LP. Solve the constraint equation for xB: xB = B⁻¹b − B⁻¹N xN. All solutions can be obtained by setting xN to some value. The solution is basic if xN = 0. It is a basic feasible solution if xN = 0 and xB ≥ 0. 2007 Slide 47

48 Example: min 4x1 + 7x2 subject to 2x1 + 3x2 − x3 = 6, 2x1 + x2 − x4 = 4, x1, x2, x3, x4 ≥ 0. [Figure: the six basic solutions, one for each choice of two basic variables from {x1, x2, x3, x4}; the basic feasible solutions are the vertices of the feasible region.] 2007 Slide 48

49 Algebraic analysis of LP. Substituting xB = B⁻¹b − B⁻¹N xN, the cost can be expressed in terms of the nonbasic variables: cB B⁻¹b + (cN − cB B⁻¹N) xN. The vector cN − cB B⁻¹N is the vector of reduced costs. Since xN ≥ 0, the basic solution (xB, 0) is optimal if the reduced costs are nonnegative. 2007 Slide 49

50 Example: min 4x1 + 7x2 subject to 2x1 + 3x2 − x3 = 6, 2x1 + x2 − x4 = 4, x1, x2, x3, x4 ≥ 0. Consider the basic feasible solution with x1, x4 basic. 2007 Slide 50

51 Example. Write the problem as min cB xB + cN xN subject to B xB + N xN = b, with xB = (x1, x4), xN = (x2, x3), cB = [4 0], cN = [7 0], B = [[2 0], [2 −1]], N = [[3 −1], [1 0]], b = (6, 4). 2007 Slide 51

52 Example (continued). In B xB + N xN = b, the basis matrix is B = [[2 0], [2 −1]] (columns of x1, x4), N = [[3 −1], [1 0]] (columns of x2, x3), and b = (6, 4). 2007 Slide 52

53 Example. The basic solution is xB = B⁻¹b − B⁻¹N xN with xN = 0, i.e., (x1, x4) = B⁻¹b = [[1/2 0], [1 −1]] (6, 4) = (3, 2). So x = (3, 0), with x1, x4 basic. 2007 Slide 53

54 Example. The reduced costs are cN − cB B⁻¹N = [7 0] − [4 0][[1/2 0], [1 −1]][[3 −1], [1 0]] = [7 0] − [6 −2] = [1 2] ≥ [0 0], so the solution is optimal. 2007 Slide 54
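The basis calculation on these slides can be replayed in plain Python. A sketch (not from the tutorial; the 2×2 helper functions are inventions for this example):

```python
# Verify the basis calculation for min 4x1 + 7x2 s.t. 2x1 + 3x2 - x3 = 6,
# 2x1 + x2 - x4 = 4, with basis (x1, x4).

def inv2(m):
    """Inverse of a 2x2 matrix."""
    (a, b), (c, d) = m
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def matvec(m, v):
    return [sum(mi[j] * v[j] for j in range(len(v))) for mi in m]

def vecmat(v, m):
    return [sum(v[i] * m[i][j] for i in range(len(m))) for j in range(len(m[0]))]

B = [[2, 0], [2, -1]]        # columns of basic variables x1, x4
N = [[3, -1], [1, 0]]        # columns of nonbasic variables x2, x3
b, cB, cN = [6, 4], [4, 0], [7, 0]

Binv = inv2(B)
xB = matvec(Binv, b)         # basic solution (x1, x4) = (3, 2)
lam = vecmat(cB, Binv)       # dual solution lambda = cB B^{-1} = (2, 0)
reduced = [cN[j] - v for j, v in enumerate(vecmat(lam, N))]  # (1, 2) >= 0: optimal
```

The computed values match the slides: xB = (3, 2), λ = (2, 0) (as used later for duality), and reduced costs (1, 2), which are nonnegative, confirming optimality.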

55 3.2 Linear Programming Duality. An LP can be viewed as an inference problem: min {cx : Ax ≥ b, x ≥ 0} = max {v : Ax ≥ b and x ≥ 0 imply cx ≥ v}. Dual problem: find the tightest lower bound on the objective function that is implied by the constraints. 2007 Slide 55

56 An LP can be viewed as an inference problem: min {cx : Ax ≥ b, x ≥ 0} = max {v : Ax ≥ b and x ≥ 0 imply cx ≥ v}. That is, some surrogate (nonnegative linear combination) λAx ≥ λb of Ax ≥ b dominates cx ≥ v. From the Farkas Lemma: if Ax ≥ b, x ≥ 0 is feasible, then Ax ≥ b and x ≥ 0 imply cx ≥ v if and only if λA ≤ c and λb ≥ v for some λ ≥ 0. 2007 Slide 56

57 An LP can be viewed as an inference problem. Combining the two statements gives min {cx : Ax ≥ b, x ≥ 0} = max {λb : λA ≤ c, λ ≥ 0}. This is the classical LP dual. 2007 Slide 57

58 min {cx : Ax ≥ b, x ≥ 0} = max {λb : λA ≤ c, λ ≥ 0}, if Ax ≥ b, x ≥ 0 is feasible. This equality is called strong duality. Note that the dual of the dual is the primal (i.e., the original LP). 2007 Slide 58

59 Example. Primal: min 4x1 + 7x2 subject to 2x1 + 3x2 ≥ 6 (λ1), 2x1 + x2 ≥ 4 (λ2), x1, x2 ≥ 0. Dual: max 6λ1 + 4λ2 subject to 2λ1 + 2λ2 ≤ 4 (x1), 3λ1 + λ2 ≤ 7 (x2), λ1, λ2 ≥ 0. A dual solution is (λ1, λ2) = (2, 0): with these dual multipliers, the surrogate 4x1 + 6x2 ≥ 12 dominates 4x1 + 7x2 ≥ 12, the tightest bound on cost. 2007 Slide 59

60 Weak Duality. If x* is feasible in the primal problem min {cx : Ax ≥ b, x ≥ 0} and λ* is feasible in the dual problem max {λb : λA ≤ c, λ ≥ 0}, then cx* ≥ λ*b. This is because cx* ≥ λ*Ax* ≥ λ*b, where the first inequality holds since λ* is dual feasible and x* ≥ 0, and the second since x* is primal feasible and λ* ≥ 0. 2007 Slide 60

61 Dual multipliers as marginal costs. Suppose we perturb the RHS of an LP (i.e., change the requirement levels): min cx subject to Ax ≥ b + Δb, x ≥ 0. The dual of the perturbed LP has the same constraints as the original LP: max λ(b + Δb) subject to λA ≤ c, λ ≥ 0. So an optimal solution λ* of the original dual is feasible in the perturbed dual. 2007 Slide 61

62 Dual multipliers as marginal costs. By weak duality, the optimal value of the perturbed LP is at least λ*(b + Δb) = λ*b + λ*Δb, where λ*b is the optimal value of the original LP, by strong duality. So λi* is a lower bound on the marginal cost of increasing the i-th requirement by one unit (Δbi = 1). If λi* > 0, the i-th constraint must be tight (complementary slackness). 2007 Slide 62

63 Dual of an LP in equality form. Primal: min cB xB + cN xN subject to B xB + N xN = b (λ), xB, xN ≥ 0. Dual: max λb subject to λB ≤ cB (xB), λN ≤ cN (xN), λ unrestricted. 2007 Slide 63

64 Dual of an LP in equality form. Recall that the reduced cost vector is cN − cB B⁻¹N. The solution λ = cB B⁻¹ solves the dual if (xB, 0) solves the primal. 2007 Slide 64

65 Dual of an LP in equality form. Check that λ = cB B⁻¹ is dual feasible: λB = cB B⁻¹B = cB ≤ cB, and λN = cB B⁻¹N ≤ cN because the reduced cost cN − cB B⁻¹N is nonnegative at the optimal solution (xB, 0). 2007 Slide 65

66 Dual of an LP in equality form. In the example, λ = cB B⁻¹ = [4 0][[1/2 0], [1 −1]] = [2 0], which solves the dual since (xB, 0) solves the primal. 2007 Slide 66

67 Dual of an LP in equality form. Note that the reduced cost of an individual variable xj is rj = cj − λAj, where Aj is column j of A. 2007 Slide 67

68 3.3 LP-based Domain Filtering. Let min cx subject to Ax ≥ b, x ≥ 0 be an LP relaxation of a CP problem. One way to filter the domain of xj is to minimize and maximize xj subject to Ax ≥ b, x ≥ 0; this is time consuming. A faster method is to use dual multipliers to derive valid inequalities. A special case of this method uses reduced costs to bound or fix variables; reduced-cost variable fixing is a widely used technique in OR. 2007 Slide 68

69 Suppose: min cx subject to Ax ≥ b, x ≥ 0 has optimal solution x*, optimal value v*, and optimal dual solution λ*; and λi* > 0, which means the i-th constraint is tight (complementary slackness); and the LP is a relaxation of a CP problem; and we have a feasible solution of the CP problem with value U, so that U is an upper bound on the optimal value. 2007 Slide 69

70 Supposing min cx, Ax ≥ b, x ≥ 0 has optimal solution x*, optimal value v*, and optimal dual solution λ*: if x were to change to a value other than x*, the LHS of the i-th constraint Ai x ≥ bi would change by some amount Δbi. Since the constraint is tight, this would increase the optimal value as much as changing the constraint to Ai x ≥ bi + Δbi. So it would increase the optimal value by at least λi* Δbi. 2007 Slide 70

71 Supposing min cx, Ax ≥ b, x ≥ 0 has optimal solution x*, optimal value v*, and optimal dual solution λ*: we have found a change in x that changes Ai x by Δbi and increases the optimal value of the LP by at least λi* Δbi. Since the optimal value of the LP ≤ optimal value of the CP ≤ U, we have λi* Δbi ≤ U − v*, or Δbi ≤ (U − v*)/λi*. 2007 Slide 71

72 Supposing min cx, Ax ≥ b, x ≥ 0 has optimal solution x*, optimal value v*, and optimal dual solution λ*: since the i-th constraint is tight, Δbi = Ai x − Ai x* = Ai x − bi, so Δbi ≤ (U − v*)/λi* implies the inequality Ai x ≤ bi + (U − v*)/λi*, which can be propagated. 2007 Slide 72

73 Example: min 4x1 + 7x2 subject to 2x1 + 3x2 ≥ 6 (λ1 = 2), 2x1 + x2 ≥ 4 (λ2 = 0), x1, x2 ≥ 0, with optimal value v* = 12. Suppose we have a feasible solution of the original CP with value U = 13. Since the first constraint is tight, we can propagate the inequality 2x1 + 3x2 ≤ 6 + (U − v*)/λ1* = 6 + (13 − 12)/2 = 6½. 2007 Slide 73

74 Reduced-cost domain filtering. Suppose xj* = 0, which means the constraint xj ≥ 0 is tight. The inequality Ai x ≤ bi + (U − v*)/λi* becomes xj ≤ (U − v*)/rj. The dual multiplier for xj ≥ 0 is the reduced cost rj of xj, because increasing xj (currently 0) by 1 increases the optimal cost by rj. Similar reasoning can bound a variable below when it is at its upper bound. 2007 Slide 74

75 Example: min 4x1 + 7x2 subject to 2x1 + 3x2 ≥ 6 (λ1 = 2), 2x1 + x2 ≥ 4 (λ2 = 0), x1, x2 ≥ 0. Suppose we have a feasible solution of the original CP with value U = 13. Since x2* = 0, we have x2 ≤ (U − v*)/r2 = (13 − 12)/1 = 1. If x2 is required to be integer, we can fix it to zero. This is reduced-cost variable fixing. 2007 Slide 75
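The reduced-cost bound is one line of arithmetic; a sketch with the slide's numbers (the function name is an invention for this example):

```python
def reduced_cost_bound(U, v_star, r_j):
    """Bound on a nonbasic variable x_j (currently 0) with reduced cost r_j > 0,
    given an incumbent value U and LP optimum v*: x_j <= (U - v*) / r_j."""
    return (U - v_star) / r_j

# Slide's numbers: U = 13, v* = 12, r_2 = 1, so x2 <= 1. Any solution with
# x2 = 1 costs at least v* + r_2 = 13, no better than the incumbent.
print(reduced_cost_bound(13, 12, 1))  # -> 1.0
```

A larger reduced cost or a tighter incumbent shrinks the bound; e.g. with U = 12.5 the bound becomes 0.5, fixing an integer x2 to zero outright.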

76 3.4 Example: Single-Vehicle Routing. A vehicle must make several stops and return home, perhaps subject to time windows. The objective is to find the order of stops that minimizes travel time. This is also known as the traveling salesman problem (with time windows). Travel time cij from stop i to stop j. 2007 Slide 76

77 Assignment Relaxation: min Σij cij xij subject to Σj xij = Σj xji = 1 for all i, xij ∈ {0,1}, where xij = 1 if stop i immediately precedes stop j. Stop i is preceded and followed by exactly one stop. 2007 Slide 77

78 Assignment Relaxation: min Σij cij xij subject to Σj xij = Σj xji = 1 for all i, 0 ≤ xij ≤ 1 for all i, j. Because this problem is totally unimodular, it can be solved as an LP. The relaxation provides a very weak lower bound on the optimal value, but reduced-cost variable fixing can be very useful in a CP context. 2007 Slide 78

79 4. Mixed Integer/Linear Modeling 4.1 MILP Representability 4.2 Disjunctive Modeling 4.3 Knapsack Modeling 2007 Slide 79

80 Motivation. A mixed integer/linear programming (MILP) problem has the form min cx + dy subject to Ax + By ≥ b, x, y ≥ 0, y integer. We can relax a CP problem by modeling some constraints with an MILP. If desired, we can then relax the MILP by dropping the integrality constraint, to obtain an LP. The LP relaxation can be strengthened with cutting planes. The first step is to learn how to write MILP models. 2007 Slide 80

81 4.1 MILP Representability. A subset S of Rⁿ is MILP representable if it is the projection onto x of some MILP constraint set of the form Ax + Bu + Dy ≥ b, x ∈ Rⁿ, u ∈ Rᵐ, y ∈ {0,1}ᵏ. 2007 Slide 81

82 4.1 MILP Representability. A subset S of Rⁿ is MILP representable if it is the projection onto x of some MILP constraint set of the form Ax + Bu + Dy ≥ b, x ∈ Rⁿ, u ∈ Rᵐ, y ∈ {0,1}ᵏ. Theorem. S is MILP representable if and only if S is the union of finitely many polyhedra having the same recession cone. [Figure: a polyhedron and its recession cone.] 2007 Slide 82

83 Example: Fixed charge function. Minimize a fixed charge function: min x2 subject to x2 ≥ 0 if x1 = 0, x2 ≥ f + cx1 if x1 > 0, with x1 ≥ 0. 2007 Slide 83

84 Example. Minimize a fixed charge function. [Figure: the feasible set of the fixed charge problem in the (x1, x2) plane.] 2007 Slide 84

85 Example. Minimize a fixed charge function. The feasible set is the union of two polyhedra P1, P2. [Figure: P1, the half-line x1 = 0, x2 ≥ 0.] 2007 Slide 85

86 Example. Minimize a fixed charge function. The feasible set is the union of two polyhedra P1, P2. [Figure: P1 and P2, where P2 is the region above the line x2 = f + cx1 for x1 ≥ 0.] 2007 Slide 86

87 Example. Minimize a fixed charge function. The polyhedra have different recession cones. [Figure: P1, P2, and their distinct recession cones.] 2007 Slide 87

88 Example. Minimize a fixed charge function, with an upper bound added on x1: min x2 subject to x2 ≥ 0 if x1 = 0, x2 ≥ f + cx1 if x1 > 0, and 0 ≤ x1 ≤ M. Now the polyhedra have the same recession cone. [Figure: P1, P2, and their common recession cone.] 2007 Slide 88

89 Modeling a union of polyhedra. Start with a disjunction of linear systems to represent the union of polyhedra, where the k-th polyhedron is {x : Aᵏx ≥ bᵏ}: min cx subject to ∨k (Aᵏx ≥ bᵏ). Introduce a 0-1 variable yk that is 1 when x is in polyhedron k, and disaggregate x to create an xᵏ for each k: min cx subject to Aᵏxᵏ ≥ bᵏyk for all k, x = Σk xᵏ, Σk yk = 1, yk ∈ {0,1}. 2007 Slide 89

90 Example. Start with a disjunction of linear systems to represent the union of polyhedra: min x2 subject to (x1 = 0, x2 ≥ 0) ∨ (0 ≤ x1 ≤ M, x2 ≥ f + cx1). 2007 Slide 90

91 Example. Introduce a 0-1 variable yk that is 1 when x is in polyhedron k, and disaggregate x into x¹ + x²: min x2 subject to x1¹ = 0, x2¹ ≥ 0; 0 ≤ x1² ≤ My2, x2² ≥ fy2 + cx1²; y1 + y2 = 1, yk ∈ {0,1}; x = x¹ + x². 2007 Slide 91

92 Example. To simplify: replace x1² with x1, replace x2¹ + x2² with x2, and replace y2 with y. This yields min x2 subject to 0 ≤ x1 ≤ My, x2 ≥ fy + cx1, y ∈ {0,1}, or equivalently min fy + cx1 subject to 0 ≤ x1 ≤ My, y ∈ {0,1}, where M is the "big M". 2007 Slide 92

93 4.2 Disjunctive Modeling. Disjunctions often occur naturally in problems and can be given an MILP model. Recall that a disjunction of linear systems min cx subject to ∨k (Aᵏx ≥ bᵏ), representing polyhedra with the same recession cone, has the MILP model min cx subject to Aᵏxᵏ ≥ bᵏyk for all k, x = Σk xᵏ, Σk yk = 1, yk ∈ {0,1}. 2007 Slide 93

94 Example: Uncapacitated facility location. m possible factory locations, n markets. Locate factories to serve markets so as to minimize total fixed cost and transport cost. No limit on the production capacity of each factory. Fixed cost fi for location i; transport cost cij from location i to market j. 2007 Slide 94

95 Uncapacitated facility location. Disjunctive model: min Σi zi + Σij cij xij subject to, for each i, (xij = 0 for all j, zi ≥ 0) ∨ (0 ≤ xij ≤ 1 for all j, zi ≥ fi) (no factory at location i vs. factory at location i), and Σi xij = 1 for all j, where xij is the fraction of market j's demand satisfied from location i. 2007 Slide 95

96 Uncapacitated facility location. MILP formulation: min Σi fi yi + Σij cij xij subject to 0 ≤ xij ≤ yi for all i, j; Σi xij = 1 for all j; yi ∈ {0,1}. 2007 Slide 96

97 Uncapacitated facility location. Beginner's model: min Σi fi yi + Σij cij xij subject to Σj xij ≤ nyi for all i (n is the maximum output from location i); Σi xij = 1 for all j; yi ∈ {0,1}. It is based on the capacitated location model and has a weaker continuous relaxation (obtained by replacing yi ∈ {0,1} with 0 ≤ yi ≤ 1). This beginner's mistake can be avoided by starting with the disjunctive formulation. 2007 Slide 97

98 Tight Relaxations. The name of the game is to write a model with a tight continuous relaxation. This provides a tighter lower bound. Basic fact: the continuous relaxation of the disjunctive MILP model provides a convex hull relaxation of the disjunction, the tightest linear relaxation of the union of polyhedra. 2007 Slide 98

99 Tight Relaxations. To derive the convex hull relaxation of a disjunction min cx subject to ∨k (Aᵏx ≥ bᵏ), write each solution as a convex combination of points in the polyhedra: min cx subject to Aᵏx̄ᵏ ≥ bᵏ for all k, x = Σk yk x̄ᵏ, Σk yk = 1, 0 ≤ yk ≤ 1. The change of variable xᵏ = yk x̄ᵏ linearizes this: min cx subject to Aᵏxᵏ ≥ bᵏyk for all k, x = Σk xᵏ, Σk yk = 1, 0 ≤ yk ≤ 1, which is the continuous relaxation of the disjunctive MILP model. 2007 Slide 99

100 4.3 Knapsack Modeling. Knapsack models consist of knapsack covering and knapsack packing constraints. The freight transfer model presented earlier is an example. We will consider a similar example that combines disjunctive and knapsack modeling. Most OR professionals are unlikely to write a model as good as the one presented here. 2007 Slide 100

101 Note on tightness of knapsack models The continuous relaxation of a knapsack model is not in general a convex hull relaxation. - A disjunctive formulation would provide a convex hull relaxation, but there are exponentially many disjuncts. Knapsack cuts can significantly tighten the relaxation Slide 101

102 Example: Package transport. Each package j has size aj; each truck i has capacity Qi and costs ci to operate. Let yi = 1 if truck i is used and xij = 1 if truck i carries package j. Disjunctive model: min Σi ci yi subject to Σi Qi yi ≥ Σj aj (knapsack constraint); Σi xij = 1 for all j; and for each i, (yi = 1, Σj aj xij ≤ Qi, 0 ≤ xij ≤ 1 for all j) ∨ (yi = 0, xij = 0 for all j) (truck i used vs. truck i not used); xij, yi ∈ {0,1}. 2007 Slide 102

103 Example: Package transport. MILP model: min Σi ci yi subject to Σi Qi yi ≥ Σj aj; Σi xij = 1 for all j; Σj aj xij ≤ Qi yi for all i; xij ≤ yi for all i, j; xij, yi ∈ {0,1}. 2007 Slide 103

104 Example: Package transport. In the MILP model, the constraint xij ≤ yi is a modeling trick that is unobvious without the disjunctive approach. Most OR professionals would omit the constraint Σi Qi yi ≥ Σj aj, since it is the sum over i of the constraint Σj aj xij ≤ Qi yi (using Σi xij = 1). But it generates very effective knapsack cuts. 2007 Slide 104

105 5. Cutting Planes 5.1 0-1 Knapsack Cuts 5.2 Gomory Cuts 5.3 Mixed Integer Rounding Cuts 5.4 Example: Product Configuration 2007 Slide 105

106 To review A cutting plane (cut, valid inequality) for an MILP model: is valid - It is satisfied by all feasible solutions of the model. cuts off solutions of the continuous relaxation. - This makes the relaxation tighter. Cutting plane Continuous relaxation Feasible solutions 2007 Slide 106

107 Motivation. Cutting planes (cuts) tighten the continuous relaxation of an MILP model. Knapsack cuts are generated for individual knapsack constraints; we saw general integer knapsack cuts earlier. 0-1 knapsack cuts and lifting techniques are well studied and widely used. Rounding cuts are generated for the entire MILP and are widely used: Gomory cuts for integer variables only, mixed integer rounding cuts for any MILP. 2007 Slide 107

108 0-1 Knapsack Cuts. 0-1 knapsack cuts are designed for knapsack constraints with 0-1 variables. The analysis is different from that of general knapsack constraints, to exploit the special structure of 0-1 inequalities. 2007 Slide 108

109 0-1 Knapsack Cuts. Consider a 0-1 knapsack packing constraint ax ≤ a0. (Knapsack covering constraints are similarly analyzed.) Index set J is a cover if Σ_{j∈J} aj > a0. The cover inequality Σ_{j∈J} xj ≤ |J| − 1 is a 0-1 knapsack cut for ax ≤ a0. Only minimal covers need be considered. 2007 Slide 109

110 Example: J = {1,2,3,4} is a cover for 6x1 + 5x2 + 5x3 + 5x4 + 8x5 + 3x6 ≤ 17, since 6 + 5 + 5 + 5 = 21 > 17. This gives rise to the cover inequality x1 + x2 + x3 + x4 ≤ 3. Only minimal covers need be considered. 2007 Slide 110
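As a sanity check, the cover condition can be tested by enumeration (a minimal Python sketch; the data are taken from the slide):

```python
# Knapsack packing constraint from the slide:
# 6x1 + 5x2 + 5x3 + 5x4 + 8x5 + 3x6 <= 17
a = {1: 6, 2: 5, 3: 5, 4: 5, 5: 8, 6: 3}
a0 = 17

def is_cover(J):
    # J is a cover when its coefficients sum to more than the right-hand side
    return sum(a[j] for j in J) > a0

def is_minimal_cover(J):
    # minimal: dropping any one element destroys the cover property
    J = set(J)
    return is_cover(J) and all(not is_cover(J - {j}) for j in J)

J = {1, 2, 3, 4}
cover_rhs = len(J) - 1   # cover inequality: x1 + x2 + x3 + x4 <= 3
```

The same check confirms that any proper subset of J, e.g. {2,3,4} with weight 15, is not a cover.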

111 Sequential lifting: A cover inequality can often be strengthened by lifting it into a higher-dimensional space, that is, by adding variables. Sequential lifting adds one variable at a time. Sequence-independent lifting adds several variables at once. 2007 Slide 111

112 Sequential lifting: To lift a cover inequality Σ_{j∈J} x_j ≤ |J| − 1, add a term to the left-hand side: Σ_{j∈J} x_j + π_k x_k ≤ |J| − 1, where π_k is the largest coefficient for which the inequality is still valid. So

π_k = |J| − 1 − max { Σ_{j∈J} x_j : Σ_{j∈J} a_j x_j ≤ a_0 − a_k, x_j ∈ {0,1} for j ∈ J }

This can be done repeatedly (by dynamic programming). 2007 Slide 112

113 Example: Given 6x1 + 5x2 + 5x3 + 5x4 + 8x5 + 3x6 ≤ 17, to lift x1 + x2 + x3 + x4 ≤ 3, add the term π5 x5, where

π5 = 3 − max { x1 + x2 + x3 + x4 : 6x1 + 5x2 + 5x3 + 5x4 ≤ 17 − 8, x_j ∈ {0,1} for j ∈ {1,2,3,4} } = 3 − 1 = 2

This yields x1 + x2 + x3 + x4 + 2x5 ≤ 3. Further lifting (adding x6) leaves the cut unchanged. But if the variables are added in the order x6, x5, the result is different: x1 + x2 + x3 + x4 + x5 + x6 ≤ 3. 2007 Slide 113
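The lifting coefficients can be computed by brute-force enumeration, which is enough at this scale (a sketch reproducing the slide's order-dependent results; a real implementation would use dynamic programming as noted on the previous slide):

```python
from itertools import product

# Knapsack constraint 6x1 + 5x2 + 5x3 + 5x4 + 8x5 + 3x6 <= 17
# and cover inequality x1 + x2 + x3 + x4 <= 3 from the slides.
a = {1: 6, 2: 5, 3: 5, 4: 5, 5: 8, 6: 3}
a0 = 17

def lift(cut, rhs, k):
    """Lift x_k into the cut by enumeration:
    pi_k = rhs - max{ cut value : knapsack holds with x_k = 1 }."""
    support = sorted(cut)
    best = 0
    for xs in product((0, 1), repeat=len(support)):
        weight = sum(a[j] * x for j, x in zip(support, xs))
        if weight <= a0 - a[k]:
            best = max(best, sum(cut[j] * x for j, x in zip(support, xs)))
    new_cut = dict(cut)
    new_cut[k] = rhs - best
    return new_cut

cover = {1: 1, 2: 1, 3: 1, 4: 1}
lift_5_first = lift(lift(cover, 3, 5), 3, 6)   # add x5, then x6
lift_6_first = lift(lift(cover, 3, 6), 3, 5)   # add x6, then x5
```

The two orders give different cuts, as the slide observes: coefficients (2, 0) versus (1, 1) for (x5, x6).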

114 Sequence-independent lifting Sequence-independent lifting usually yields a weaker cut than sequential lifting. But it adds all the variables at once and is much faster. Commonly used in commercial MILP solvers Slide 114

115 Sequence-independent lifting: To lift a cover inequality Σ_{j∈J} x_j ≤ |J| − 1, add terms to the left-hand side: Σ_{j∈J} x_j + Σ_{k∉J} ρ(a_k) x_k ≤ |J| − 1, where, writing J = {1, …, p}, λ = Σ_{j∈J} a_j − a_0, A_j = a_1 + ⋯ + a_j, and A_0 = 0:

ρ(u) = j, if A_j ≤ u ≤ A_{j+1} − λ and j ∈ {0, …, p−1}
ρ(u) = j − (A_j − u)/λ, if A_j − λ ≤ u < A_j and j ∈ {1, …, p−1}
ρ(u) = p − (A_p − u)/λ, if A_p − λ ≤ u

2007 Slide 115

116 Example: Given 6x1 + 5x2 + 5x3 + 5x4 + 8x5 + 3x6 ≤ 17, to lift x1 + x2 + x3 + x4 ≤ 3, add the terms ρ(8) x5 + ρ(3) x6, where ρ(u) is given by the formula on the previous slide (here λ = 21 − 17 = 4). This yields the lifted cut x1 + x2 + x3 + x4 + (5/4) x5 + (1/4) x6 ≤ 3. 2007 Slide 116
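The lifting function ρ can be coded directly from the reconstructed piecewise formula above; exact arithmetic via `fractions` keeps the 5/4 and 1/4 recognizable:

```python
from fractions import Fraction

# Cover J = {1,2,3,4} for 6x1 + 5x2 + 5x3 + 5x4 + 8x5 + 3x6 <= 17
cover_weights = [6, 5, 5, 5]            # a_1..a_p for the cover, p = 4
a0 = 17
lam = sum(cover_weights) - a0           # lambda = 21 - 17 = 4
p = len(cover_weights)
A = [0]
for w in cover_weights:
    A.append(A[-1] + w)                 # partial sums A_0..A_p

def rho(u):
    """Sequence-independent lifting function (reconstructed from the slide)."""
    u = Fraction(u)
    for j in range(p):                  # flat pieces: rho(u) = j on [A_j, A_{j+1} - lambda]
        if A[j] <= u <= A[j + 1] - lam:
            return Fraction(j)
    for j in range(1, p):               # linear pieces on [A_j - lambda, A_j)
        if A[j] - lam <= u < A[j]:
            return j - Fraction(A[j] - u, lam)
    return p - Fraction(A[p] - u, lam)  # final piece for u >= A_p - lambda

lifted = {5: rho(8), 6: rho(3)}         # coefficients for x5 and x6
```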

117 5.2 Gomory Cuts: When an integer programming problem has a nonintegral solution of its continuous relaxation, we can generate at least one Gomory cut to cut off that solution. This is a special case of a separating cut, because it separates the current solution of the relaxation from the feasible set. Gomory cuts are widely used and very effective in MILP solvers. [Figure: a separating cut cutting off the solution of the continuous relaxation from the feasible solutions.] 2007 Slide 117

118 Gomory cuts: Given an integer programming problem min { cx : Ax = b, x ≥ 0 and integral }, let (x_B, 0) be an optimal solution of the continuous relaxation, where x_B = b̂ − N̂ x_N, b̂ = B⁻¹b, N̂ = B⁻¹N. Then if x_i is nonintegral in this solution, the following Gomory cut is violated by (x_B, 0):

x_i + ⌊N̂_i⌋ x_N ≤ ⌊b̂_i⌋

2007 Slide 118

119 Example: min 2x1 + 3x2 s.t. x1 + 3x2 ≥ 3, 4x1 + 3x2 ≥ 6, x1, x2 ≥ 0 and integral; or, with surplus variables, min 2x1 + 3x2 s.t. x1 + 3x2 − x3 = 3, 4x1 + 3x2 − x4 = 6, x_j ≥ 0 and integral. The optimal solution of the continuous relaxation has x_B = (x1, x2) = (1, 2/3), with

N̂ = [ 1/3  −1/3 ; −4/9  1/9 ],  b̂ = (1, 2/3)

2007 Slide 119

120 Example (continued): Since x2 = 2/3 is nonintegral, the Gomory cut x_i + ⌊N̂_i⌋ x_N ≤ ⌊b̂_i⌋ for i = 2 is

x2 + ⌊−4/9⌋ x3 + ⌊1/9⌋ x4 ≤ ⌊2/3⌋,  or  x2 − x3 ≤ 0

In (x1, x2) space, using x3 = x1 + 3x2 − 3, this is x1 + 2x2 ≥ 3. 2007 Slide 120
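The cut derivation amounts to rounding down one tableau row; a minimal sketch with exact rationals:

```python
from fractions import Fraction as F
from math import floor

# Row of the optimal simplex tableau for the basic variable x2:
#   x2 - (4/9) x3 + (1/9) x4 = 2/3
N_row = {3: F(-4, 9), 4: F(1, 9)}
b_hat = F(2, 3)

# Gomory cut: round down the nonbasic coefficients and the right-hand side
cut = {j: floor(c) for j, c in N_row.items()}   # gives x2 - x3 <= 0
rhs = floor(b_hat)

# Substituting the surplus definition x3 = x1 + 3x2 - 3 into x2 - x3 <= 0
# yields x1 + 2x2 >= 3 in the original variables.
```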

121 Example (continued): After the cut x1 + 2x2 ≥ 3 is added and the LP is re-solved, the solution is again fractional (b̂_1 = 2/3), and a further Gomory cut can be generated. [Figure: the Gomory cut x1 + 2x2 ≥ 3, and a second Gomory cut obtained after re-solving the LP with the previous cut.] 2007 Slide 121

122 5.3 Mixed Integer Rounding Cuts: Mixed integer rounding (MIR) cuts can be generated for solutions of any relaxed MILP in which one or more integer variables have a fractional value. Like Gomory cuts, they are separating cuts. MIR cuts are widely used in commercial solvers. 2007 Slide 122

123 MIR cuts: Given an MILP problem min { cx + dy : Ax + Dy = b, x, y ≥ 0, y integral }, let, in an optimal solution of the continuous relaxation, J = { j : y_j is nonbasic }, K = { j : x_j is nonbasic }, and N̂ = the nonbasic columns of [A D] expressed in the optimal basis. Then if y_i is nonintegral in this solution, the following MIR cut is violated by the solution of the relaxation:

y_i + Σ_{j∈J1} ⌈N̂_ij⌉ y_j + Σ_{j∈J2} ( ⌊N̂_ij⌋ + frac(N̂_ij)/frac(b̂_i) ) y_j + (1/frac(b̂_i)) Σ_{j∈K} N̂_ij x_j ≥ ⌈b̂_i⌉

where J1 = { j ∈ J : frac(N̂_ij) ≥ frac(b̂_i) } and J2 = J \ J1. 2007 Slide 123

124 Example: 3x1 + 4x2 − 6y1 − 4y2 = 1, x1 + 2x2 − y1 − y2 = 3, x, y ≥ 0, y_j integer. Take the basic solution (x1, y1) = (17/3, 8/3). The tableau row for y1 is y1 + (1/3) y2 + (2/3) x2 = 8/3, so J = {2}, K = {2}, J1 = ∅, J2 = {2}. The MIR cut is

y1 + ( ⌊1/3⌋ + frac(1/3)/frac(8/3) ) y2 + (1/frac(8/3)) (2/3) x2 ≥ ⌈8/3⌉,  or  y1 + (1/2) y2 + x2 ≥ 3

2007 Slide 124
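The MIR coefficients for this row can be computed mechanically from the (reconstructed) formula on the previous slide; `frac`, `J1`, and `J2` below follow that notation:

```python
from fractions import Fraction as F
from math import ceil, floor

def frac(q):
    return q - floor(q)

# Tableau row for the nonintegral basic variable y1 (from the example):
#   y1 + (1/3) y2 + (2/3) x2 = 8/3
b_hat = F(8, 3)
N_int = {2: F(1, 3)}    # coefficients of nonbasic integer variables y_j
N_cont = {2: F(2, 3)}   # coefficients of nonbasic continuous variables x_j

f0 = frac(b_hat)        # = 2/3
cut_y = {}
for j, c in N_int.items():
    if frac(c) >= f0:                   # j in J1: round up
        cut_y[j] = ceil(c)
    else:                               # j in J2: floor plus scaled fractional part
        cut_y[j] = floor(c) + frac(c) / f0
cut_x = {j: c / f0 for j, c in N_cont.items()}
rhs = ceil(b_hat)
# Resulting cut: y1 + (1/2) y2 + x2 >= 3
```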

125 5.4 Example: Product Configuration This example illustrates: Combination of propagation and relaxation. Processing of variable indices. Continuous relaxation of element constraint Slide 125

126 The problem: Choose what type of each component to install in a personal computer, and how many. [Figure: a personal computer assembled from memory chips, disk drives, and power supplies of various types.] 2007 Slide 126

127 Model of the problem:

min Σ_j c_j v_j
v_j = Σ_i q_i A_{ij t_i}, all j
L_j ≤ v_j ≤ U_j, all j

where c_j = unit cost of producing attribute j; v_j = amount of attribute j produced (< 0 if consumed): memory, heat, power, weight, etc.; A_{ij t_i} = amount of attribute j produced by type t_i of component i (t_i is a variable index); q_i = quantity of component i installed. 2007 Slide 127

128 To solve it: Branch on the domains of t_i and q_i. Propagate the element constraints and the bounds on v_j; the variable index is converted to a specially structured element constraint, and valid knapsack cuts are derived and propagated. Use linear continuous relaxations, with a special-purpose MILP relaxation for element. 2007 Slide 128

129 Propagation: In the model min Σ_j c_j v_j, v_j = Σ_i q_i A_{ij t_i} (all j), L_j ≤ v_j ≤ U_j (all j), the bound constraints L_j ≤ v_j ≤ U_j are propagated in the usual way. 2007 Slide 129

130 Propagation (continued): The constraint v_j = Σ_i q_i A_{ij t_i} is rewritten as

v_j = Σ_i z_ij, all j
element( t_i, (q_i A_{ij1}, …, q_i A_{ijn}), z_ij ), all i, j

The bounds L_j ≤ v_j ≤ U_j are propagated in the usual way. 2007 Slide 130

131 Propagation (continued): v_j = Σ_i z_ij, with element( t_i, (q_i A_{ij1}, …, q_i A_{ijn}), z_ij ) for all i, j. This can be propagated by (a) using specialized filters for element constraints of this form. 2007 Slide 131

132 Propagation (continued): This is propagated by (a) using specialized filters for element constraints of this form, (b) adding knapsack cuts for the valid inequalities

Σ_i max_{k ∈ D_{t_i}} { A_ijk } q_i ≥ v_j^L, all j
Σ_i min_{k ∈ D_{t_i}} { A_ijk } q_i ≤ v_j^U, all j

where [v_j^L, v_j^U] is the current domain of v_j, and (c) propagating the knapsack cuts. 2007 Slide 132

133 Relaxation: In the model min Σ_j c_j v_j, v_j = Σ_i q_i A_{ij t_i} (all j), L_j ≤ v_j ≤ U_j (all j), the bounds are relaxed as v_j^L ≤ v_j ≤ v_j^U, using the current domain of v_j. 2007 Slide 133

134 Relaxation (continued): The constraints v_j = Σ_i z_ij with element( t_i, (q_i A_{ij1}, …, q_i A_{ijn}), z_ij ) are relaxed by relaxing each element constraint and adding the knapsack cuts; the bounds are relaxed as v_j^L ≤ v_j ≤ v_j^U. 2007 Slide 134

135 Relaxation (continued): Each element constraint is replaced with a disjunctive convex hull relaxation:

z_ij = Σ_{k ∈ D_{t_i}} A_{ijk} q_ik,  q_i = Σ_{k ∈ D_{t_i}} q_ik

2007 Slide 135

136 So the following LP relaxation is solved at each node of the search tree to obtain a lower bound:

min Σ_j c_j v_j
v_j = Σ_i Σ_{k ∈ D_{t_i}} A_{ijk} q_ik, all j
q_i = Σ_{k ∈ D_{t_i}} q_ik, all i
v_j^L ≤ v_j ≤ v_j^U, all j
q_i^L ≤ q_i ≤ q_i^U, all i
knapsack cuts for Σ_i max_{k ∈ D_{t_i}} { A_ijk } q_i ≥ v_j^L, all j
knapsack cuts for Σ_i min_{k ∈ D_{t_i}} { A_ijk } q_i ≤ v_j^U, all j
q_ik ≥ 0, all i, k

2007 Slide 136

137 6. Lagrangean Relaxation 6.1 Lagrangean Duality 6.2 Properties of the Lagrangean Dual 6.3 Example: Fast Linear Programming 6.4 Domain Filtering 6.5 Example: Continuous Global Optimization 2007 Slide 137

138 Motivation Lagrangean relaxation can provide better bounds than LP relaxation. The Lagrangean dual generalizes LP duality. It provides domain filtering analogous to that based on LP duality. - This is a key technique in continuous global optimization. Lagrangean relaxation gets rid of troublesome constraints by dualizing them. - That is, moving them into the objective function. - The Lagrangean relaxation may decouple Slide 138

139 6.1 Lagrangean Duality: Consider an inequality-constrained problem min f(x) s.t. g(x) ≥ 0, x ∈ S, where g(x) ≥ 0 represents the hard constraints and x ∈ S the easy constraints. The object is to get rid of (dualize) the hard constraints by moving them into the objective function. 2007 Slide 139

140 Lagrangean Duality: The problem min { f(x) : g(x) ≥ 0, x ∈ S } is related to an inference problem: max { v : g(x) ≥ 0 implies f(x) ≥ v, for x ∈ S }. Lagrangean dual problem: find the tightest lower bound on the objective function that is implied by the constraints. 2007 Slide 140

141 Primal: min f(x), g(x) ≥ 0, x ∈ S. Dual: max { v : g(x) ≥ 0 implies f(x) ≥ v, for x ∈ S }. Let us say that the surrogate λg(x) ≥ 0 (for some λ ≥ 0) dominates f(x) − v ≥ 0 iff λg(x) ≤ f(x) − v for all x ∈ S; that is, v ≤ f(x) − λg(x) for all x ∈ S. 2007 Slide 141

142 Primal: min f(x), g(x) ≥ 0, x ∈ S. Dual: max { v : g(x) ≥ 0 implies f(x) ≥ v, for x ∈ S }. The surrogate λg(x) ≥ 0 dominates f(x) − v ≥ 0 iff v ≤ f(x) − λg(x) for all x ∈ S; or v ≤ min_{x ∈ S} { f(x) − λg(x) }. 2007 Slide 142

143 Primal: min f(x), g(x) ≥ 0, x ∈ S. So the dual becomes

max { v : v ≤ min_{x ∈ S} { f(x) − λg(x) } for some λ ≥ 0 }

2007 Slide 143

144 Now we have Primal: min f(x), g(x) ≥ 0 (these constraints are dualized), x ∈ S. Dual: max { v : v ≤ min_{x ∈ S} { f(x) − λg(x) } for some λ ≥ 0 }, or max_{λ ≥ 0} θ(λ), where the Lagrangean relaxation is

θ(λ) = min_{x ∈ S} { f(x) − λg(x) }

and λ is the vector of Lagrange multipliers. The Lagrangean dual can be viewed as the problem of finding the Lagrangean relaxation that gives the tightest bound. 2007 Slide 144

145 Example: min 3x1 + 4x2 s.t. −x1 + 3x2 ≥ 0, 2x1 + x2 − 5 ≥ 0, x1, x2 ∈ {0,1,2,3}. The Lagrangean relaxation is

θ(λ1, λ2) = min_{x_j ∈ {0,…,3}} { 3x1 + 4x2 − λ1(−x1 + 3x2) − λ2(2x1 + x2 − 5) }
          = min_{x_j ∈ {0,…,3}} { (3 + λ1 − 2λ2) x1 + (4 − 3λ1 − λ2) x2 + 5λ2 }

The Lagrangean relaxation is easy to solve for any given λ1, λ2: x1 = 0 if 3 + λ1 − 2λ2 ≥ 0 and 3 otherwise; x2 = 0 if 4 − 3λ1 − λ2 ≥ 0 and 3 otherwise. (The optimal solution of the original problem is (2,1).) 2007 Slide 145
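Because S is a small finite box here, θ(λ) can be evaluated by enumeration; a sketch verifying the dual value 9 2/7 and the duality gap:

```python
from fractions import Fraction as F
from itertools import product

# min 3x1 + 4x2  s.t.  -x1 + 3x2 >= 0,  2x1 + x2 - 5 >= 0,  x_j in {0,1,2,3}
def f(x):
    return 3 * x[0] + 4 * x[1]

def g(x):
    return (-x[0] + 3 * x[1], 2 * x[0] + x[1] - 5)

def theta(lam):
    """Lagrangean relaxation: minimize f(x) - lam . g(x) over the box."""
    return min(f(x) - lam[0] * g(x)[0] - lam[1] * g(x)[1]
               for x in product(range(4), repeat=2))

dual_opt = theta((F(5, 7), F(13, 7)))   # = 65/7 = 9 2/7
primal_opt = f((2, 1))                  # = 10; note the duality gap
```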

146 Example (continued): θ(λ1, λ2) is piecewise linear and concave. [Figure: contours of θ(λ) in the (λ1, λ2) plane, with levels θ(λ) = 0, 5, 7.5, and 9 2/7.] Solution of the Lagrangean dual: (λ1, λ2) = (5/7, 13/7), with θ(λ) = 9 2/7. The optimal solution of the original problem is (2,1), with value 10. Note the duality gap between 10 and 9 2/7 (no strong duality). 2007 Slide 146

147 Example (continued): Note that in this example the Lagrangean dual provides the same bound (9 2/7) as the continuous relaxation of the IP. This is because the Lagrangean relaxation

θ(λ1, λ2) = min { (3 + λ1 − 2λ2) x1 + (4 − 3λ1 − λ2) x2 + 5λ2 : 0 ≤ x_j ≤ 3 }

can be solved as an LP. Lagrangean duality is useful when the Lagrangean relaxation is tighter than an LP but nonetheless easy to solve. 2007 Slide 147

148 6.2 Properties of the Lagrangean dual. Weak duality: for any feasible x* and any λ* ≥ 0, f(x*) ≥ θ(λ*); in particular, min { f(x) : g(x) ≥ 0, x ∈ S } ≥ max_{λ ≥ 0} θ(λ). Concavity: θ(λ) is concave, so it can be maximized by local search methods. Complementary slackness: if x* and λ* are optimal and there is no duality gap, then λ*g(x*) = 0. 2007 Slide 148

149 Solving the Lagrangean dual: Let λ^k be the k-th iterate, and let λ^{k+1} = λ^k + α_k ξ^k, where ξ^k is a subgradient of θ(λ) at λ = λ^k. If x^k solves the Lagrangean relaxation for λ = λ^k, then ξ^k = −g(x^k), because θ(λ) = f(x^k) − λ g(x^k) at λ = λ^k. The stepsize α_k must be adjusted so that the sequence converges, but not before reaching a maximum. 2007 Slide 149
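Under the θ(λ) = min { f(x) − λg(x) } convention used here, ξ = −g(x^k) satisfies the defining subgradient inequality θ(μ) ≤ θ(λ) + ξ·(μ − λ) for every μ; a numerical check on the running example:

```python
from itertools import product

# Same example: min 3x1 + 4x2 with g(x) >= 0 dualized, x_j in {0,...,3}
def f(x):
    return 3 * x[0] + 4 * x[1]

def g(x):
    return (-x[0] + 3 * x[1], 2 * x[0] + x[1] - 5)

def theta_and_subgradient(lam):
    """Return theta(lam) and the subgradient xi = -g(x*) at a minimizer x*."""
    best_x = min(product(range(4), repeat=2),
                 key=lambda x: f(x) - lam[0] * g(x)[0] - lam[1] * g(x)[1])
    val = f(best_x) - lam[0] * g(best_x)[0] - lam[1] * g(best_x)[1]
    xi = (-g(best_x)[0], -g(best_x)[1])
    return val, xi

def subgradient_holds(lam, mu):
    # concavity check: theta(mu) <= theta(lam) + xi . (mu - lam)
    t_lam, xi = theta_and_subgradient(lam)
    t_mu, _ = theta_and_subgradient(mu)
    return t_mu <= t_lam + xi[0] * (mu[0] - lam[0]) + xi[1] * (mu[1] - lam[1]) + 1e-9
```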

150 6.3 Example: Fast Linear Programming In CP contexts, it is best to process each node of the search tree very rapidly. Lagrangean relaxation may allow very fast calculation of a lower bound on the optimal value of the LP relaxation at each node. The idea is to solve the Lagrangean dual at the root node (which is an LP) and use the same Lagrange multipliers to get an LP bound at other nodes Slide 150

151 At the root node, solve min { cx : Ax ≥ b (dualized, multipliers λ), Dx ≥ d (special structure, e.g. variable bounds), x ≥ 0 }. The (partial) LP dual solution λ* solves the Lagrangean dual in which

θ(λ) = min { cx − λ(Ax − b) : Dx ≥ d, x ≥ 0 }

2007 Slide 151

152 At another node, the LP is min { cx : Ax ≥ b, Dx ≥ d, Hx ≥ h (branching constraints, etc.), x ≥ 0 }. Here θ(λ*), computed with the branching constraints kept in the easy set, is still a lower bound on the optimal value of this LP and can be quickly calculated by solving a specially structured LP:

θ(λ*) = min { cx − λ*(Ax − b) : Dx ≥ d, Hx ≥ h, x ≥ 0 }

2007 Slide 152

153 6.4 Domain Filtering: Suppose min { f(x) : g(x) ≥ 0, x ∈ S } has optimal solution x*, optimal value v*, and optimal Lagrangean dual solution λ*; that λ_i* > 0, which means the i-th constraint is tight (complementary slackness); that the problem is a relaxation of a CP problem; and that we have a feasible solution of the CP problem with value U, so that U is an upper bound on the optimal value. 2007 Slide 153

154 Supposing min { f(x) : g(x) ≥ 0, x ∈ S } has optimal solution x*, optimal value v*, and optimal Lagrangean dual solution λ*: If x were to change to a value other than x*, the left-hand side of the i-th constraint g_i(x) ≥ 0 would change by some amount Δ_i. Since the constraint is tight, this is as restrictive as changing the constraint to g_i(x) − Δ_i ≥ 0, so it would increase the optimal value by at least λ_i* Δ_i. (It is easily shown that Lagrange multipliers are marginal costs; dual multipliers for LP are a special case of Lagrange multipliers.) 2007 Slide 154

155 We have found: a change in x that changes g_i(x) by Δ_i increases the optimal value by at least λ_i* Δ_i. Since the optimal value v* of this problem ≤ the optimal value of the CP problem ≤ U, we have λ_i* Δ_i ≤ U − v*, or

Δ_i ≤ (U − v*) / λ_i*

2007 Slide 155

156 Since Δ_i = g_i(x) − g_i(x*) = g_i(x), this implies the inequality

g_i(x) ≤ (U − v*) / λ_i*

which can be propagated. 2007 Slide 156
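Applied to the integer programming example of Section 6.1 (dual bound θ* = 9 2/7 with λ2* = 13/7, and assuming an incumbent of value U = 10 is known), the inequality removes a domain value; a sketch:

```python
from fractions import Fraction as F

# Example from 6.1: min 3x1 + 4x2, -x1 + 3x2 >= 0, 2x1 + x2 - 5 >= 0,
# x_j in {0,...,3}.  Dual bound 65/7 with lambda* = (5/7, 13/7); assume
# an incumbent with value U = 10.
theta_star = F(65, 7)
U = 10
lam2 = F(13, 7)

# Filtering inequality for the tight second constraint:
#   g2(x) = 2x1 + x2 - 5  <=  (U - theta*) / lambda2*
bound = (U - theta_star) / lam2          # = 5/13

def survives(x1):
    """Keep x1 if some x2 in {0..3} satisfies feasibility and the filter."""
    for x2 in range(4):
        g1 = -x1 + 3 * x2
        g2 = 2 * x1 + x2 - 5
        if g1 >= 0 and g2 >= 0 and g2 <= bound:
            return True
    return False

filtered_domain = {x1 for x1 in range(4) if survives(x1)}
```

Here x1 = 3 is removed: any feasible point with x1 = 3 has g2(x) ≥ 1 > 5/13, so it cannot improve on the incumbent.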

157 6.5 Example: Continuous Global Optimization Some of the best continuous global solvers (e.g., BARON) combine OR-style relaxation with CP-style interval arithmetic and domain filtering. The use of Lagrange multipliers for domain filtering is a key technique in these solvers Slide 157

158 Continuous Global Optimization: max x1 + x2 s.t. 4x1x2 = 1, 2x1 + x2 ≤ 2, x1 ∈ [0,1], x2 ∈ [0,2]. [Figure: the feasible set (a curve segment), the global optimum, and a local optimum in the (x1, x2) plane.] 2007 Slide 158

159 To solve it: Search: split interval domains of x 1, x 2. Each node of search tree is a problem restriction. Propagation: Interval propagation, domain filtering. Use Lagrange multipliers to infer valid inequality for propagation. Reduced-cost variable fixing is a special case. Relaxation: Use function factorization to obtain linear continuous relaxation Slide 159

160 Interval propagation: Propagate the intervals x1 ∈ [0,1], x2 ∈ [0,2] through the constraints to obtain x1 ∈ [1/8, 7/8], x2 ∈ [1/4, 7/4]. [Figure: the reduced box in the (x1, x2) plane.] 2007 Slide 160

161 Relaxation (function factorization): Factor complex functions into elementary functions that have known linear relaxations. Write 4x1x2 = 1 as 4y = 1 where y = x1x2. This factors 4x1x2 into the linear function 4y and the bilinear function x1x2. The linear function 4y is its own linear relaxation. 2007 Slide 161

162 Relaxation (function factorization): The bilinear function y = x1x2 has the (McCormick) relaxation:

x2^L x1 + x1^L x2 − x1^L x2^L ≤ y ≤ x2^L x1 + x1^U x2 − x1^U x2^L
x2^U x1 + x1^U x2 − x1^U x2^U ≤ y ≤ x2^U x1 + x1^L x2 − x1^L x2^U

where the domain of x_j is [x_j^L, x_j^U]. 2007 Slide 162
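These are the McCormick inequalities; a quick numerical check that they bracket the true product on the slide's box [0,1] × [0,2]:

```python
# McCormick envelope for y = x1 * x2 on the box [xL1, xU1] x [xL2, xU2]
def mccormick_bounds(x1, x2, xL1, xU1, xL2, xU2):
    lower = max(xL2 * x1 + xL1 * x2 - xL1 * xL2,
                xU2 * x1 + xU1 * x2 - xU1 * xU2)
    upper = min(xL2 * x1 + xU1 * x2 - xU1 * xL2,
                xU2 * x1 + xL1 * x2 - xL1 * xU2)
    return lower, upper

def sandwiched(x1, x2):
    # the envelope must contain the true product at every point of the box
    lo, hi = mccormick_bounds(x1, x2, 0.0, 1.0, 0.0, 2.0)
    return lo <= x1 * x2 <= hi
```

Each inequality follows from expanding a product of signed box constraints, e.g. (x1 − x1^L)(x2 − x2^L) ≥ 0 gives the first lower bound; the envelope is exact at the box vertices.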

163 Relaxation (function factorization): The linear relaxation becomes:

max x1 + x2
4y = 1
2x1 + x2 ≤ 2
x2^L x1 + x1^L x2 − x1^L x2^L ≤ y ≤ x2^L x1 + x1^U x2 − x1^U x2^L
x2^U x1 + x1^U x2 − x1^U x2^U ≤ y ≤ x2^U x1 + x1^L x2 − x1^L x2^U
x_j^L ≤ x_j ≤ x_j^U, j = 1, 2

2007 Slide 163

164 Relaxation (function factorization): Solve the linear relaxation. [Figure: the relaxation's feasible polytope and its solution in the (x1, x2) plane.] 2007 Slide 164

165 Relaxation (function factorization): Solve the linear relaxation. Since its solution is infeasible (it violates 4x1x2 = 1), split an interval and branch: x2 ∈ [1, 1.75] and x2 ∈ [0.25, 1]. 2007 Slide 165

166 [Figure: the two branches x2 ∈ [1, 1.75] and x2 ∈ [0.25, 1] shown in the (x1, x2) plane.] 2007 Slide 166

167 In the branch x2 ∈ [0.25, 1], the solution of the relaxation is feasible, with value = 1.25. This becomes the incumbent solution. 2007 Slide 167

168 In the branch x2 ∈ [1, 1.75], the solution of the relaxation is not quite feasible, with value ≈ 1.854. Also use Lagrange multipliers for domain filtering. 2007 Slide 168

169 Relaxation (function factorization): In the LP relaxation max x1 + x2 s.t. 4y = 1, 2x1 + x2 ≤ 2, the McCormick inequalities, and x_j^L ≤ x_j ≤ x_j^U, the Lagrange multiplier associated with 2x1 + x2 ≤ 2 in the solution of the relaxation is λ2 = 1.1. 2007 Slide 169

170 Relaxation (function factorization): The multiplier λ2 = 1.1 on 2x1 + x2 ≤ 2 yields a valid inequality for propagation:

2x1 + x2 ≥ 2 − (1.854 − 1.25) / 1.1

where 1.854 is the value of the relaxation, 1.25 the value of the incumbent solution, and 1.1 the Lagrange multiplier. 2007 Slide 170

171 7. Dynamic Programming 7.1 Example: Capital Budgeting 7.2 Domain Filtering 7.3 Recursive Optimization 2007 Slide 171

172 Motivation Dynamic programming (DP) is a highly versatile technique that can exploit recursive structure in a problem. Domain filtering is straightforward for problems modeled as a DP. DP is also important in designing filters for some global constraints, such as the stretch constraint (employee scheduling). Nonserial DP is related to bucket elimination in CP and exploits the structure of the primal graph. DP modeling is the art of keeping the state space small while maintaining a Markovian property. We will examine only one simple example of serial DP Slide 172

173 7.1 Example: Capital Budgeting. We wish to build power plants with a total cost of at most 12 million Euros. There are three types of plants, costing 4, 2, and 3 million Euros each. We must build one or two of each type. The problem has a simple knapsack packing model: 4x1 + 2x2 + 3x3 ≤ 12, x_j ∈ {1,2}, where x_j is the number of plants of type j. 2007 Slide 173

174 Example: Capital Budgeting. 4x1 + 2x2 + 3x3 ≤ 12, x_j ∈ {1,2}. In general the recursion for ax ≤ b is

f_k(s_k) = max_{x_k ∈ D_{x_k}} f_{k+1}(s_k + a_k x_k)

where the state s_k at stage k is the partial sum a_1 x_1 + ⋯ + a_{k−1} x_{k−1}, and f_k(s_k) = 1 if there is a path from state s_k to a feasible solution, 0 otherwise. For example, f_3(8) = max{ f_4(8 + 3·1), f_4(8 + 3·2) } = max{ f_4(11), f_4(14) } = max{1, 0} = 1. 2007 Slide 174

175 Example: Capital Budgeting. 4x1 + 2x2 + 3x3 ≤ 12, x_j ∈ {1,2}. Boundary condition: f_{n+1}(s_{n+1}) = 1 if s_{n+1} ≤ b, 0 otherwise. [Figure: the state transition graph, labeled with f_k(s_k) for each state s_k.] 2007 Slide 175

176 Example: Capital Budgeting. 4x1 + 2x2 + 3x3 ≤ 12, x_j ∈ {1,2}. The problem is feasible, and each path through the state graph to a feasible terminal state is a feasible solution: Path 1: x = (1,2,1); Path 2: x = (1,1,2); Path 3: x = (1,1,1). The possible costs are 9, 11, and 12. 2007 Slide 176
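The feasibility recursion and the path enumeration can be sketched in a few lines (memoized recursion rather than an explicit state graph):

```python
from functools import lru_cache

# 4x1 + 2x2 + 3x3 <= 12 with x_j in {1,2}: serial DP over partial sums
a = [4, 2, 3]
b = 12
domain = (1, 2)

@lru_cache(maxsize=None)
def f(k, s):
    """1 if some completion x_k,...,x_n keeps the partial sum s within b."""
    if k == len(a):
        return 1 if s <= b else 0
    return max(f(k + 1, s + a[k] * x) for x in domain)

def feasible_paths():
    # walk the DP graph, extending a prefix only through states with f = 1
    paths, stack = [], [((), 0)]
    while stack:
        prefix, s = stack.pop()
        if len(prefix) == len(a):
            paths.append(prefix)
            continue
        for x in domain:
            if f(len(prefix) + 1, s + a[len(prefix)] * x):
                stack.append((prefix + (x,), s + a[len(prefix)] * x))
    return paths
```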

177 7.2 Domain Filtering: 4x1 + 2x2 + 3x3 ≤ 12, x_j ∈ {1,2}. To filter domains, observe which values of x_k occur on feasible paths: D_{x1} = {1}, D_{x2} = {1,2}, D_{x3} = {1,2}. 2007 Slide 177

178 7.3 Recursive Optimization: max 15x1 + 10x2 + 12x3 (maximize revenue) s.t. 4x1 + 2x2 + 3x3 ≤ 12, x_j ∈ {1,2}. The recursion includes arc values:

f_k(s_k) = max_{x_k ∈ D_{x_k}} { c_k x_k + f_{k+1}(s_k + a_k x_k) }

where f_k(s_k) is the value on a maximum-value path from s_k to the final stage (the value to go). For example, f_3(8) = max{ 12·1 + f_4(8 + 3·1), 12·2 + f_4(8 + 3·2) } = max{ 12 + 0, 24 − ∞ } = 12. 2007 Slide 178

179 Recursive optimization: max 15x1 + 10x2 + 12x3 s.t. 4x1 + 2x2 + 3x3 ≤ 12, x_j ∈ {1,2}. Boundary condition: f_{n+1}(s_{n+1}) = 0 if s_{n+1} ≤ b, −∞ otherwise. [Figure: the state transition graph, labeled with f_k(s_k) for each state s_k.] 2007 Slide 179

180 Recursive optimization: max 15x1 + 10x2 + 12x3 s.t. 4x1 + 2x2 + 3x3 ≤ 12, x_j ∈ {1,2}. The maximum revenue is 49, and the optimal path is easy to retrace: (x1, x2, x3) = (1,1,2). 2007 Slide 180
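The value-to-go recursion, with the −∞ boundary condition, in the same memoized style (a sketch):

```python
from functools import lru_cache

# max 15x1 + 10x2 + 12x3  s.t.  4x1 + 2x2 + 3x3 <= 12, x_j in {1,2}
a, c, b = [4, 2, 3], [15, 10, 12], 12
NEG_INF = float("-inf")

@lru_cache(maxsize=None)
def value_to_go(k, s):
    # boundary: 0 if the completed sum fits, -infinity otherwise
    if k == len(a):
        return 0 if s <= b else NEG_INF
    return max(c[k] * x + value_to_go(k + 1, s + a[k] * x) for x in (1, 2))

def best_path():
    # retrace the optimal decisions forward through the states
    x, s = [], 0
    for k in range(len(a)):
        x_k = max((1, 2), key=lambda v: c[k] * v + value_to_go(k + 1, s + a[k] * v))
        x.append(x_k)
        s += a[k] * x_k
    return tuple(x)
```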

181 8. Branch and Price 8.1 Basic Idea 8.2 Example: Airline Crew Scheduling 2007 Slide 181

182 Motivation Branch and price allows solution of integer programming problems with a huge number of variables. The problem is solved by a branch-and-relax method. The difference lies in how the LP relaxation is solved. Variables are added to the LP relaxation only as needed. Variables are priced to find which ones should be added. CP is useful for solving the pricing problem, particularly when constraints are complex. CP-based branch and price has been successfully applied to airline crew scheduling, transit scheduling, and other transportation-related problems Slide 182

183 8.1 Basic Idea: Suppose the LP relaxation of an integer programming problem has a huge number of variables: min { cx : Ax = b, x ≥ 0 }. We solve a restricted master problem, which has a small subset of the variables:

min Σ_{j∈J} c_j x_j
Σ_{j∈J} A_j x_j = b   (dual solution λ)
x_j ≥ 0, j ∈ J

where A_j is column j of A. Adding x_k to the problem would improve the solution if x_k has negative reduced cost: r_k = c_k − λA_k < 0. 2007 Slide 183

184 Basic Idea: Adding x_k to the problem would improve the solution if x_k has negative reduced cost r_k = c_k − λA_k < 0. Computing the reduced cost of x_k is known as pricing x_k. So we solve the pricing problem: min { c_y − λy : y is a column of A }, where c_y is the cost of column y. If the solution y* satisfies c_{y*} − λy* < 0, then we can add column y* to the restricted master problem. 2007 Slide 184

185 Basic Idea: The pricing problem need not be solved to optimality, so long as we find a column with negative reduced cost. However, when we can no longer find an improving column, we solve the pricing problem to optimality to make sure we have the optimal solution of the LP. If we can state constraints that the columns of A must satisfy, CP may be a good way to solve the pricing problem. 2007 Slide 185

186 8.2 Example: Airline Crew Scheduling. We want to assign crew members to flights to minimize cost while covering the flights and observing complex work rules. [Figure: flight data with start and finish times for flights 1-6.] A roster is the sequence of flights assigned to a single crew member. The gap between two consecutive flights in a roster must be from 2 to 3 hours; so, for example, flight 1 cannot immediately precede flight 6, and flight 4 cannot immediately precede flight 5. Total flight time for a roster must be between 6 and 10 hours. The possible rosters are: (1,3,5), (1,4,6), (2,3,5), (2,4,6). 2007 Slide 186

187 Airline Crew Scheduling: There are 2 crew members, and the possible rosters are (1,3,5), (1,4,6), (2,3,5), (2,4,6). The LP relaxation of the problem is: min Σ_{i,k} c_ik x_ik s.t. Σ_k x_ik = 1 for each crew member i (each crew member is assigned to exactly 1 roster), and Σ_i Σ_{k : roster k covers flight j} x_ik ≥ 1 for each flight j (each flight is assigned at least 1 crew member). Here c_ik is the cost of assigning crew member i to roster k, and x_ik = 1 if crew member i is assigned to roster k, 0 otherwise. Rosters 1 and 2 cover flight 1; rosters 3 and 4 cover flight 2; rosters 1 and 3 cover flights 3 and 5; rosters 2 and 4 cover flights 4 and 6. 2007 Slide 187

194 Airline Crew Scheduling: In a real problem, there can be millions of rosters. 2007 Slide 194

195 Airline Crew Scheduling: We start by solving the problem with a subset of the columns. [Figure: the restricted LP and its optimal dual solution, with duals u1, u2 for the roster-assignment rows and v1, …, v6 for the flight-covering rows.] 2007 Slide 195

197 Airline Crew Scheduling: The reduced cost of an excluded roster k for crew member i is

c_ik − u_i − Σ_{j ∈ roster k} v_j

We will formulate the pricing problem as a shortest path problem. 2007 Slide 197
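The reduced-cost computation is simple arithmetic once the duals are available. The actual dual values were in a figure that did not survive transcription, so the numbers below are hypothetical placeholders:

```python
# Reduced cost of roster k for crew member i:
#   r_ik = c_ik - u_i - sum of v_j over flights j in the roster.
# The dual values u, v below are hypothetical (the slide's figure is lost).
rosters = {1: (1, 3, 5), 2: (1, 4, 6), 3: (2, 3, 5), 4: (2, 4, 6)}
u = {1: 2.0, 2: 1.0}                                     # crew-member duals
v = {1: 1.0, 2: 0.5, 3: 1.0, 4: 0.0, 5: 0.5, 6: 1.0}     # flight duals

def reduced_cost(c_ik, i, k):
    return c_ik - u[i] - sum(v[j] for j in rosters[k])

# A roster prices out (should enter the restricted master) if r_ik < 0
r = reduced_cost(4.0, 1, 1)   # 4 - 2 - (1 + 1 + 0.5) = -0.5
```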

198 Pricing problem: [Figure: layered networks for crew member 1 and crew member 2, with nodes for the flights and arcs for the allowed successions.] 2007 Slide 198

199 Pricing problem: Each s-t path corresponds to a roster, provided the flight time is within bounds. [Figure: the networks for crew members 1 and 2.] 2007 Slide 199

200 Pricing problem: [Figure callout: the length of the arc from flight 1 to flight 3 is the cost of flight 3 if it immediately follows flight 1, offset by the dual multiplier for flight 1.] 2007 Slide 200

201 Pricing problem: [Figure callout: the length of the arc from home to flight 1 is the cost of transferring from home to flight 1, offset by the dual multiplier for crew member 1; the dual multiplier is omitted for one crew member to break symmetry.] 2007 Slide 201

202 Pricing problem: The length of a path is the reduced cost of the corresponding roster. [Figure: the networks for crew members 1 and 2.] 2007 Slide 202

203 Pricing problem: [Figure: arc lengths computed using the dual solution of the LP relaxation, for crew members 1 and 2.] 2007 Slide 203

204 Pricing problem: Solution of the shortest path problems. For crew member 1, the shortest path has reduced cost −1, so x12 is added to the problem. For crew member 2, the shortest path has reduced cost −2, so x23 is added. After x12 and x23 are added to the problem, no remaining variable has negative reduced cost. 2007 Slide 204

205 Pricing problem: The shortest path problem cannot be solved by traditional shortest path algorithms, due to the bounds on total path length. It can be solved by CP:

min z_i
Path(X_i, z_i, G), all i   (path global constraint)
T_min ≤ Σ_{j ∈ X_i} (f_j − s_j) ≤ T_max, X_i ⊆ flights, z_i < 0, all i   (set-sum global constraint)

where X_i is the set of flights assigned to crew member i, z_i the path length, G the graph, and f_j − s_j the duration of flight j. 2007 Slide 205

206 2007 Slide 206

207 9. Benders Decomposition 9.1 Benders Decomposition in the Abstract 9.2 Classical Benders Decomposition 9.3 Example: Machine Scheduling 2007 Slide 207


More information

Projection in Logic, CP, and Optimization

Projection in Logic, CP, and Optimization Projection in Logic, CP, and Optimization John Hooker Carnegie Mellon University Workshop on Logic and Search Melbourne, 2017 Projection as a Unifying Concept Projection is a fundamental concept in logic,

More information

Duality in Optimization and Constraint Satisfaction

Duality in Optimization and Constraint Satisfaction Duality in Optimization and Constraint Satisfaction J. N. Hooker Carnegie Mellon University, Pittsburgh, USA john@hooker.tepper.cmu.edu Abstract. We show that various duals that occur in optimization and

More information

Integer Programming ISE 418. Lecture 8. Dr. Ted Ralphs

Integer Programming ISE 418. Lecture 8. Dr. Ted Ralphs Integer Programming ISE 418 Lecture 8 Dr. Ted Ralphs ISE 418 Lecture 8 1 Reading for This Lecture Wolsey Chapter 2 Nemhauser and Wolsey Sections II.3.1, II.3.6, II.4.1, II.4.2, II.5.4 Duality for Mixed-Integer

More information

Section Notes 9. IP: Cutting Planes. Applied Math 121. Week of April 12, 2010

Section Notes 9. IP: Cutting Planes. Applied Math 121. Week of April 12, 2010 Section Notes 9 IP: Cutting Planes Applied Math 121 Week of April 12, 2010 Goals for the week understand what a strong formulations is. be familiar with the cutting planes algorithm and the types of cuts

More information

Logic-Based Benders Decomposition

Logic-Based Benders Decomposition Logic-Based Benders Decomposition J. N. Hooker Graduate School of Industrial Administration Carnegie Mellon University, Pittsburgh, PA 15213 USA G. Ottosson Department of Information Technology Computing

More information

Extended Formulations, Lagrangian Relaxation, & Column Generation: tackling large scale applications

Extended Formulations, Lagrangian Relaxation, & Column Generation: tackling large scale applications Extended Formulations, Lagrangian Relaxation, & Column Generation: tackling large scale applications François Vanderbeck University of Bordeaux INRIA Bordeaux-Sud-Ouest part : Defining Extended Formulations

More information

Network Flows. 6. Lagrangian Relaxation. Programming. Fall 2010 Instructor: Dr. Masoud Yaghini

Network Flows. 6. Lagrangian Relaxation. Programming. Fall 2010 Instructor: Dr. Masoud Yaghini In the name of God Network Flows 6. Lagrangian Relaxation 6.3 Lagrangian Relaxation and Integer Programming Fall 2010 Instructor: Dr. Masoud Yaghini Integer Programming Outline Branch-and-Bound Technique

More information

Section Notes 9. Midterm 2 Review. Applied Math / Engineering Sciences 121. Week of December 3, 2018

Section Notes 9. Midterm 2 Review. Applied Math / Engineering Sciences 121. Week of December 3, 2018 Section Notes 9 Midterm 2 Review Applied Math / Engineering Sciences 121 Week of December 3, 2018 The following list of topics is an overview of the material that was covered in the lectures and sections

More information

Lagrangian Relaxation in MIP

Lagrangian Relaxation in MIP Lagrangian Relaxation in MIP Bernard Gendron May 28, 2016 Master Class on Decomposition, CPAIOR2016, Banff, Canada CIRRELT and Département d informatique et de recherche opérationnelle, Université de Montréal,

More information

3.10 Lagrangian relaxation

3.10 Lagrangian relaxation 3.10 Lagrangian relaxation Consider a generic ILP problem min {c t x : Ax b, Dx d, x Z n } with integer coefficients. Suppose Dx d are the complicating constraints. Often the linear relaxation and the

More information

Introduction to Mathematical Programming IE406. Lecture 21. Dr. Ted Ralphs

Introduction to Mathematical Programming IE406. Lecture 21. Dr. Ted Ralphs Introduction to Mathematical Programming IE406 Lecture 21 Dr. Ted Ralphs IE406 Lecture 21 1 Reading for This Lecture Bertsimas Sections 10.2, 10.3, 11.1, 11.2 IE406 Lecture 21 2 Branch and Bound Branch

More information

Logic, Optimization and Data Analytics

Logic, Optimization and Data Analytics Logic, Optimization and Data Analytics John Hooker Carnegie Mellon University United Technologies Research Center, Cork, Ireland August 2015 Thesis Logic and optimization have an underlying unity. Ideas

More information

An Integrated Solver for Optimization Problems

An Integrated Solver for Optimization Problems WORKING PAPER WPS-MAS-08-01 Department of Management Science School of Business Administration, University of Miami Coral Gables, FL 33124-8237 Created: December 2005 Last update: July 2008 An Integrated

More information

Computational Integer Programming. Lecture 2: Modeling and Formulation. Dr. Ted Ralphs

Computational Integer Programming. Lecture 2: Modeling and Formulation. Dr. Ted Ralphs Computational Integer Programming Lecture 2: Modeling and Formulation Dr. Ted Ralphs Computational MILP Lecture 2 1 Reading for This Lecture N&W Sections I.1.1-I.1.6 Wolsey Chapter 1 CCZ Chapter 2 Computational

More information

Scheduling Home Hospice Care with Logic-Based Benders Decomposition

Scheduling Home Hospice Care with Logic-Based Benders Decomposition Scheduling Home Hospice Care with Logic-Based Benders Decomposition John Hooker Carnegie Mellon University Joint work with Aliza Heching Ryo Kimura Compassionate Care Hospice CMU Lehigh University October

More information

Recent Developments in Integrated Methods for Optimization

Recent Developments in Integrated Methods for Optimization Recent Developments in Integrated Methods for Optimization John Hooker Carnegie Mellon University August 2012 Premise Constraint programming and mathematical programming can work together. There is underlying

More information

Resource Constrained Project Scheduling Linear and Integer Programming (1)

Resource Constrained Project Scheduling Linear and Integer Programming (1) DM204, 2010 SCHEDULING, TIMETABLING AND ROUTING Lecture 3 Resource Constrained Project Linear and Integer Programming (1) Marco Chiarandini Department of Mathematics & Computer Science University of Southern

More information

MVE165/MMG631 Linear and integer optimization with applications Lecture 8 Discrete optimization: theory and algorithms

MVE165/MMG631 Linear and integer optimization with applications Lecture 8 Discrete optimization: theory and algorithms MVE165/MMG631 Linear and integer optimization with applications Lecture 8 Discrete optimization: theory and algorithms Ann-Brith Strömberg 2017 04 07 Lecture 8 Linear and integer optimization with applications

More information

Advanced Linear Programming: The Exercises

Advanced Linear Programming: The Exercises Advanced Linear Programming: The Exercises The answers are sometimes not written out completely. 1.5 a) min c T x + d T y Ax + By b y = x (1) First reformulation, using z smallest number satisfying x z

More information

The Separation Problem for Binary Decision Diagrams

The Separation Problem for Binary Decision Diagrams The Separation Problem for Binary Decision Diagrams J. N. Hooker Joint work with André Ciré Carnegie Mellon University ISAIM 2014 Separation Problem in Optimization Given a relaxation of an optimization

More information

3.7 Cutting plane methods

3.7 Cutting plane methods 3.7 Cutting plane methods Generic ILP problem min{ c t x : x X = {x Z n + : Ax b} } with m n matrix A and n 1 vector b of rationals. According to Meyer s theorem: There exists an ideal formulation: conv(x

More information

Linear Programming Inverse Projection Theory Chapter 3

Linear Programming Inverse Projection Theory Chapter 3 1 Linear Programming Inverse Projection Theory Chapter 3 University of Chicago Booth School of Business Kipp Martin September 26, 2017 2 Where We Are Headed We want to solve problems with special structure!

More information

Planning and Scheduling by Logic-Based Benders Decomposition

Planning and Scheduling by Logic-Based Benders Decomposition Planning and Scheduling by Logic-Based Benders Decomposition J. N. Hooker Carnegie Mellon University john@hooker.tepper.cmu.edu July 2004, Revised October 2005 Abstract We combine mixed integer linear

More information

Decomposition-based Methods for Large-scale Discrete Optimization p.1

Decomposition-based Methods for Large-scale Discrete Optimization p.1 Decomposition-based Methods for Large-scale Discrete Optimization Matthew V Galati Ted K Ralphs Department of Industrial and Systems Engineering Lehigh University, Bethlehem, PA, USA Départment de Mathématiques

More information

An Integrated Method for Planning and Scheduling to Minimize Tardiness

An Integrated Method for Planning and Scheduling to Minimize Tardiness An Integrated Method for Planning and Scheduling to Minimize Tardiness J N Hooker Carnegie Mellon University john@hookerteppercmuedu Abstract We combine mixed integer linear programming (MILP) and constraint

More information

Discrete (and Continuous) Optimization WI4 131

Discrete (and Continuous) Optimization WI4 131 Discrete (and Continuous) Optimization WI4 131 Kees Roos Technische Universiteit Delft Faculteit Electrotechniek, Wiskunde en Informatica Afdeling Informatie, Systemen en Algoritmiek e-mail: C.Roos@ewi.tudelft.nl

More information

Integer program reformulation for robust branch-and-cut-and-price

Integer program reformulation for robust branch-and-cut-and-price Integer program reformulation for robust branch-and-cut-and-price Marcus Poggi de Aragão Informática PUC-Rio Eduardo Uchoa Engenharia de Produção Universidade Federal Fluminense Outline of the talk Robust

More information

Integer programming: an introduction. Alessandro Astolfi

Integer programming: an introduction. Alessandro Astolfi Integer programming: an introduction Alessandro Astolfi Outline Introduction Examples Methods for solving ILP Optimization on graphs LP problems with integer solutions Summary Introduction Integer programming

More information

Optimization Methods in Logic

Optimization Methods in Logic Optimization Methods in Logic John Hooker Carnegie Mellon University February 2003, Revised December 2008 1 Numerical Semantics for Logic Optimization can make at least two contributions to boolean logic.

More information

Integer Programming ISE 418. Lecture 13. Dr. Ted Ralphs

Integer Programming ISE 418. Lecture 13. Dr. Ted Ralphs Integer Programming ISE 418 Lecture 13 Dr. Ted Ralphs ISE 418 Lecture 13 1 Reading for This Lecture Nemhauser and Wolsey Sections II.1.1-II.1.3, II.1.6 Wolsey Chapter 8 CCZ Chapters 5 and 6 Valid Inequalities

More information

Decision Diagrams: Tutorial

Decision Diagrams: Tutorial Decision Diagrams: Tutorial John Hooker Carnegie Mellon University CP Summer School Cork, Ireland, June 2016 Decision Diagrams Used in computer science and AI for decades Logic circuit design Product configuration

More information

Lecture 7: Lagrangian Relaxation and Duality Theory

Lecture 7: Lagrangian Relaxation and Duality Theory Lecture 7: Lagrangian Relaxation and Duality Theory (3 units) Outline Lagrangian dual for linear IP Lagrangian dual for general IP Dual Search Lagrangian decomposition 1 / 23 Joseph Louis Lagrange Joseph

More information

Projection, Inference, and Consistency

Projection, Inference, and Consistency Projection, Inference, and Consistency John Hooker Carnegie Mellon University IJCAI 2016, New York City A high-level presentation. Don t worry about the details. 2 Projection as a Unifying Concept Projection

More information

IP Cut Homework from J and B Chapter 9: 14, 15, 16, 23, 24, You wish to solve the IP below with a cutting plane technique.

IP Cut Homework from J and B Chapter 9: 14, 15, 16, 23, 24, You wish to solve the IP below with a cutting plane technique. IP Cut Homework from J and B Chapter 9: 14, 15, 16, 23, 24, 31 14. You wish to solve the IP below with a cutting plane technique. Maximize 4x 1 + 2x 2 + x 3 subject to 14x 1 + 10x 2 + 11x 3 32 10x 1 +

More information

LP Duality: outline. Duality theory for Linear Programming. alternatives. optimization I Idea: polyhedra

LP Duality: outline. Duality theory for Linear Programming. alternatives. optimization I Idea: polyhedra LP Duality: outline I Motivation and definition of a dual LP I Weak duality I Separating hyperplane theorem and theorems of the alternatives I Strong duality and complementary slackness I Using duality

More information

Introduction to optimization and operations research

Introduction to optimization and operations research Introduction to optimization and operations research David Pisinger, Fall 2002 1 Smoked ham (Chvatal 1.6, adapted from Greene et al. (1957)) A meat packing plant produces 480 hams, 400 pork bellies, and

More information

Section Notes 8. Integer Programming II. Applied Math 121. Week of April 5, expand your knowledge of big M s and logical constraints.

Section Notes 8. Integer Programming II. Applied Math 121. Week of April 5, expand your knowledge of big M s and logical constraints. Section Notes 8 Integer Programming II Applied Math 121 Week of April 5, 2010 Goals for the week understand IP relaxations be able to determine the relative strength of formulations understand the branch

More information

Solving Mixed-Integer Nonlinear Programs

Solving Mixed-Integer Nonlinear Programs Solving Mixed-Integer Nonlinear Programs (with SCIP) Ambros M. Gleixner Zuse Institute Berlin MATHEON Berlin Mathematical School 5th Porto Meeting on Mathematics for Industry, April 10 11, 2014, Porto

More information

Outline. Outline. Outline DMP204 SCHEDULING, TIMETABLING AND ROUTING. 1. Scheduling CPM/PERT Resource Constrained Project Scheduling Model

Outline. Outline. Outline DMP204 SCHEDULING, TIMETABLING AND ROUTING. 1. Scheduling CPM/PERT Resource Constrained Project Scheduling Model Outline DMP204 SCHEDULING, TIMETABLING AND ROUTING Lecture 3 and Mixed Integer Programg Marco Chiarandini 1. Resource Constrained Project Model 2. Mathematical Programg 2 Outline Outline 1. Resource Constrained

More information

Stochastic Decision Diagrams

Stochastic Decision Diagrams Stochastic Decision Diagrams John Hooker CORS/INFORMS Montréal June 2015 Objective Relaxed decision diagrams provide an generalpurpose method for discrete optimization. When the problem has a dynamic programming

More information

Operations Research Lecture 6: Integer Programming

Operations Research Lecture 6: Integer Programming Operations Research Lecture 6: Integer Programming Notes taken by Kaiquan Xu@Business School, Nanjing University May 12th 2016 1 Integer programming (IP) formulations The integer programming (IP) is the

More information

Linear Programming. Larry Blume Cornell University, IHS Vienna and SFI. Summer 2016

Linear Programming. Larry Blume Cornell University, IHS Vienna and SFI. Summer 2016 Linear Programming Larry Blume Cornell University, IHS Vienna and SFI Summer 2016 These notes derive basic results in finite-dimensional linear programming using tools of convex analysis. Most sources

More information

On mathematical programming with indicator constraints

On mathematical programming with indicator constraints On mathematical programming with indicator constraints Andrea Lodi joint work with P. Bonami & A. Tramontani (IBM), S. Wiese (Unibo) University of Bologna, Italy École Polytechnique de Montréal, Québec,

More information

Decision Diagrams for Discrete Optimization

Decision Diagrams for Discrete Optimization Decision Diagrams for Discrete Optimization Willem Jan van Hoeve Tepper School of Business Carnegie Mellon University www.andrew.cmu.edu/user/vanhoeve/mdd/ Acknowledgments: David Bergman, Andre Cire, Samid

More information

Combinatorial Optimization

Combinatorial Optimization Combinatorial Optimization Lecture notes, WS 2010/11, TU Munich Prof. Dr. Raymond Hemmecke Version of February 9, 2011 Contents 1 The knapsack problem 1 1.1 Complete enumeration..................................

More information

Introduction to Mathematical Programming IE406. Lecture 10. Dr. Ted Ralphs

Introduction to Mathematical Programming IE406. Lecture 10. Dr. Ted Ralphs Introduction to Mathematical Programming IE406 Lecture 10 Dr. Ted Ralphs IE406 Lecture 10 1 Reading for This Lecture Bertsimas 4.1-4.3 IE406 Lecture 10 2 Duality Theory: Motivation Consider the following

More information

Integer Linear Programming Modeling

Integer Linear Programming Modeling DM554/DM545 Linear and Lecture 9 Integer Linear Programming Marco Chiarandini Department of Mathematics & Computer Science University of Southern Denmark Outline 1. 2. Assignment Problem Knapsack Problem

More information

Primal/Dual Decomposition Methods

Primal/Dual Decomposition Methods Primal/Dual Decomposition Methods Daniel P. Palomar Hong Kong University of Science and Technology (HKUST) ELEC5470 - Convex Optimization Fall 2018-19, HKUST, Hong Kong Outline of Lecture Subgradients

More information

Multivalued Decision Diagrams. Postoptimality Analysis Using. J. N. Hooker. Tarik Hadzic. Cork Constraint Computation Centre

Multivalued Decision Diagrams. Postoptimality Analysis Using. J. N. Hooker. Tarik Hadzic. Cork Constraint Computation Centre Postoptimality Analysis Using Multivalued Decision Diagrams Tarik Hadzic Cork Constraint Computation Centre J. N. Hooker Carnegie Mellon University London School of Economics June 2008 Postoptimality Analysis

More information

Polyhedral Approach to Integer Linear Programming. Tepper School of Business Carnegie Mellon University, Pittsburgh

Polyhedral Approach to Integer Linear Programming. Tepper School of Business Carnegie Mellon University, Pittsburgh Polyhedral Approach to Integer Linear Programming Gérard Cornuéjols Tepper School of Business Carnegie Mellon University, Pittsburgh 1 / 30 Brief history First Algorithms Polynomial Algorithms Solving

More information

Introduction to integer programming II

Introduction to integer programming II Introduction to integer programming II Martin Branda Charles University in Prague Faculty of Mathematics and Physics Department of Probability and Mathematical Statistics Computational Aspects of Optimization

More information

Benders decomposition [7, 17] uses a problem-solving strategy that can be generalized to a larger context. It assigns some of the variables trial valu

Benders decomposition [7, 17] uses a problem-solving strategy that can be generalized to a larger context. It assigns some of the variables trial valu Logic-Based Benders Decomposition Λ J. N. Hooker Graduate School of Industrial Administration Carnegie Mellon University, Pittsburgh, PA 15213 USA G. Ottosson Department of Information Technology Computing

More information

Solving Dual Problems

Solving Dual Problems Lecture 20 Solving Dual Problems We consider a constrained problem where, in addition to the constraint set X, there are also inequality and linear equality constraints. Specifically the minimization problem

More information

Decomposition Methods for Integer Programming

Decomposition Methods for Integer Programming Decomposition Methods for Integer Programming J.M. Valério de Carvalho vc@dps.uminho.pt Departamento de Produção e Sistemas Escola de Engenharia, Universidade do Minho Portugal PhD Course Programa Doutoral

More information

AM 121: Intro to Optimization! Models and Methods! Fall 2018!

AM 121: Intro to Optimization! Models and Methods! Fall 2018! AM 121: Intro to Optimization Models and Methods Fall 2018 Lecture 15: Cutting plane methods Yiling Chen SEAS Lesson Plan Cut generation and the separation problem Cutting plane methods Chvatal-Gomory

More information

Lecture 8: Column Generation

Lecture 8: Column Generation Lecture 8: Column Generation (3 units) Outline Cutting stock problem Classical IP formulation Set covering formulation Column generation A dual perspective Vehicle routing problem 1 / 33 Cutting stock

More information

An Integer Cutting-Plane Procedure for the Dantzig-Wolfe Decomposition: Theory

An Integer Cutting-Plane Procedure for the Dantzig-Wolfe Decomposition: Theory An Integer Cutting-Plane Procedure for the Dantzig-Wolfe Decomposition: Theory by Troels Martin Range Discussion Papers on Business and Economics No. 10/2006 FURTHER INFORMATION Department of Business

More information

Integer Programming Duality

Integer Programming Duality Integer Programming Duality M. Guzelsoy T. K. Ralphs July, 2010 1 Introduction This article describes what is known about duality for integer programs. It is perhaps surprising that many of the results

More information

4.6 Linear Programming duality

4.6 Linear Programming duality 4.6 Linear Programming duality To any minimization (maximization) LP we can associate a closely related maximization (minimization) LP Different spaces and objective functions but in general same optimal

More information

1 Column Generation and the Cutting Stock Problem

1 Column Generation and the Cutting Stock Problem 1 Column Generation and the Cutting Stock Problem In the linear programming approach to the traveling salesman problem we used the cutting plane approach. The cutting plane approach is appropriate when

More information

A Search-Infer-and-Relax Framework for Integrating Solution Methods

A Search-Infer-and-Relax Framework for Integrating Solution Methods Carnegie Mellon University Research Showcase @ CMU Tepper School of Business 2005 A Search-Infer-and-Relax Framework for Integrating Solution Methods John N. Hooker Carnegie Mellon University, john@hooker.tepper.cmu.edu

More information

Lagrangean relaxation

Lagrangean relaxation Lagrangean relaxation Giovanni Righini Corso di Complementi di Ricerca Operativa Joseph Louis de la Grange (Torino 1736 - Paris 1813) Relaxations Given a problem P, such as: minimize z P (x) s.t. x X P

More information

Linear Programming Redux

Linear Programming Redux Linear Programming Redux Jim Bremer May 12, 2008 The purpose of these notes is to review the basics of linear programming and the simplex method in a clear, concise, and comprehensive way. The book contains

More information

Reformulation and Decomposition of Integer Programs

Reformulation and Decomposition of Integer Programs Reformulation and Decomposition of Integer Programs François Vanderbeck 1 and Laurence A. Wolsey 2 (Reference: CORE DP 2009/16) (1) Université Bordeaux 1 & INRIA-Bordeaux (2) Université de Louvain, CORE.

More information

CE 191: Civil & Environmental Engineering Systems Analysis. LEC 17 : Final Review

CE 191: Civil & Environmental Engineering Systems Analysis. LEC 17 : Final Review CE 191: Civil & Environmental Engineering Systems Analysis LEC 17 : Final Review Professor Scott Moura Civil & Environmental Engineering University of California, Berkeley Fall 2014 Prof. Moura UC Berkeley

More information

BBM402-Lecture 20: LP Duality

BBM402-Lecture 20: LP Duality BBM402-Lecture 20: LP Duality Lecturer: Lale Özkahya Resources for the presentation: https://courses.engr.illinois.edu/cs473/fa2016/lectures.html An easy LP? which is compact form for max cx subject to

More information

Stochastic Programming: From statistical data to optimal decisions

Stochastic Programming: From statistical data to optimal decisions Stochastic Programming: From statistical data to optimal decisions W. Römisch Humboldt-University Berlin Department of Mathematics (K. Emich, H. Heitsch, A. Möller) Page 1 of 24 6th International Conference

More information

Math 5593 Linear Programming Week 1

Math 5593 Linear Programming Week 1 University of Colorado Denver, Fall 2013, Prof. Engau 1 Problem-Solving in Operations Research 2 Brief History of Linear Programming 3 Review of Basic Linear Algebra Linear Programming - The Story About

More information

Lecture 9: Dantzig-Wolfe Decomposition

Lecture 9: Dantzig-Wolfe Decomposition Lecture 9: Dantzig-Wolfe Decomposition (3 units) Outline Dantzig-Wolfe decomposition Column generation algorithm Relation to Lagrangian dual Branch-and-price method Generated assignment problem and multi-commodity

More information

Chapter 3: Discrete Optimization Integer Programming

Chapter 3: Discrete Optimization Integer Programming Chapter 3: Discrete Optimization Integer Programming Edoardo Amaldi DEIB Politecnico di Milano edoardo.amaldi@polimi.it Sito web: http://home.deib.polimi.it/amaldi/ott-13-14.shtml A.A. 2013-14 Edoardo

More information

CSCI 1951-G Optimization Methods in Finance Part 01: Linear Programming

CSCI 1951-G Optimization Methods in Finance Part 01: Linear Programming CSCI 1951-G Optimization Methods in Finance Part 01: Linear Programming January 26, 2018 1 / 38 Liability/asset cash-flow matching problem Recall the formulation of the problem: max w c 1 + p 1 e 1 = 150

More information

MVE165/MMG630, Applied Optimization Lecture 6 Integer linear programming: models and applications; complexity. Ann-Brith Strömberg

MVE165/MMG630, Applied Optimization Lecture 6 Integer linear programming: models and applications; complexity. Ann-Brith Strömberg MVE165/MMG630, Integer linear programming: models and applications; complexity Ann-Brith Strömberg 2011 04 01 Modelling with integer variables (Ch. 13.1) Variables Linear programming (LP) uses continuous

More information

Integer Programming ISE 418. Lecture 12. Dr. Ted Ralphs

Integer Programming ISE 418. Lecture 12. Dr. Ted Ralphs Integer Programming ISE 418 Lecture 12 Dr. Ted Ralphs ISE 418 Lecture 12 1 Reading for This Lecture Nemhauser and Wolsey Sections II.2.1 Wolsey Chapter 9 ISE 418 Lecture 12 2 Generating Stronger Valid

More information

The Dual Simplex Algorithm

The Dual Simplex Algorithm p. 1 The Dual Simplex Algorithm Primal optimal (dual feasible) and primal feasible (dual optimal) bases The dual simplex tableau, dual optimality and the dual pivot rules Classical applications of linear

More information

2001 Dennis L. Bricker Dept. of Industrial Engineering The University of Iowa. Reducing dimensionality of DP page 1

2001 Dennis L. Bricker Dept. of Industrial Engineering The University of Iowa. Reducing dimensionality of DP page 1 2001 Dennis L. Bricker Dept. of Industrial Engineering The University of Iowa Reducing dimensionality of DP page 1 Consider a knapsack with a weight capacity of 15 and a volume capacity of 12. Item # Value

More information

Chapter 3: Discrete Optimization Integer Programming

Chapter 3: Discrete Optimization Integer Programming Chapter 3: Discrete Optimization Integer Programming Edoardo Amaldi DEIB Politecnico di Milano edoardo.amaldi@polimi.it Website: http://home.deib.polimi.it/amaldi/opt-16-17.shtml Academic year 2016-17

More information

CO 250 Final Exam Guide

CO 250 Final Exam Guide Spring 2017 CO 250 Final Exam Guide TABLE OF CONTENTS richardwu.ca CO 250 Final Exam Guide Introduction to Optimization Kanstantsin Pashkovich Spring 2017 University of Waterloo Last Revision: March 4,

More information

Introduction. Very efficient solution procedure: simplex method.

Introduction. Very efficient solution procedure: simplex method. LINEAR PROGRAMMING Introduction Development of linear programming was among the most important scientific advances of mid 20th cent. Most common type of applications: allocate limited resources to competing

More information

Duality Theory of Constrained Optimization

Duality Theory of Constrained Optimization Duality Theory of Constrained Optimization Robert M. Freund April, 2014 c 2014 Massachusetts Institute of Technology. All rights reserved. 1 2 1 The Practical Importance of Duality Duality is pervasive

More information

Gestion de la production. Book: Linear Programming, Vasek Chvatal, McGill University, W.H. Freeman and Company, New York, USA

Gestion de la production. Book: Linear Programming, Vasek Chvatal, McGill University, W.H. Freeman and Company, New York, USA Gestion de la production Book: Linear Programming, Vasek Chvatal, McGill University, W.H. Freeman and Company, New York, USA 1 Contents 1 Integer Linear Programming 3 1.1 Definitions and notations......................................

More information

Relation of Pure Minimum Cost Flow Model to Linear Programming

Relation of Pure Minimum Cost Flow Model to Linear Programming Appendix A Page 1 Relation of Pure Minimum Cost Flow Model to Linear Programming The Network Model The network pure minimum cost flow model has m nodes. The external flows given by the vector b with m

More information

Mixed Integer Programming Solvers: from Where to Where. Andrea Lodi University of Bologna, Italy

Mixed Integer Programming Solvers: from Where to Where. Andrea Lodi University of Bologna, Italy Mixed Integer Programming Solvers: from Where to Where Andrea Lodi University of Bologna, Italy andrea.lodi@unibo.it November 30, 2011 @ Explanatory Workshop on Locational Analysis, Sevilla A. Lodi, MIP

More information

Logic-based Benders Decomposition

Logic-based Benders Decomposition Logic-based Benders Decomposition A short Introduction Martin Riedler AC Retreat Contents 1 Introduction 2 Motivation 3 Further Notes MR Logic-based Benders Decomposition June 29 July 1 2 / 15 Basic idea

More information

Lectures 6, 7 and part of 8

Lectures 6, 7 and part of 8 Lectures 6, 7 and part of 8 Uriel Feige April 26, May 3, May 10, 2015 1 Linear programming duality 1.1 The diet problem revisited Recall the diet problem from Lecture 1. There are n foods, m nutrients,

More information

MINLP: Theory, Algorithms, Applications: Lecture 3, Basics of Algorothms

MINLP: Theory, Algorithms, Applications: Lecture 3, Basics of Algorothms MINLP: Theory, Algorithms, Applications: Lecture 3, Basics of Algorothms Jeff Linderoth Industrial and Systems Engineering University of Wisconsin-Madison Jonas Schweiger Friedrich-Alexander-Universität

More information

Chap6 Duality Theory and Sensitivity Analysis

Chap6 Duality Theory and Sensitivity Analysis Chap6 Duality Theory and Sensitivity Analysis The rationale of duality theory Max 4x 1 + x 2 + 5x 3 + 3x 4 S.T. x 1 x 2 x 3 + 3x 4 1 5x 1 + x 2 + 3x 3 + 8x 4 55 x 1 + 2x 2 + 3x 3 5x 4 3 x 1 ~x 4 0 If we

More information