Discrete (and Continuous) Optimization WI4 131


1  Discrete (and Continuous) Optimization WI4 131

Kees Roos
Technische Universiteit Delft
Faculteit Electrotechniek, Wiskunde en Informatica
Afdeling Informatie, Systemen en Algoritmiek
URL: roos

November - December, A.D. 2004

2  Course Schedule

1. Formulations (18 pages)
2. Optimality, Relaxation, and Bounds (10 pages)
3. Well-solved Problems (13 pages)
4. Matching and Assignments (10 pages)
5. Dynamic Programming (11 pages)
6. Complexity and Problem Reduction (8 pages)
7. Branch and Bound (17 pages)
8. Cutting Plane Algorithms (21 pages)
9. Strong Valid Inequalities (22 pages)
10. Lagrangian Duality (14 pages)
11. Column Generation Algorithms (16 pages)
12. Heuristic Algorithms (15 pages)
13. From Theory to Solutions (20 pages)

3  Chapter 2: Optimality, Relaxation, and Bounds

4  Optimality and Relaxation

(IP)   $z = \max \{c(x) : x \in X \subseteq \mathbb{Z}^n\}$

Basic idea underlying methods for solving (IP): find a lower bound $\underline{z} \le z$ and an upper bound $\bar{z} \ge z$ such that $\underline{z} = \bar{z} = z$. Practically, this means that any algorithm will find a decreasing sequence

    $\bar{z}_1 > \bar{z}_2 > \dots > \bar{z}_s \ge z$

of upper bounds, and an increasing sequence

    $\underline{z}_1 < \underline{z}_2 < \dots < \underline{z}_t \le z$

of lower bounds. The stopping criterion in general takes the form $\bar{z}_s - \underline{z}_t \le \epsilon$, where $\epsilon$ is some suitably chosen small nonnegative number.
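
This bounding scheme can be written as a generic loop. The sketch below (not from the slides) only illustrates the stopping criterion; it assumes two user-supplied routines that produce ever better primal and dual bounds.

```python
def solve_with_bounds(lower_bound_step, upper_bound_step, eps=1e-6):
    """Generic bounding loop illustrating the stopping criterion on this slide.

    lower_bound_step() returns a (possibly improved) lower bound, e.g. from a
    heuristic; upper_bound_step() returns an upper bound, e.g. from a relaxation.
    Termination assumes the two sequences of bounds approach each other.
    """
    z_lo, z_hi = float("-inf"), float("inf")
    while z_hi - z_lo > eps:
        z_lo = max(z_lo, lower_bound_step())
        z_hi = min(z_hi, upper_bound_step())
    return z_lo, z_hi
```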

5  How to obtain Bounds?

Every feasible solution $x \in X$ provides a lower bound $\underline{z} = c(x) \le z$. This is essentially the only way to obtain lower bounds. For some IPs it is easy to find a feasible solution (e.g. Assignment, TSP, Knapsack), but for other IPs finding a feasible solution may be very difficult.

The most important approach for finding upper bounds is relaxation: the given IP is replaced by a simpler problem whose optimal value is at least as large as $z$. There are two obvious ways to get a relaxation:
(i) Enlarge the feasible set.
(ii) Replace the objective function by a function that has the same or a larger value everywhere.

Definition 1  The problem
    (RP)   $z^R = \max \{f(x) : x \in T \subseteq \mathbb{Z}^n\}$
is a relaxation of (IP) if $X \subseteq T$ and $f(x) \ge c(x)$ for all $x \in X$.

Proposition 1  If (RP) is a relaxation of (IP), then $z^R \ge z$.

Proof: If $x^*$ is optimal for (IP), then $x^* \in X \subseteq T$ and $z = c(x^*) \le f(x^*)$. Since $x^* \in T$, $f(x^*)$ is a lower bound for $z^R$, so $z \le f(x^*) \le z^R$.

6  Linear Relaxations

Definition 2  For the IP $\max \{c^T x : x \in X = P \cap \mathbb{Z}^n\}$ with formulation $P$, a linear relaxation is the LO problem
    $z^{LP} = \max \{c^T x : x \in P\}$.

Recall that $P = \{x \in \mathbb{R}^n_+ : Ax \le b\}$. Since $P \cap \mathbb{Z}^n \subseteq P$ and the objective is unchanged, this is clearly a relaxation. Not surprisingly, better formulations give tighter (upper) bounds.

Proposition 2  If $P_1$ and $P_2$ are two formulations for the feasible set $X$ of an IP with $P_1 \subseteq P_2$, then the respective upper bounds $z_i^{LP}$ $(i = 1, 2)$ satisfy $z_1^{LP} \le z_2^{LP}$.

Sometimes the relaxation (RP) immediately solves the IP.

Proposition 3  If the relaxation (RP) is infeasible, then so is (IP). On the other hand, if (RP) has an optimal solution $x^*$ with $x^* \in X$ and $c(x^*) = f(x^*)$, then $x^*$ is an optimal solution of (IP).

Proof: If (RP) is infeasible, then $T = \emptyset$. Since $X \subseteq T$, also $X = \emptyset$. For the second part: since $x^* \in X$ we have $z \ge c(x^*) = f(x^*) = z^R$, and since $z \le z^R$ we get $c(x^*) = z$.

7  Example

    $z = \max \{4x - y : 7x - 2y \le 14,\ y \le 3,\ 2x - 2y \le 3,\ x, y \ge 0,\ x, y \text{ integer}\}$

[Figure: the feasible region in the (x, y)-plane, with the objective direction c.]

The figure makes clear that $x = 2$, $y = 1$ is the optimal solution, with objective value 7. The optimal solution of the LP relaxation is $x = 20/7$, $y = 3$, with objective value $59/7$. This is an upper bound. Rounding this bound down to an integer (valid because every integer feasible point has an integer objective value) gives 8 as the linear relaxation bound.
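
The LP relaxation bound can be checked numerically; a minimal sketch, assuming SciPy is available (scipy.optimize.linprog minimizes, so the objective is negated):

```python
from scipy.optimize import linprog

# LP relaxation of: max 4x - y  s.t.  7x - 2y <= 14,  y <= 3,  2x - 2y <= 3,  x, y >= 0
c = [-4, 1]                       # negate the objective: linprog minimizes
A = [[7, -2], [0, 1], [2, -2]]
b = [14, 3, 3]
res = linprog(c, A_ub=A, b_ub=b, bounds=[(0, None), (0, None)])
print(res.x, -res.fun)            # approx. [2.857  3.0]  and  8.4286 = 59/7
```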

8  Combinatorial Relaxations

Whenever the relaxation is a combinatorial optimization problem, we speak of a combinatorial relaxation. Some examples follow.

TSP: The TSP, on a digraph $D = (V, A)$, amounts to finding a (salesman, or Hamiltonian) tour $T$ of minimal length in terms of given arc weights $c_{ij}$, $(i, j) \in A$. We have seen that a tour is an assignment without subtours. So we have

    $z^{TSP} = \min \{\sum_{(i,j) \in T} c_{ij} : T \text{ is a tour}\} \ \ge\ \min \{\sum_{(i,j) \in T} c_{ij} : T \text{ is an assignment}\}$.

Symmetric TSP: The STSP, on a graph $G = (V, E)$, amounts to finding a tour $T$ of minimal length in terms of given edge weights $c_e$, $e \in E$.

Definition 3  A 1-tree is a subgraph consisting of two edges adjacent to node 1, plus the edges of a tree on the remaining nodes $\{2, \dots, n\}$.

Observe that a tour consists of two edges adjacent to node 1, plus the edges of a path through the remaining nodes. Since a path is a special case of a tree, we have

    $z^{STSP} = \min \{\sum_{e \in T} c_e : T \text{ is a tour}\} \ \ge\ \min \{\sum_{e \in T} c_e : T \text{ is a 1-tree}\}$.
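
The 1-tree relaxation is cheap to compute: take a minimum spanning tree on the nodes $\{2, \dots, n\}$ and add the two cheapest edges at node 1. A minimal sketch, assuming the instance is given as a weighted networkx Graph with edge attribute "weight" (networkx and that attribute name are assumptions, not part of the slides):

```python
import networkx as nx

def one_tree_bound(G, root=1):
    """Lower bound on the STSP: cost of a minimum 1-tree with respect to the root."""
    H = G.copy()
    H.remove_node(root)
    mst = nx.minimum_spanning_tree(H, weight="weight")      # tree on V \ {root}
    tree_cost = mst.size(weight="weight")
    incident = sorted(d["weight"] for _, _, d in G.edges(root, data=True))
    return tree_cost + incident[0] + incident[1]             # two cheapest root edges
```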

9  Combinatorial Relaxations (cont.)

Quadratic 0-1 Problem: This is the (in general hard!) problem of maximizing a quadratic function over the unit cube:

    $z = \max \{\sum_{1 \le i < j \le n} q_{ij} x_i x_j + \sum_{1 \le i \le n} p_i x_i : x \in \{0,1\}^n\}$.

Replacing all terms $q_{ij} x_i x_j$ with $q_{ij} < 0$ by 0, the objective function does not decrease. So we have the relaxation

    $z^R = \max \{\sum_{1 \le i < j \le n} \max\{q_{ij}, 0\}\, x_i x_j + \sum_{1 \le i \le n} p_i x_i : x \in \{0,1\}^n\}$.

This problem is also a quadratic 0-1 problem, but now the quadratic terms have nonnegative coefficients. Such a problem can be solved by solving a series of (easy!) maximum flow problems (see Chapter 9).

Knapsack Problem: The set underlying this problem is

    $X = \{x \in \mathbb{Z}^n_+ : \sum_{j=1}^n a_j x_j \le b\}$.

This set can be extended to

    $X^R = \{x \in \mathbb{Z}^n_+ : \sum_{j=1}^n \lfloor a_j \rfloor x_j \le \lfloor b \rfloor\}$,

where $\lfloor a \rfloor$ denotes the largest integer less than or equal to $a$. Indeed, for $x \in X$ we have $\sum_j \lfloor a_j \rfloor x_j \le \sum_j a_j x_j \le b$, and since the left-hand side is an integer it is at most $\lfloor b \rfloor$.

10  Lagrangian Relaxations

Consider the IP

    (IP)   $z = \max \{c^T x : Ax \le b,\ x \in X \subseteq \mathbb{Z}^n\}$.

If this problem is too difficult to solve directly, one possible way to proceed is to drop the constraint $Ax \le b$. This enlarges the feasible region and so yields a relaxation of (IP). An important extension of this idea is not just to drop complicating constraints, but to add them to the objective function with Lagrange multipliers.

Proposition 4  Suppose that (IP) has an optimal solution, and let $u \in \mathbb{R}^m$. Define

    $z(u) = \max \{c^T x + u^T (b - Ax) : x \in X\}$.

Then $z(u) \ge z$ for all $u \ge 0$.

Proof: Let $x^*$ be optimal for (IP). Then $c^T x^* = z$, $Ax^* \le b$ and $x^* \in X$. Since $u \ge 0$, it follows that $z(u) \ge c^T x^* + u^T (b - Ax^*) \ge c^T x^* = z$.

The main challenge is, of course, to find Lagrange multipliers that minimize $z(u)$. The best Lagrange multipliers are found by solving (if we can!)

    $\min_{u \ge 0} z(u) = \min_{u \ge 0} \max_{x \in X} \{c^T x + u^T (b - Ax)\}$.
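
For a tiny instance one can evaluate $z(u)$ by enumerating $X$ and shrink it with a crude subgradient step on $u$. The sketch below (assuming NumPy, with $X = \{0,1\}^n$) only illustrates the min-max structure; in practice $X$ is exploited through its special structure rather than enumerated.

```python
import itertools
import numpy as np

def lagrangian_bound(c, A, b, n_iters=50, step=1.0):
    """Approximate min_{u>=0} z(u) for max{c'x : Ax <= b, x in {0,1}^n} (a sketch)."""
    c, A, b = map(np.asarray, (c, A, b))
    X = np.array(list(itertools.product([0, 1], repeat=len(c))))   # all of {0,1}^n
    u = np.zeros(len(b))
    best = np.inf
    for _ in range(n_iters):
        vals = X @ c + (b - X @ A.T) @ u        # c'x + u'(b - Ax) for every x in X
        k = int(np.argmax(vals))                # maximizer defines z(u)
        best = min(best, vals[k])               # every z(u) is an upper bound on z
        u = np.maximum(0.0, u - step * (b - A @ X[k]))   # projected subgradient step
    return best
```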

11  LO Duality (recapitulation)

Consider the LO problem
    (P)   $z = \max \{c^T x : Ax \le b,\ x \ge 0\}$.
Its dual problem is
    (D)   $w = \min \{b^T y : A^T y \ge c,\ y \ge 0\}$.

Proposition 5 (Weak duality)  If $x$ is feasible for (P) and $y$ is feasible for (D), then $c^T x \le b^T y$.

Proof: $c^T x \le (A^T y)^T x = y^T A x \le y^T b = b^T y$.

Proposition 6 (Strong duality)  Let $x$ be feasible for (P) and $y$ be feasible for (D). Then $x$ and $y$ are optimal if and only if $c^T x = b^T y$.

Scheme for dualizing:

    Primal problem (P): min c^T x      Dual problem (D): max b^T y
    equality constraint                free variable
    inequality constraint (>=)         variable >= 0
    inequality constraint (<=)         variable <= 0
    free variable                      equality constraint
    variable >= 0                      inequality constraint (<=)
    variable <= 0                      inequality constraint (>=)
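
A small worked example (not from the slides) of applying the scheme row by row, written as a LaTeX display:

```latex
% Primal (P) in min-form and its dual (D): each primal constraint gives a dual
% variable, each primal variable gives a dual constraint.
\begin{array}{llcll}
\text{(P)} & \min\ 4x_1 + 6x_2 & \qquad & \text{(D)} & \max\ 5y_1 + 8y_2\\
           & x_1 + x_2 = 5     &        &            & y_1\ \text{free}\\
           & 2x_1 + x_2 \ge 8  &        &            & y_2 \ge 0\\
           & x_1 \ge 0         &        &            & y_1 + 2y_2 \le 4\\
           & x_2 \ge 0         &        &            & y_1 + y_2 \le 6
\end{array}
```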

12  Duality

In the case of Linear Optimization, every feasible solution of the dual problem gives a bound on the optimal value of the primal problem. It is natural to ask whether it is possible to find a dual problem for an IP.

Definition 4  The two problems
    (IP)   $z = \max \{c(x) : x \in X\}$;     (D)   $w = \min \{\omega(u) : u \in U\}$
form a (weak) dual pair if $c(x) \le \omega(u)$ for all $x \in X$ and all $u \in U$. When moreover $w = z$, they form a strong dual pair.

N.B. Any feasible solution of a dual problem yields an upper bound for (IP). In contrast, for a relaxation only its optimal solution yields an upper bound.

Proposition 7  The IP $z = \max \{c^T x : Ax \le b,\ x \in \mathbb{Z}^n_+\}$ and the IP $w^{IP} = \min \{u^T b : A^T u \ge c,\ u \in \mathbb{Z}^m_+\}$ form a dual pair.

Proposition 8  Suppose that (IP) and (D) are a dual pair. If (D) is unbounded, then (IP) is infeasible. On the other hand, if $x^* \in X$ and $u^* \in U$ satisfy $c(x^*) = \omega(u^*)$, then $x^*$ is an optimal solution of (IP) and $u^*$ is an optimal solution of (D).

13  A Dual for the Matching Problem

Given a graph $G = (V, E)$, a matching $M \subseteq E$ is a set of (vertex-)disjoint edges. A covering by nodes is a set $R \subseteq V$ of nodes such that every edge has at least one end point in $R$.

[Figure: a graph in which the red edges form a matching and the green nodes form a covering by nodes.]

Proposition 9  The problem of finding a maximum cardinality matching,
    $\max_{M \subseteq E} \{|M| : M \text{ is a matching}\}$,
and the problem of finding a minimum cardinality covering by nodes,
    $\min_{R \subseteq V} \{|R| : R \text{ is a covering by nodes}\}$,
form a weak-dual pair.

Proof: If $M$ is a matching, then the end nodes of its edges are distinct, so their number is $2k$, where $k = |M|$. Any covering by nodes $R$ must contain at least one of the end nodes of each edge in $M$. Hence $|R| \ge k$. Therefore, $|R| \ge |M|$.

Unfortunately, this duality is not strong! Since the matching in the figure is of maximum cardinality and the covering by nodes is of minimum cardinality, as is easily verified, while their sizes differ, the graph above proves this.
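
The gap can already be seen on the smallest example, a triangle: the maximum matching has one edge, while any covering by nodes needs two nodes. A minimal check, assuming networkx is available:

```python
import networkx as nx
from itertools import combinations

G = nx.cycle_graph(3)                                  # a triangle
matching = nx.max_weight_matching(G, maxcardinality=True)
print(len(matching))                                   # 1

# brute-force minimum covering by nodes on this tiny graph
nodes = list(G.nodes)
cover_size = min(len(S) for r in range(len(nodes) + 1)
                 for S in combinations(nodes, r)
                 if all(u in S or v in S for u, v in G.edges))
print(cover_size)                                      # 2, so the duality is not strong
```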

14  A Dual for the Matching Problem (cont.)

The former result can also be obtained from LO duality.

Definition 5  The node-edge matrix of a graph $G = (V, E)$ is an $n = |V|$ by $m = |E|$ matrix $A$ with $A_{i,e} = 1$ when node $i$ is incident with edge $e$, and $A_{i,e} = 0$ otherwise.

With the help of the node-edge matrix $A$ of $G$, the matching problem can be formulated as the IP
    $z = \max \{\mathbf{1}^T x : Ax \le \mathbf{1},\ x \in \mathbb{Z}^m_+\}$,
and the covering by nodes problem as
    $w = \min \{\mathbf{1}^T y : A^T y \ge \mathbf{1},\ y \in \mathbb{Z}^n_+\}$.

Using LO duality we may write

    $z = \max \{\mathbf{1}^T x : Ax \le \mathbf{1},\ x \in \mathbb{Z}^m_+\}
       \le \max \{\mathbf{1}^T x : Ax \le \mathbf{1},\ x \ge 0\}
       = \min \{\mathbf{1}^T y : A^T y \ge \mathbf{1},\ y \ge 0\}
       \le \min \{\mathbf{1}^T y : A^T y \ge \mathbf{1},\ y \in \mathbb{Z}^n_+\} = w$.

15  Primal Bounds: Greedy Search

The idea of a greedy heuristic is to construct a solution from scratch (the empty set), choosing at each step the item bringing the best immediate reward. We give some examples.

0-1 Knapsack problem:

    $z = \max\ 12x_1 + 8x_2 + 17x_3 + 11x_4 + 6x_5 + 2x_6 + 2x_7$
    s.t. $4x_1 + 3x_2 + 7x_3 + 5x_4 + 3x_5 + 2x_6 + 3x_7 \le 9$,  $x \in \{0,1\}^7$.

Greedy Solution: Order the variables so that their profit per unit of resource $c_i/a_i$ is nonincreasing. This is already done: variables with a low index are more attractive than variables with a higher index. So we proceed as shown in the table.

    var.   c_i/a_i   value   use of resource   resource remaining
    x_1    3.00      1       4                 5
    x_2    2.67      1       3                 2
    x_3    2.43      0       0                 2
    x_4    2.20      0       0                 2
    x_5    2.00      0       0                 2
    x_6    1.00      1       2                 0
    x_7    0.67      0       0                 0

The resulting solution is $x^G = (1,1,0,0,0,1,0)$ with objective value $z^G = 22$. All we know is that 22 is a lower bound for the optimal value. Observe that $x = (1,0,0,1,0,0,0)$ is feasible with the higher value 23.
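
A minimal sketch of this greedy heuristic in Python; the instance data below is the one from the slide.

```python
def greedy_knapsack(c, a, b):
    """Greedy heuristic for the 0-1 knapsack problem; returns a primal (lower) bound.

    Items are scanned in order of nonincreasing profit-to-weight ratio c[i]/a[i];
    an item is taken whenever it still fits in the remaining capacity.
    """
    order = sorted(range(len(c)), key=lambda i: c[i] / a[i], reverse=True)
    x, remaining, value = [0] * len(c), b, 0
    for i in order:
        if a[i] <= remaining:
            x[i], remaining, value = 1, remaining - a[i], value + c[i]
    return x, value

c = [12, 8, 17, 11, 6, 2, 2]
a = [4, 3, 7, 5, 3, 2, 3]
print(greedy_knapsack(c, a, 9))    # ([1, 1, 0, 0, 0, 1, 0], 22)
```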

16  Primal Bounds: Greedy Search (cont.)

Symmetric TSP: Consider an instance on 6 nodes with a given distance matrix.

[The distance matrix and the two tour figures ("Heuristic tour" and "Better tour") are not reproduced in this transcription.]

Greedy Solution: Order the edges according to nondecreasing cost, and seek to use them in this order. So we proceed as follows to construct the heuristic tour.

    step   edge    length
    1      (1,3)   2       accept
    2      (4,6)   3       accept
    3      (3,6)   6       accept
    4      (2,3)   7       conflict in node 3
    5      (1,4)   8       creates subtour
    6      (1,2)   9       accept
    7      (2,5)   10      accept
    8      (4,5)   24      forced to accept

The length of the tour created this way is $z^G = 54$. The better tour is shorter: it has length 49.
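
A minimal sketch of the greedy edge-selection heuristic just described; the rejection reasons mirror the table (degree conflict at a node, premature subtour). The input format (a dict from frozenset edges to lengths, nodes numbered 0..n-1, complete graph) is an assumption for illustration only.

```python
def greedy_tsp(n, dist):
    """Greedy tour construction for the symmetric TSP (a heuristic upper bound)."""
    degree = [0] * n
    parent = list(range(n))                  # union-find to detect subtours

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    tour, length = [], 0
    for e in sorted(dist, key=dist.get):     # edges in nondecreasing cost order
        i, j = tuple(e)
        if degree[i] >= 2 or degree[j] >= 2:
            continue                          # conflict: a node would get degree 3
        if find(i) == find(j) and len(tour) < n - 1:
            continue                          # would create a premature subtour
        tour.append((i, j))
        length += dist[e]
        degree[i] += 1
        degree[j] += 1
        parent[find(i)] = find(j)
    return tour, length
```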

17  Primal Bounds: Local Search

Local search methods assume that a feasible solution is known. It is called the incumbent. The idea of a local search heuristic is to define a neighborhood of solutions close to the incumbent. Then the best solution in this neighborhood is found. If it is better than the incumbent, it replaces it, and the procedure is repeated. Otherwise, the incumbent is locally optimal with respect to the neighborhood, and the heuristic terminates. Below we give two examples.

18  Primal Bounds: Local Search (cont.)

Uncapacitated Facility Location: Consider an instance with $m = 6$ clients and $n = 4$ depots, with client-depot costs $(c_{ij})$ (the matrix is not reproduced in this transcription) and fixed depot costs

    $(f_j) = (21, 16, 11, 24)$.

Let $N = \{1,2,3,4\}$ denote the set of depots, and $S$ the set of open depots. Let the incumbent be the solution with depots 1 and 2 open, so $S = \{1,2\}$. Each client is served by the open depot with cheapest cost for the client. So the cost of the incumbent is $24 + (21 + 16) = 61$, where 24 is the total assignment cost.

A possible neighborhood $Q(S)$ of $S$ is the set of all solutions obtained from $S$ by adding or removing a single depot:

    $Q(S) = \{T \subseteq N : T = S \cup \{j\} \text{ for } j \notin S \text{ or } T = S \setminus \{i\} \text{ for } i \in S\}$.

In the current example $Q(S) = \{\{1\}, \{2\}, \{1,2,3\}, \{1,2,4\}\}$. A simple computation shows that the costs of these 4 solutions are 63, 66, 60 and 84, respectively. So $S = \{1,2,3\}$ is the next incumbent.

The new neighborhood becomes $Q(S) = \{\{1,2\}, \{1,3\}, \{2,3\}, \{1,2,3,4\}\}$, with minimal cost 42 for $S = \{2,3\}$, which is the new incumbent. The next neighborhood is $Q(S) = \{\{2\}, \{3\}, \{1,2,3\}, \{2,3,4\}\}$, with minimal cost 31 for $S = \{3\}$, the new incumbent. The next neighborhood is $Q(S) = \{\{1,3\}, \{2,3\}, \{3,4\}, \emptyset\}$, with all costs $> 31$. So $S = \{3\}$ is a locally optimal solution.
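
Since the client-depot cost matrix is not reproduced above, the sketch below is generic: it implements the add/remove neighborhood $Q(S)$ for uncapacitated facility location, with the empty depot set excluded so that every client can be assigned.

```python
def ufl_cost(S, c, f):
    """Fixed costs of the open depots in S plus the cheapest assignment per client."""
    return sum(f[j] for j in S) + sum(min(c[i][j] for j in S) for i in range(len(c)))

def ufl_local_search(S, c, f):
    """Add/remove local search for uncapacitated facility location (a sketch)."""
    S = set(S)
    depots = range(len(f))
    while True:
        neighbors = [S | {j} for j in depots if j not in S] + \
                    [S - {i} for i in S if len(S) > 1]      # keep at least one depot open
        best = min(neighbors, key=lambda T: ufl_cost(T, c, f))
        if ufl_cost(best, c, f) >= ufl_cost(S, c, f):
            return S, ufl_cost(S, c, f)                      # locally optimal
        S = best
```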

19  Primal Bounds: Local Search (cont.)

Graph Equipartition Problem: Given a graph $G = (V, E)$ with $n = |V|$, the problem is to find a subset $S \subseteq V$ with $|S| = n/2$ for which the number $c(S)$ of edges in the cut set $\delta(S, V \setminus S)$ is minimized, where
    $\delta(S, V \setminus S) = \{(i, j) \in E : i \in S,\ j \notin S\}$.

In this problem all feasible sets have the same size, $n/2$. A natural neighborhood of a feasible set $S \subseteq V$ therefore consists of all subsets of nodes obtained by replacing one element of $S$ by one element not in $S$:

    $Q(S) = \{T \subseteq V : |T \setminus S| = |S \setminus T| = 1\}$.

Example: In the graph shown in the figure (6 nodes; not reproduced here) we start with $S = \{1,2,3\}$, for which $c(S) = 6$. Then

    $Q(S) = \{\{1,2,4\}, \{1,2,5\}, \{1,2,6\}, \{1,3,4\}, \{1,3,5\}, \{1,3,6\}, \{2,3,4\}, \{2,3,5\}, \{2,3,6\}\}$,

for which $c(T) = 6, 5, 4, 4, 5, 6, 5, 2, 5$, respectively. The new incumbent is $S = \{2,3,5\}$ with $c(S) = 2$. $Q(S)$ does not contain a better solution, as may easily be verified, so $S = \{2,3,5\}$ is locally optimal.
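
The slide's 6-node graph is not reproduced here, so the following sketch is generic: it implements the swap neighborhood for the graph equipartition problem on any node and edge list.

```python
def cut_size(S, edges):
    """Number of edges with exactly one endpoint in S."""
    return sum((u in S) != (v in S) for u, v in edges)

def equipartition_local_search(S, nodes, edges):
    """Swap local search: exchange one node in S with one node outside S (a sketch)."""
    S = set(S)
    while True:
        best_T, best_val = S, cut_size(S, edges)
        for i in S:
            for j in set(nodes) - S:
                T = (S - {i}) | {j}
                if cut_size(T, edges) < best_val:
                    best_T, best_val = T, cut_size(T, edges)
        if best_T == S:
            return S, best_val        # locally optimal
        S = best_T
```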
