
Task Assignment

The Task Assignment problem starts with n persons and n tasks, and a known cost for each person/task combination. The goal is to assign each person to a unique task so as to minimize the total cost. This problem can be solved in polynomial time using an algorithm called the Hungarian Method. However, we will develop a Branch and Bound solution as an exploration of some useful B&B techniques.

Decision sequence: at stage i, we select a task for person i from the tasks not yet assigned.

Objective function: as given, minimize the total cost.

Initial value of Global Upper Bound: representing the input as a matrix, sum the diagonal elements (i.e. assign person i to task i, for all i).

Cost-so-far (CSF): the sum of the costs of the assignments already made.

Guaranteed-future-cost (GFC): the sum of the least-cost possible assignment for each remaining person.

Feasible-future-cost (FFC): the result of applying the sum-the-diagonal method to the remaining persons and tasks.

Consider this very small instance:

        t1   t2   t3   t4   t5
  p1     4    3    5    4    6
  p2     8    4    6    9    3
  p3     3    2    4    7    3
  p4     7   10    8    8    2
  p5     4    8    5    4    3

Initial value of Global Upper Bound U: 4+4+4+8+3 = 23
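To make these definitions concrete, here is a minimal sketch in Python of how L and U can be computed for any partial solution of this instance. The names (cost, bounds) are ours, not from the notes; its results agree with the values derived below.

    # A minimal sketch of the L and U computations described above.
    cost = [
        [4,  3, 5, 4, 6],
        [8,  4, 6, 9, 3],
        [3,  2, 4, 7, 3],
        [7, 10, 8, 8, 2],
        [4,  8, 5, 4, 3],
    ]

    def bounds(assigned):
        """assigned maps person index -> task index for the decisions made so far."""
        csf = sum(cost[p][t] for p, t in assigned.items())           # cost-so-far
        free_p = [p for p in range(5) if p not in assigned]
        free_t = [t for t in range(5) if t not in assigned.values()]
        # Guaranteed-future-cost: cheapest remaining task for each remaining person.
        gfc = sum(min(cost[p][t] for t in free_t) for p in free_p)
        # Feasible-future-cost: "sum the diagonal" of the remaining sub-matrix.
        ffc = sum(cost[p][t] for p, t in zip(free_p, free_t))
        return csf + gfc, csf + ffc                                   # (L, U)

    print(bounds({}))      # (13, 23): U matches the initial Global Upper Bound
    print(bounds({0: 0}))  # (14, 23): the bounds of partial solution 1:1 below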

We can represent each partial solution by the assignments made so far, the cost-so-far, the rows and columns of the remaining matrix, and the lower and upper bounds. The first five partial solutions are the result of assigning p1 to each of the 5 available tasks.

1:1   CSF = 4   L = 14   U = 23
        t2   t3   t4   t5
  p2     4    6    9    3
  p3     2    4    7    3
  p4    10    8    8    2
  p5     8    5    4    3

(The GFC of each partial solution is the sum of the row minima of its remaining matrix.)

1:2   CSF = 3   L = 14   U = 26
        t1   t3   t4   t5
  p2     8    6    9    3
  p3     3    4    7    3
  p4     7    8    8    2
  p5     4    5    4    3

1:3   CSF = 5   L = 15   U = 26
        t1   t2   t4   t5
  p2     8    4    9    3
  p3     3    2    7    3
  p4     7   10    8    2
  p5     4    8    4    3

1:4   CSF = 4   L = 14   U = 25
        t1   t2   t3   t5
  p2     8    4    6    3
  p3     3    2    4    3
  p4     7   10    8    2
  p5     4    8    5    3

1:5   CSF = 6   L = 23   U = 28
        t1   t2   t3   t4
  p2     8    4    6    9
  p3     3    2    4    7
  p4     7   10    8    8
  p5     4    8    5    4

The algorithm would choose the partial solution with the lowest L value for expansion, then generate new partial solutions by assigning p2 to each of the 4 available tasks. In this example, 1:1, 1:2 and 1:4 are tied for the lowest L. We could choose between them randomly, or we could choose the one with the lowest U value (in this case, 1:1) in hopes that it will lead to even lower U values, which would let us eliminate other partial solutions.

This algorithm is guaranteed to find the optimal solution (B&B algorithms always do). However, we can attempt to improve its efficiency by extracting more information about costs from the matrix. For example, observe that p1 costs at least 3, no matter which task is assigned. We can subtract 3 from every element of the first row and add 3 to the CSF of all the initial partial solutions. Then we can do the same for all the other rows.

We end up with

        t1   t2   t3   t4   t5   Extracted cost
  p1     1    0    2    1    3   3
  p2     5    1    3    6    0   3
  p3     1    0    2    5    1   2
  p4     5    8    6    6    0   2
  p5     1    5    2    1    0   3
                                 Total = 13

So every solution will cost at least 13. But now look at t1, t3 and t4. It will cost at least 1 to assign any worker to t1, at least 2 to assign someone to t3, and at least 1 to assign someone to t4. We can subtract the appropriate value from each of these columns, giving

        t1   t2   t3   t4   t5   Extracted cost
  p1     0    0    0    0    3   3
  p2     4    1    1    5    0   3
  p3     0    0    0    4    1   2
  p4     4    8    4    5    0   2
  p5     0    5    0    0    0   3
         1         2    1        Total = 13 + 1 + 2 + 1 = 17

Thus every solution carries a cost of at least 17.
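The row-then-column extraction can be expressed compactly. This is a sketch under our own naming (reduce_matrix), not code from the notes:

    # A minimal sketch of the guaranteed-cost extraction: subtract each row's
    # minimum, then each column's minimum, and report the total extracted.
    def reduce_matrix(m):
        extracted = 0
        for row in m:
            low = min(row)
            extracted += low
            for j in range(len(row)):
                row[j] -= low
        for j in range(len(m[0])):
            low = min(row[j] for row in m)
            extracted += low
            for row in m:
                row[j] -= low
        return extracted

    m = [[4, 3, 5, 4, 6],
         [8, 4, 6, 9, 3],
         [3, 2, 4, 7, 3],
         [7, 10, 8, 8, 2],
         [4, 8, 5, 4, 3]]
    print(reduce_matrix(m))  # 17: every complete assignment costs at least this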

We can see how this affects the partial solutions developed above:

1:1   CSF = 17   L = 17   U = 23
        t2   t3   t4   t5
  p2     1    1    5    0
  p3     0    0    4    1
  p4     8    4    5    0
  p5     5    0    0    0

1:2   CSF = 17   L = 17   U = 26
        t1   t3   t4   t5
  p2     4    1    5    0
  p3     0    0    4    1
  p4     4    4    5    0
  p5     0    0    0    0

1:3   CSF = 17   L = 17   U = 26
        t1   t2   t4   t5
  p2     4    1    5    0
  p3     0    0    4    1
  p4     4    8    5    0
  p5     0    5    0    0

1:4   CSF = 17   L = 17   U = 25
        t1   t2   t3   t5
  p2     4    1    1    0
  p3     0    0    0    1
  p4     4    8    4    0
  p5     0    5    0    0

1:5   CSF = 20
        t1   t2   t3   t4
  p2     4    1    1    5
  p3     0    0    0    4
  p4     4    8    4    5
  p5     0    5    0    0

...and look at this matrix! It has two rows with all elements > 0, so we can extract even more guaranteed costs: 1 from the first row and 4 from the third row. So our CSF value becomes 25 (previously extracted: 17 + current assignment cost: 3 + newly extracted: 5). Thus the partial solution looks like this:

1:5   CSF = 25   L = 25   U = 28
        t1   t2   t3   t4
  p2     3    0    0    4
  p3     0    0    0    4
  p4     0    4    0    1
  p5     0    5    0    0

Since our Global Upper Bound is 23, we can immediately discard this partial solution (and thereby prune off an entire branch of the tree of partial solutions).

It may seem that extracting these guaranteed costs is a lot of effort for not much return, but as the example shows, it can help us reduce the set of partial solutions by raising some L values past the Global Upper Bound. In general, the benefit is that this reduces the set of possible solution values for each partial solution, which may eliminate some overlaps. As soon as the L value for some partial solution exceeds the U value for another partial solution, the first one can be discarded because it cannot lead to an optimal solution.

The guaranteed-cost-extraction method can help by quickly identifying choices that have high future costs. For example, consider the following very small instance:

    1    0    1
    0  100   50
   30   40    0

The two cells containing 1 in the first row may look similar, but selecting the first one creates a reduced matrix where we can extract a cost of 90 (check this), whereas selecting the second one creates a reduced matrix where the extracted cost is only 40 (check this too).

We can apply the same reduction operation each time the new matrix has a row or column where all values are >= 1. This method for computing L takes more time than the simple approach, but hopefully it improves the lower bounds so much that the overall time of the algorithm is reduced.

As mentioned above, another way to try to improve the algorithm's efficiency is to use the U values of partial solutions to break ties when the partial solutions have equal L values. For example, if two partial solutions had (L,U) pairs (17,25) and (17,23), then we could choose the second one to expand. There is no guarantee that this will lead to the optimal solution faster, but it may. It is important to note that making a deliberate choice in this situation (where the simple algorithm might be considered to be making a random choice between equal options) cannot prevent us from reaching the optimal solution.
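Both "check this" claims can be verified mechanically. This self-contained sketch (extract is our illustrative helper) reduces the two sub-matrices that remain after each choice:

    # Verify the two claims: reduce each remaining sub-matrix and total the
    # extracted row and column minima.
    def extract(m):
        total = 0
        for row in m:                      # row minima
            low = min(row)
            total += low
            for j in range(len(row)):
                row[j] -= low
        for j in range(len(m[0])):         # column minima
            low = min(row[j] for row in m)
            total += low
            for row in m:
                row[j] -= low
        return total

    print(extract([[100, 50], [40, 0]]))   # 90: after selecting the first 1
    print(extract([[0, 100], [30, 40]]))   # 40: after selecting the second 1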

Yet another improvement would be to do something more clever than summing the diagonal when computing the feasible-future-cost. For example, a greedy heuristic might be used at this point, such as: for each remaining person, assign the least-cost task that has not yet been assigned, as sketched below. (Remember, these are temporary assignments for the purpose of establishing U, not committed assignments, but they must still be feasible: we can't assign the same task to two persons.) This might lower the U values for partial solutions, which would give us better information about them, which would in turn allow us to eliminate more partial solutions from consideration, and thereby close in on the optimal solution faster.
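A sketch of this greedy U computation (greedy_ffc is our name for it); the improvement it shows for 1:1 is our own arithmetic, not from the notes:

    # A minimal sketch of the greedy feasible-future-cost heuristic: give each
    # remaining person the cheapest task that is still unassigned.
    def greedy_ffc(cost, free_people, free_tasks):
        free = set(free_tasks)
        total = 0
        for p in free_people:
            t = min(sorted(free), key=lambda t: cost[p][t])  # cheapest available task
            total += cost[p][t]
            free.remove(t)
        return total  # a feasible completion cost, so CSF + this is a valid U

    cost = [[4, 3, 5, 4, 6],
            [8, 4, 6, 9, 3],
            [3, 2, 4, 7, 3],
            [7, 10, 8, 8, 2],
            [4, 8, 5, 4, 3]]
    # For partial solution 1:1 this yields 17, so U = 4 + 17 = 21,
    # better than the diagonal's U = 23.
    print(greedy_ffc(cost, [1, 2, 3, 4], [1, 2, 3, 4]))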

0/1 Knapsack - version 1

Item          1     2    3    4    5    6
Value       130   100   60   60  120   40
Mass         10     8    5    5   10    5
Value/Mass   13  12.5   12   12   12    8

Let k = 20 (the knapsack capacity). Note that we state the objective in terms of the total value left out of the knapsack, so smaller is better and B&B minimizes as usual.

Initial upper bound: the greedy solution 110000 (take items 1 and 2). Value in = 230; value out = 280. Global U = 280.

CSF = value of the things we have decided not to take
GFC = value of the things that individually will not fit in the knapsack
FFC = value of the things left on the table after we run a greedy heuristic on the remaining objects
L = CSF + GFC
U = CSF + FFC

In the tables below, each partial solution records its decision prefix followed by CSF, GFC, L, FFC, U, and the Global U. The partial solution with the lowest lower bound is the one chosen for expansion; a partial solution whose L exceeds the Global U is rejected as soon as it is generated (its FFC and U are not computed, shown as "-").

                          CSF  GFC    L  FFC    U  Global U
PS1      1                  0    0    0  280  280   280
PS0      0                130    0  130  160  290   280

At this point PS1 has the lowest lower bound so we choose it for expansion. S now contains:

PS0      0                130    0  130  160  290   280

PS11     1 1                0  280  280  280  280   280
PS10     1 0              100    0  100  160  260   260

At this point PS10 has the lowest lower bound so we choose it for expansion. S now contains:

PS0      0                130    0  130  160  290   260
PS11     1 1                0  280  280  280  280   260

PS101    1 0 1            100  120  220  160  260   260
PS100    1 0 0            160    0  160  120  280   260

At this point PS0 has the lowest lower bound so we choose it for expansion. S now contains:

PS11     1 1                0  280  280  280  280   260
PS101    1 0 1            100  120  220  160  260   260
PS100    1 0 0            160    0  160  120  280   260

PS01     0 1              130    0  130  160  290   260
PS00     0 0              230    0  230   40  270   260

At this point PS01 has the lowest lower bound so we choose it for expansion. S now contains:

PS11     1 1                0  280  280  280  280   260
PS101    1 0 1            100  120  220  160  260   260
PS100    1 0 0            160    0  160  120  280   260
PS00     0 0              230    0  230   40  270   260

PS011    0 1 1            130  120  250  160  290   260
PS010    0 1 0            190    0  190  120  310   260

At this point PS100 has the lowest lower bound so we choose it for expansion. S now contains:

PS11     1 1                0  280  280  280  280   260
PS101    1 0 1            100  120  220  160  260   260
PS00     0 0              230    0  230   40  270   260
PS011    0 1 1            130  120  250  160  290   260
PS010    0 1 0            190    0  190  120  310   260

PS1001   1 0 0 1          160  120  280    -    -   260   rejected

PS1000   1 0 0 0          220    0  220   40  260   260

At this point PS010 has the lowest lower bound so we choose it for expansion. S now contains:

PS11     1 1                0  280  280  280  280   260
PS101    1 0 1            100  120  220  160  260   260
PS00     0 0              230    0  230   40  270   260
PS011    0 1 1            130  120  250  160  290   260
PS1000   1 0 0 0          220    0  220   40  260   260

PS0101   0 1 0 1          190  120  310    -    -   260   rejected
PS0100   0 1 0 0          250    0  250   40  290   260

At this point PS101 has the lowest lower bound so we choose it for expansion. S now contains:

PS11     1 1                0  280  280  280  280   260
PS00     0 0              230    0  230   40  270   260
PS011    0 1 1            130  120  250  160  290   260
PS1000   1 0 0 0          220    0  220   40  260   260
PS0100   0 1 0 0          250    0  250   40  290   260

PS1011   1 0 1 1          100  160  260  160  260   260
PS1010   1 0 1 0          160  120  280    -    -   260   rejected

PS1000 has the lowest lower bound. S now contains:

PS11     1 1                0  280  280  280  280   260
PS00     0 0              230    0  230   40  270   260
PS011    0 1 1            130  120  250  160  290   260
PS0100   0 1 0 0          250    0  250   40  290   260
PS1011   1 0 1 1          100  160  260  160  260   260

PS10001  1 0 0 0 1        220   40  260   40  260   260
PS10000  1 0 0 0 0        340    0  340    -    -   260   rejected

PS00 has the lowest lower bound. S now contains:

PS11     1 1                0  280  280  280  280   260

PS011    0 1 1            130  120  250  160  290   260
PS0100   0 1 0 0          250    0  250   40  290   260
PS1011   1 0 1 1          100  160  260  160  260   260
PS10001  1 0 0 0 1        220   40  260   40  260   260

PS001    0 0 1            230    0  230   40  270   260
PS000    0 0 0            290    0  290    -    -   260   rejected

PS001 has the lowest lower bound. S now contains:

PS11     1 1                0  280  280  280  280   260
PS011    0 1 1            130  120  250  160  290   260
PS0100   0 1 0 0          250    0  250   40  290   260
PS1011   1 0 1 1          100  160  260  160  260   260
PS10001  1 0 0 0 1        220   40  260   40  260   260

PS0011   0 0 1 1          230    0  230   40  270   260
PS0010   0 0 1 0          290    0  290    -    -   260   rejected

PS0011 has the lowest lower bound. S now contains:

PS11     1 1                0  280  280  280  280   260
PS011    0 1 1            130  120  250  160  290   260
PS0100   0 1 0 0          250    0  250   40  290   260
PS1011   1 0 1 1          100  160  260  160  260   260
PS10001  1 0 0 0 1        220   40  260   40  260   260

PS00111  0 0 1 1 1        230   40  270    -    -   260   rejected
PS00110  0 0 1 1 0        350    0  350    -    -   260   rejected

PS011 has the lowest lower bound. S now contains:

PS11     1 1                0  280  280  280  280   260
PS0100   0 1 0 0          250    0  250   40  290   260
PS1011   1 0 1 1          100  160  260  160  260   260
PS10001  1 0 0 0 1        220   40  260   40  260   260

PS0111   0 1 1 1          130  160  290    -    -   260   rejected
PS0110   0 1 1 0          190  120  310    -    -   260   rejected

PS0100 has the lowest lower bound. S now contains:

PS11     1 1                0  280  280  280  280   260
PS1011   1 0 1 1          100  160  260  160  260   260
PS10001  1 0 0 0 1        220   40  260   40  260   260

PS01001  0 1 0 0 1        250   40  290    -    -   260   rejected
PS01000  0 1 0 0 0        370    0  370    -    -   260   rejected

We expand PS1011. S now contains:

PS11     1 1                0  280  280  280  280   260
PS10001  1 0 0 0 1        220   40  260   40  260   260

PS10111  1 0 1 1 1        not feasible               260   rejected
PS10110  1 0 1 1 0        220   40  260   40  260   260

We expand PS10001. S now contains:

PS11     1 1                0  280  280  280  280   260
PS10110  1 0 1 1 0        220   40  260   40  260   260

PS100011 1 0 0 0 1 1      not feasible               260   rejected
PS100010 1 0 0 0 1 0      260    0  260    0  260   260

We expand PS10110. S now contains:

PS11     1 1                0  280  280  280  280   260
PS100010 1 0 0 0 1 0      260    0  260    0  260   260

PS101101 1 0 1 1 0 1      not feasible               260   rejected
PS101100 1 0 1 1 0 0      260    0  260    0  260   260

We expand PS100010... but wait! It is a complete solution, so we know it is an optimal solution. We are done.

Total number of partial solutions generated: 34

Total number of potential complete solutions: 64. Branch and bound reduced the total amount of work by about 50%.

Note regarding PS11: when the Global Upper Bound U drops to 260, PS11 could be pruned. However, finding all such redundant partial solutions would require searching and updating the data structure being used to hold S. For large-scale problems this might be worth doing, since we always want to keep S small if possible. On the other hand, the extra time required to do this every time we update U may not be worth it.
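For reference, the bound computations used throughout this trace can be sketched as follows (Python, with our own names; the items are already listed in decreasing value/mass order, which the greedy pass relies on):

    # A minimal sketch of the version-1 bounds. A partial solution is a 0/1
    # decision prefix: decisions[i] == 1 means item i+1 goes into the knapsack.
    VALUES = [130, 100, 60, 60, 120, 40]
    MASSES = [10, 8, 5, 5, 10, 5]
    K = 20

    def bounds(decisions):
        used = sum(MASSES[i] for i, d in enumerate(decisions) if d == 1)
        assert used <= K, "only call on feasible prefixes"
        csf = sum(VALUES[i] for i, d in enumerate(decisions) if d == 0)
        room = K - used
        rest = range(len(decisions), len(VALUES))
        # GFC: value of remaining items that individually cannot fit.
        gfc = sum(VALUES[i] for i in rest if MASSES[i] > room)
        # FFC: value left behind by a greedy pass over the remaining items.
        ffc, r = 0, room
        for i in rest:
            if MASSES[i] <= r:
                r -= MASSES[i]
            else:
                ffc += VALUES[i]
        return csf + gfc, csf + ffc   # (L, U)

    print(bounds([1]))     # (0, 280)   = L and U of PS1
    print(bounds([1, 0]))  # (100, 260) = L and U of PS10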


0/1 Knapsack - version 2

In Version 1 of our solution to this problem we used a very simple method of computing GFC, and we did not use any particular method of breaking ties between partial solutions with the same L value.

In Version 2 we will improve GFC by first eliminating any objects that won't fit, then putting the remaining objects into groups that cannot all go in: we know we must leave out at least one object from each group, so each group contributes at least its smallest value to GFC. (A sketch of this computation appears after this introduction.) We will also break ties in favour of the partial solution that is closest to being a complete solution, i.e. in case of a tie, we will choose the partial solution that includes more decisions.

Item          1     2    3    4    5    6
Value       130   100   60   60  120   40
Mass         10     8    5    5   10    5
Value/Mass   13  12.5   12   12   12    8

Let k = 20. Initial upper bound: the greedy solution 110000. Value in = 230; value out = 280. Global U = 280.

CSF = value of the things we have decided not to take
GFC = as defined above
FFC = value of the things left on the table after we run a greedy heuristic on the remaining objects
L = CSF + GFC
U = CSF + FFC

As before, the partial solution with the lowest lower bound is chosen for expansion, and a partial solution whose L exceeds the Global U is rejected as soon as it is generated.
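Here is a sketch of the stronger GFC. The notes do not spell out how the groups are formed; the version below (pack consecutive remaining items until a group's mass exceeds the remaining capacity, then charge the group's cheapest value) is one plausible reading, and it reproduces every GFC value in the trace that follows:

    # A sketch of the version-2 GFC: items that can never fit forfeit their full
    # value; the rest are packed into groups too heavy to all go in, and each
    # group forfeits at least its cheapest member.
    def gfc_v2(values, masses, remaining, room):
        total, group_mass, group_vals = 0, 0, []
        for i in remaining:
            if masses[i] > room:
                total += values[i]            # can never be taken
                continue
            group_mass += masses[i]
            group_vals.append(values[i])
            if group_mass > room:             # this group cannot all go in
                total += min(group_vals)
                group_mass, group_vals = 0, []
        return total

    VALUES = [130, 100, 60, 60, 120, 40]
    MASSES = [10, 8, 5, 5, 10, 5]
    print(gfc_v2(VALUES, MASSES, range(1, 6), 10))  # 120 = GFC of PS1
    print(gfc_v2(VALUES, MASSES, range(2, 6), 10))  # 60  = GFC of PS10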

                          CSF  GFC    L  FFC    U  Global U
PS1      1                  0  120  120  280  280   280
PS0      0                130   60  190  160  290   280

At this point PS1 has the lowest lower bound so we choose it for expansion. S now contains:

PS0      0                130   60  190  160  290   280

PS11     1 1                0  280  280  280  280   280
PS10     1 0              100   60  160  160  260   260

At this point PS10 has the lowest lower bound so we choose it for expansion. S now contains:

PS0      0                130   60  190  160  290   260
PS11     1 1                0  280  280  280  280   260

PS101    1 0 1            100  160  260  160  260   260
PS100    1 0 0            160   60  220  120  280   260

At this point PS0 has the lowest lower bound so we choose it for expansion. S now contains:

PS11     1 1                0  280  280  280  280   260
PS101    1 0 1            100  160  260  160  260   260
PS100    1 0 0            160   60  220  120  280   260

PS01     0 1              130   60  190  160  290   260
PS00     0 0              230   40  270    -    -   260   rejected

At this point PS01 has the lowest lower bound so we choose it for expansion. S now contains:

PS11     1 1                0  280  280  280  280   260
PS101    1 0 1            100  160  260  160  260   260
PS100    1 0 0            160   60  220  120  280   260

PS011    0 1 1            130  160  290    -    -   260   rejected
PS010    0 1 0            190   60  250  120  310   260

At this point PS100 has the lowest lower bound so we choose it for expansion. S now contains:

PS11     1 1                0  280  280  280  280   260
PS101    1 0 1            100  160  260  160  260   260
PS010    0 1 0            190   60  250  120  310   260

PS1001   1 0 0 1          160  120  280    -    -   260   rejected
PS1000   1 0 0 0          220   40  260   40  260   260

At this point PS010 has the lowest lower bound so we choose it for expansion. S now contains:

PS11     1 1                0  280  280  280  280   260
PS101    1 0 1            100  160  260  160  260   260
PS1000   1 0 0 0          220   40  260   40  260   260

PS0101   0 1 0 1          190  120  310    -    -   260   rejected
PS0100   0 1 0 0          250   40  290    -    -   260   rejected

At this point PS101 and PS1000 are tied. We choose PS1000 because it contains more decisions. S now contains:

PS11     1 1                0  280  280  280  280   260
PS101    1 0 1            100  160  260  160  260   260

PS10001  1 0 0 0 1        220   40  260   40  260   260
PS10000  1 0 0 0 0        340    0  340    -    -   260   rejected

PS101 and PS10001 are tied. We choose PS10001 because it contains more decisions. S now contains:

PS11     1 1                0  280  280  280  280   260
PS101    1 0 1            100  160  260  160  260   260

PS100011 1 0 0 0 1 1      not feasible               260   rejected
PS100010 1 0 0 0 1 0      260    0  260    0  260   260

PS101 and PS100010 are tied. We choose PS100010 because it contains more decisions. Oh! PS100010 is a complete solution, and we have selected it as having the lowest L value in S. We know it is an optimal solution.

Total number of partial solutions generated: 18

Total number of potential complete solutions: 64. Branch and bound reduced the total amount of work to less than 30% of the work involved in an exhaustive search of the full set of potential solutions. This example shows that with a little more work in computing L and U for each partial solution, we can greatly accelerate the search for an optimal solution.

Further improvement: when a partial solution reaches the point where none of the remaining items will fit, there is only one feasible completion of that partial solution, so we can replace the partial solution by the only full solution it extends to.

Note regarding PS11: when the Global Upper Bound U drops to 260, PS11 could be pruned. However, finding all such redundant partial solutions would require searching and updating the data structure being used to hold S. For large-scale problems this might be worth doing, since we always want to keep S small if possible. For smaller problems the overhead involved in always keeping S small may not be worth it.

TSP

       v1    v2    v3    v4    v5    v6
v1    inf     3     6     9     4     2
v2      3   inf     8     2     7     6
v3      6     8   inf     5     2     8
v4      9     2     5   inf     6     5
v5      4     7     2     6   inf     8
v6      2     6     8     5     8   inf

CSF = sum of the costs of the chosen edges + the minimum cost to leave each vertex
GFC = 0, due to the computation of CSF
FFC = sum of the edges in a greedy heuristic solution

Start by extracting fixed costs: every edge leaving v1 costs at least 2, etc. Because the matrix is symmetric, reducing edge (x,y) also reduces edge (y,x), so care is needed.

       v1    v2    v3    v4    v5    v6   Extracted cost
v1    inf     0     2     6     2     0   2
v2      0   inf     5     0     6     5   1
v3      2     5   inf     2     0     6   2
v4      6     0     2   inf     5     4   1
v5      2     6     0     5   inf     8   0
v6      0     5     6     4     8   inf   0
                                          Sum of fixed costs = 6

We represent each partial solution by the permutation of vertices it has built so far.
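A sketch of this symmetric fixed-cost extraction (extract_vertex_costs is our name for it); it reproduces the reduced matrix and the extracted costs shown above:

    # A minimal sketch: for each vertex, extract the cheapest incident edge
    # cost, subtracting it from the vertex's row and column so the matrix
    # stays symmetric (reducing edge (x,y) also reduces edge (y,x)).
    INF = float("inf")

    def extract_vertex_costs(m):
        n = len(m)
        extracted = []
        for x in range(n):
            e = min(m[x][j] for j in range(n) if j != x)
            for j in range(n):
                if j != x:
                    m[x][j] -= e
                    m[j][x] -= e
            extracted.append(e)
        return extracted

    m = [[INF, 3, 6, 9, 4, 2],
         [3, INF, 8, 2, 7, 6],
         [6, 8, INF, 5, 2, 8],
         [9, 2, 5, INF, 6, 5],
         [4, 7, 2, 6, INF, 8],
         [2, 6, 8, 5, 8, INF]]
    print(extract_vertex_costs(m))  # [2, 1, 2, 1, 0, 0] -> sum of fixed costs = 6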

P12 (extend the tour v1 -> v2; cost of edge = 0)

Reduce the matrix: delete row v1 and column v2; the edge (v2,v1) becomes infinity so we cannot return to v1 too soon.

       v1    v3    v4    v5    v6
v2    inf     5     0     6     5
v3      2   inf     2     0     6
v4      6     2   inf     5     4
v5      2     0     5   inf     8
v6      0     6     4     8   inf

Remove fixed costs:

       v1    v3    v4    v5    v6   Extracted cost
v2    inf     5     0     6     5   0
v3      2   inf     0     0     6   0
v4      4     0   inf     3     2   2
v5      2     0     3   inf     8   0
v6      0     6     2     8   inf   0
                                    Sum of new fixed costs = 2

P12: CSF = 6 + 0 + 2 = 8, GFC = 0, L = 8, FFC = 8, U = 16

P13 (extend the tour v1 -> v3; cost of edge = 2)

Reduce the matrix:

       v1    v2    v4    v5    v6
v2      0   inf     0     6     5
v3    inf     5     2     0     6
v4      6     0   inf     5     4
v5      2     6     5   inf     8
v6      0     5     4     8   inf

Remove fixed costs:

       v1    v2    v4    v5    v6   Extracted cost
v2      0   inf     0     6     5   0
v3    inf     5     2     0     6   0
v4      6     0   inf     5     4   0
v5      0     4     3   inf     6   2
v6      0     5     4     8   inf   0
                                    Sum of new fixed costs = 2

P13: CSF = 6 + 2 + 2 = 10, GFC = 0, L = 10, FFC = 8, U = 18

P14 (extend the tour v1 -> v4; cost of edge = 6)

Reduce the matrix:

       v1    v2    v3    v5    v6
v2      0   inf     5     6     5
v3      2     5   inf     0     6
v4    inf     0     2     5     4
v5      2     6     0   inf     8
v6      0     5     6     8   inf

Remove fixed costs: no new costs can be extracted here; the matrix is unchanged. Sum of new fixed costs = 0.

P14: CSF = 6 + 6 + 0 = 12, GFC = 0, L = 12, FFC = 13, U = 25

P15 (extend the tour v1 -> v5; cost of edge = 2)

Reduce the matrix:

       v1    v2    v3    v4    v6
v2      0   inf     5     0     5
v3      2     5   inf     2     6
v4      6     0     2   inf     4
v5    inf     6     0     5     8
v6      0     5     6     4   inf

Remove fixed costs:

       v1    v2    v3    v4    v6   Extracted cost
v2      0   inf     3     0     5   0
v3      0     3   inf     0     4   2
v4      6     0     0   inf     4   0
v5    inf     6     0     5     8   0
v6      0     5     4     4   inf   0
                                    Sum of new fixed costs = 2

P15: CSF = 6 + 2 + 2 = 10, GFC = 0, L = 10, FFC = 5, U = 15

P16 (extend the tour v1 -> v6; cost of edge = 0)

Reduce the matrix:

       v1    v2    v3    v4    v5
v2      0   inf     5     0     6
v3      2     5   inf     2     0
v4      6     0     2   inf     5
v5      2     6     0     5   inf
v6    inf     5     6     4     8

Remove fixed costs:

       v1    v2    v3    v4    v5   Extracted cost
v2      0   inf     5     0     6   0
v3      2     5   inf     2     0   0
v4      6     0     2   inf     5   0
v5      2     6     0     5   inf   0
v6    inf     1     2     0     4   4
                                    Sum of new fixed costs = 4

P16: CSF = 6 + 0 + 4 = 10, GFC = 0, L = 10, FFC = 7, U = 17

Summary:

         L    U
P12      8   16
P13     10   18
P14     12   25
P15     10   15
P16     10   17

We choose P12 for expansion. Its remaining matrix is:

       v1    v3    v4    v5    v6
v2    inf     5     0     6     5
v3      2   inf     0     0     6
v4      4     0   inf     3     2
v5      2     0     3   inf     8
v6      0     6     2     8   inf

P123 (extend the tour v1 -> v2 -> v3; cost of edge = 5)

Reduce the matrix:

       v1    v4    v5    v6
v3    inf     0     0     6
v4      4   inf     3     2
v5      2     3   inf     8
v6      0     2     8   inf

Remove fixed costs:

       v1    v4    v5    v6   Extracted cost
v3    inf     0     0     6   0
v4      2   inf     0     0   2
v5      1     0   inf     7   1
v6      0     0     7   inf   0
                              Sum of new fixed costs = 3

P123: CSF = 8 + 5 + 3 = 16, GFC = 0, L = 16, FFC = 7, U = 25

Since P123's L value (16) exceeds P15's U value (15), P123 can be discarded as soon as it is generated.

P124 (extend the tour v1 -> v2 -> v4; cost of edge = 0)

Reduce the matrix:

       v1    v3    v5    v6
v3      2   inf     0     6
v4    inf     0     3     2
v5      2     0   inf     8
v6      0     6     8   inf

Remove fixed costs: nothing can be extracted; the matrix is unchanged. Sum of new fixed costs = 0.

P124: CSF = 8 + 0 + 0 = 8, GFC = 0, L = 8, FFC = 8, U = 16

P125 (extend the tour v1 -> v2 -> v5; cost of edge = 6)

Reduce the matrix:

       v1    v3    v4    v6
v3      2   inf     0     6
v4      4     0   inf     2
v5    inf     0     3     8
v6      0     6     2   inf

Remove fixed costs: nothing can be extracted; the matrix is unchanged. Sum of new fixed costs = 0.

P125: CSF = 8 + 6 + 0 = 14, GFC = 0, L = 14, FFC = 2, U = 16

P126 (extend the tour v1 -> v2 -> v6; cost of edge = 5)

Reduce the matrix:

       v1    v3    v4    v5
v3      2   inf     0     0
v4      4     0   inf     3
v5      2     0     3   inf
v6    inf     6     2     8

Remove fixed costs:

       v1    v3    v4    v5   Extracted cost
v3      2   inf     0     0   0
v4      4     0   inf     3   0
v5      2     0     3   inf   0
v6    inf     4     0     6   2
                              Sum of new fixed costs = 2

P126: CSF = 17, GFC = 0, L = 17

Since P126's L value (17) also exceeds P15's U value (15), P126 can be discarded as soon as it is generated.

Summary (P12 has been expanded; P123 and P126 have been discarded):

         L    U
P13     10   18
P14     12   25
P15     10   15
P16     10   17
P124     8   16
P125    14   16

We choose P124 for expansion, and so on.
