Projection in Logic, CP, and Optimization


Projection in Logic, CP, and Optimization. John Hooker, Carnegie Mellon University. Workshop on Logic and Search, Melbourne, 2017.

Projection as a Unifying Concept
Projection is a fundamental concept in logic, constraint programming, and optimization.
- Logical inference is projection onto a subset of variables.
- Consistency maintenance in CP is a projection problem.
- Optimization is projection onto a cost variable.
Recognizing this unity can lead to faster search methods, in both logic and optimization.

Projection as a Unifying Concept
Two fundamental projection methods occur across multiple fields.
Fourier-Motzkin elimination and generalizations:
- Polyhedral projection
- Probability logic
- Propositional logic (resolution)
- Integer programming (cutting planes & modular arithmetic)
- Some forms of consistency maintenance
Benders decomposition and generalizations:
- Optimization
- Probability logic (column generation)
- Propositional logic (conflict clauses)
- First-order logic (partial instantiation)

Outline
- Projection using Fourier-Motzkin elimination
- Consistency maintenance as projection
- Projection using Benders decomposition

What Is Projection?
Projection yields a constraint set: we project a constraint set onto a subset of its variables to obtain another constraint set.
Formal definition:
- Let S be a constraint set over variables x = (x1, …, xn).
- Let J ⊆ {1, …, n}, and let xJ = (xj | j ∈ J).
- The projection of S onto xJ is a constraint set, containing only the variables in xJ, whose satisfaction set is the set of all assignments to xJ that extend to a solution x of S.

Projection Using Fourier-Motzkin Elimination and Its Generalizations

Polyhedral Projection
We wish to project a polyhedron onto a subspace. A method based on an idea of Fourier (1827) was proposed by Motzkin (1936). The basic idea of Fourier-Motzkin elimination can be used to compute projections in several contexts.

Polyhedral Projection
Eliminate the variables we want to project out: to project onto x1, …, xk, project out all variables except x1, …, xk.
To project out xj, eliminate it from pairs of inequalities of the form cxj + ax ≥ α and −dxj + bx ≥ β (with c, d > 0), which combine into d(ax − α) + c(bx − β) ≥ 0, in which xj cancels. Then remove all inequalities containing xj.

Polyhedral Projection
Example: project onto x2 by projecting out x1.
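The elimination step above can be sketched in a few lines. The routine below projects out one variable from a system of inequalities written as a·x ≤ b; the two-variable system in the demo is a hypothetical stand-in for the slide's example.

```python
def fm_eliminate(ineqs, j):
    """Project out variable j from a system of linear inequalities.

    Each inequality is (a, b), meaning sum(a[i] * x[i]) <= b.  The result
    is a system in the remaining variables describing the projection."""
    pos = [q for q in ineqs if q[0][j] > 0]
    neg = [q for q in ineqs if q[0][j] < 0]
    out = [q for q in ineqs if q[0][j] == 0]
    for a1, b1 in pos:
        for a2, b2 in neg:
            m1, m2 = -a2[j], a1[j]        # positive multipliers
            a = [m1 * u + m2 * v for u, v in zip(a1, a2)]  # x_j cancels
            out.append((a, m1 * b1 + m2 * b2))
    return out

# Hypothetical system:  x0 + x1 <= 4,  -x0 + x1 <= 2,  -x1 <= 0.
system = [([1, 1], 4), ([-1, 1], 2), ([0, -1], 0)]
print(fm_eliminate(system, 0))   # [([0, -1], 0), ([0, 2], 6)]  i.e. 0 <= x1 <= 3
```

Each pair with opposite signs on xj yields one new inequality, which is why repeated elimination can blow up quadratically per step.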

Optimization as Projection
Optimization is projection onto a single variable: to solve the problem, introduce a cost variable x0 bounded by the objective and project the feasible set onto x0 to obtain an interval; the optimal value is an endpoint of that interval.
Linear programming: we can in principle solve an LP with Fourier-Motzkin elimination by projecting onto x0, but this is extremely inefficient. Use the simplex or an interior point method instead.

Probability Logic
Inference in probability logic is a polyhedral projection problem, originally stated by George Boole. The problem: given a probability interval for each of several formulas in propositional logic, deduce a probability interval for a target formula. The resulting linear programming problem can be solved, in principle, by Fourier-Motzkin elimination.

Probability Logic
Example (Boole 1854):
- Formula x1, probability 0.9
- Formula "if x1 then x2", probability 0.8
- Formula "if x2 then x3", probability 0.4
Deduce a probability range for x3, interpreting the if-then statements as material conditionals.
Linear programming model (Hailperin 1976, Nilsson 1986), where pv = probability that (x1, x2, x3) = v:
min/max x0 = p001 + p011 + p101 + p111, subject to
p100 + p101 + p110 + p111 = 0.9 (probability of x1)
p000 + p001 + p010 + p011 + p110 + p111 = 0.8 (probability of x1 → x2)
p000 + p001 + p011 + p100 + p101 + p111 = 0.4 (probability of x2 → x3)
p000 + p001 + ⋯ + p111 = 1, all pv ≥ 0
Solution: x0 ∈ [0.1, 0.4].
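The example is small enough to check by brute force: the feasible region is a polytope, so the LP bounds are attained at basic feasible solutions, which we can enumerate exactly with rational arithmetic. A sketch (not how one would solve large instances, where column generation is needed):

```python
from fractions import Fraction
from itertools import combinations, product

# Columns: p_v for each truth assignment v = (x1, x2, x3).
points = list(product([0, 1], repeat=3))
A = [[v[0] for v in points],                 # P(x1)       = 0.9
     [max(1 - v[0], v[1]) for v in points],  # P(x1 -> x2) = 0.8
     [max(1 - v[1], v[2]) for v in points],  # P(x2 -> x3) = 0.4
     [1] * 8]                                # total mass  = 1
rhs = [Fraction(9, 10), Fraction(8, 10), Fraction(4, 10), Fraction(1)]
obj = [v[2] for v in points]                 # x0 = P(x3)

def solve_square(cols):
    """Solve the 4x4 system for the chosen basic columns (None if singular)."""
    M = [[Fraction(A[i][c]) for c in cols] + [rhs[i]] for i in range(4)]
    for k in range(4):
        piv = next((r for r in range(k, 4) if M[r][k] != 0), None)
        if piv is None:
            return None
        M[k], M[piv] = M[piv], M[k]
        M[k] = [e / M[k][k] for e in M[k]]
        for r in range(4):
            if r != k and M[r][k] != 0:
                M[r] = [e - M[r][k] * f for e, f in zip(M[r], M[k])]
    return [M[i][4] for i in range(4)]

vals = []
for cols in combinations(range(8), 4):       # all candidate bases
    sol = solve_square(cols)
    if sol is not None and all(s >= 0 for s in sol):
        vals.append(sum(obj[c] * s for c, s in zip(cols, sol)))
lo, hi = min(vals), max(vals)
print(lo, hi)   # the bounds on P(x3)
```

The enumeration confirms the slide's interval [0.1, 0.4] exactly (1/10 and 2/5).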

Inference as Projection
Projection can be viewed as the fundamental inference problem: deduce all information that pertains to a desired subset of propositional variables. In propositional logic (SAT), this can be achieved by the resolution method, the CNF analog of Quine's consensus method for DNF.

Inference as Projection
Project onto the propositional variables of interest: suppose we wish to infer from a set of clauses everything we can about propositions x1, x2, x3. We can deduce the clause x1 ∨ x3. This is a projection onto (x1, x2, x3).

Inference as Projection
Resolution as a projection method: similar to Fourier-Motzkin elimination; actually, identical to Fourier-Motzkin elimination + rounding (Quine 1952, 1955; JH 1992, 2012).
To project out xj, eliminate it from pairs of clauses C ∨ xj and D ∨ ¬xj, producing the resolvent C ∨ D. Then remove all clauses containing xj.
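This elimination procedure is Davis-Putnam style variable elimination and fits in a few lines. The clause set in the demo is a made-up illustration (the slide's clauses do not survive in the text); clauses are frozensets of signed integers.

```python
def project_clauses(clauses, keep):
    """Project a CNF clause set onto the variables in `keep` by resolution
    (Davis-Putnam variable elimination).  Clauses are frozensets of nonzero
    ints: +v is the literal x_v, -v its negation."""
    clauses = set(clauses)
    to_eliminate = {abs(l) for c in clauses for l in c} - set(keep)
    for v in sorted(to_eliminate):
        pos = [c for c in clauses if v in c]
        neg = [c for c in clauses if -v in c]
        for cp in pos:
            for cn in neg:
                r = (cp - {v}) | (cn - {-v})      # the resolvent C v D
                if not any(-l in r for l in r):   # skip tautologies
                    clauses.add(frozenset(r))
        clauses = {c for c in clauses if v not in c and -v not in c}
    # drop clauses subsumed by a strictly smaller clause
    return {c for c in clauses if not any(d < c for d in clauses)}

# Hypothetical clause set: (x1 v x4), (~x4 v x3), (x2 v ~x5), (x5 v x2)
S = [frozenset({1, 4}), frozenset({-4, 3}),
     frozenset({2, -5}), frozenset({5, 2})]
print(project_clauses(S, {1, 2, 3}))   # {x1 v x3, x2}
```

Eliminating x4 produces the resolvent x1 ∨ x3; eliminating x5 produces the unit clause x2. Those two clauses are the projection onto (x1, x2, x3).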

Inference as Projection
Interpretation as Fourier-Motzkin + rounding: project out x1 using resolution, or equivalently project out x1 using Fourier-Motzkin elimination plus rounding, where xj = 1, 0 corresponds to xj = T, F. Williams (1987): after the Fourier-Motzkin combination, round up the right-hand side, which is valid since the xj's are integer.
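A tiny numeric demonstration of the Fourier-Motzkin + rounding reading. Each clause becomes an inequality with xj ∈ {0, 1}; resolving (x1 ∨ x2) with (¬x1 ∨ x2) on x1 sums the two inequalities and rounds, yielding the unit clause x2. The encoding helper is illustrative, not from the slides.

```python
import math
from fractions import Fraction
from functools import reduce

def clause_to_ineq(clause, n):
    """Encode a clause as sum(a * x) >= b over x_j in {0, 1}.
    Literal +j contributes x_j; literal -j contributes (1 - x_j)."""
    a = [0] * n
    b = 1
    for lit in clause:
        j = abs(lit) - 1
        if lit > 0:
            a[j] += 1
        else:
            a[j] -= 1
            b -= 1
    return a, b

# Resolve (x1 v x2) with (~x1 v x2) on x1 via Fourier-Motzkin + rounding.
a1, b1 = clause_to_ineq([1, 2], 2)    #  x1 + x2 >= 1
a2, b2 = clause_to_ineq([-1, 2], 2)   # -x1 + x2 >= 0
s = [u + v for u, v in zip(a1, a2)]   # x1 cancels: 2*x2 >= 1
sb = b1 + b2
# Divide by the gcd of the left side and round the right side up,
# which is valid because the x_j are integer:
g = reduce(math.gcd, [abs(c) for c in s if c])
cut = ([c // g for c in s], math.ceil(Fraction(sb, g)))
print(cut)   # ([0, 1], 1)  i.e. x2 >= 1: the unit clause x2
```

Without the rounding step the combination only gives x2 ≥ 1/2, which is weaker than the resolvent; the rounding is exactly what makes resolution stronger than pure polyhedral projection.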

Projection and Cutting Planes
A resolvent is a special case of a rank 1 Chvátal cut (Chvátal 1973), a general inference method for integer programming.
- All rank 1 cuts can be obtained by taking nonnegative linear combinations of inequalities and rounding.
- We can deduce all valid inequalities by recursive generation of rank 1 cuts, including the inequalities describing the projection onto a given subset of variables.
- The minimum number of iterations necessary is the Chvátal rank of the constraint set. There is no upper bound on the rank as a function of the number of variables.

Projection Methods
Generalizations of resolution:
- For cardinality clauses: JH (1988)
- For 0-1 linear inequalities: JH (1992)
- For general integer linear inequalities: Williams & JH (2015)

Projection for Integer Programming
Example: to project out x1, first combine C1 and C2. Since the 2nd term of the combination is even, we can rewrite it and simplify.
After similarly combining C1 and C3, we obtain the problem with x1 projected out. This is equivalent to a simpler system, so the optimal value = 9.
The number of iterations needed to compute a projection is bounded by the number of variables projected out, unlike Chvátal cuts, for which the number of iterations is unbounded.

Consistency Maintenance as Projection

Consistency as Projection
Domain consistency: the domain of variable xj contains only values that xj takes in some feasible solution. Equivalently, the domain of xj = the projection of the feasible set onto xj.
Example: constraint set alldiff(x1, x2, x3) with x1 ∈ {a, b}, x2 ∈ {a, b}, x3 ∈ {b, c}.
Solutions: (x1, x2, x3) = (a, b, c) and (b, a, c).
Projection onto x1: {a, b}. Projection onto x2: {a, b}. Projection onto x3: {c}.
Reducing the domains to these projections achieves domain consistency.
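The equivalence "domain consistency = projection onto each variable" can be checked directly on the slide's example by brute-force enumeration (fine for tiny instances; real propagators of course avoid enumerating the feasible set):

```python
from itertools import product

def project_domains(domains, constraint):
    """Shrink each domain to the projection of the feasible set onto that
    variable, i.e. achieve domain consistency by brute force."""
    feasible = [t for t in product(*domains) if constraint(t)]
    return [sorted({t[i] for t in feasible}) for i in range(len(domains))]

# The slide's example: alldiff(x1, x2, x3) with the domains shown.
domains = [{'a', 'b'}, {'a', 'b'}, {'b', 'c'}]
alldiff = lambda t: len(set(t)) == len(t)
print(project_domains(domains, alldiff))
# [['a', 'b'], ['a', 'b'], ['c']]
```

The two feasible solutions (a, b, c) and (b, a, c) project to exactly the domains {a, b}, {a, b}, {c} stated on the slide.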

Consistency as Projection
k-consistency can be defined as follows. A constraint set S is k-consistent if, for every J ⊆ {1, …, n} with |J| = k − 1, every assignment xJ = vJ ∈ DJ that does not violate S, and every variable xj with j ∉ J, there is an assignment xj = vj ∈ Dj for which (xJ, xj) = (vJ, vj) does not violate S. (Here xJ = (xj | j ∈ J).)
To achieve k-consistency: project the constraints containing each set of k variables onto its subsets of k − 1 variables.

Consistency as Projection
Consistency and backtracking: strong k-consistency for the entire constraint set avoids backtracking if the primal graph has width < k with respect to the branching order (Freuder 1982).
There is no point in achieving strong k-consistency for individual constraints if we propagate through a domain store; domain consistency has the same effect.

J-Consistency
A type of consistency more directly related to projection: a constraint set S is J-consistent if it contains the projection of S onto xJ, where xJ = (xj | j ∈ J). S is domain consistent if it is {j}-consistent for each j.

J-Consistency
J-consistency and backtracking: if we project a constraint onto x1, x2, …, xk, the constraint will not cause backtracking as we branch on the remaining variables. A natural strategy is to project out xn, xn−1, … until the computational burden is excessive.
There is no point in achieving J-consistency for individual constraints if we propagate through a domain store. However, J-consistency can be useful if we propagate through a richer data structure, such as decision diagrams, which can be a more effective propagation medium. JH & Hadžić (2006, 2007); Andersen, Hadžić, JH, Tiedemann (2007); Bergman, Ciré, van Hoeve, JH (2014).

Propagating J-Consistency
Example: among((x1, x2), {c, d}, 1, 2) and alldiff(x1, x2, x3, x4), with x1, x2 ∈ {a, b, c, d}, x3 ∈ {a, b}, x4 ∈ {c, d}. The constraint set is already domain consistent for the individual constraints, so if we branch on x1 first, we must consider all 4 branches x1 = a, b, c, d.
Suppose we propagate through a relaxed decision diagram of width 2 for these constraints. Its 52 paths from top to bottom represent assignments to (x1, x2, x3, x4); 36 of them are the feasible assignments. [Figure: width-2 relaxed decision diagram over x1, …, x4.]
The projection of the alldiff onto (x1, x2) is alldiff(x1, x2), atmost((x1, x2), {a, b}, 1), atmost((x1, x2), {c, d}, 1).
Now propagate the 2nd atmost constraint in the projected alldiff through the relaxed decision diagram. Let the length of a path be the number of its arcs with labels in {c, d}, and for each arc, record the length of a shortest path from the top to that arc. Remove the arcs whose recorded length exceeds 1, then clean up the diagram.
As a result, we need only branch on x1 = a, b, d rather than a, b, c, d.
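The shortest-path filtering step can be sketched as a single top-down pass over a layered diagram. The diagram below is a hypothetical width-2 example (the slides' figure is not recoverable), with arcs written as (tail, head, label):

```python
def filter_atmost(layers, values, k):
    """Shortest-path filtering of atmost(values, k) on a layered decision
    diagram.  layers[i] is a list of arcs (tail, head, label); node names
    are globally unique.  An arc is kept only if some root-to-arc path
    uses at most k arcs labeled in `values`."""
    INF = float('inf')
    dist = {'root': 0}            # fewest `values`-labeled arcs from root
    filtered = []
    for arcs in layers:
        kept = []
        for tail, head, label in arcs:
            d = dist.get(tail, INF) + (1 if label in values else 0)
            if d <= k:            # the arc can still lie on a short path
                kept.append((tail, head, label))
                dist[head] = min(dist.get(head, INF), d)
        filtered.append(kept)
    return filtered

# Hypothetical width-2 relaxed diagram over (x1, x2):
layers = [[('root', 'u', 'a'), ('root', 'u', 'b'),
           ('root', 'v', 'c'), ('root', 'v', 'd')],
          [('u', 't', 'c'), ('u', 't', 'd'),
           ('v', 't', 'c'), ('v', 't', 'd'), ('v', 't', 'a')]]
filtered = filter_atmost(layers, {'c', 'd'}, 1)
print(filtered[1])   # v's outgoing arcs labeled c or d are removed
```

A full propagator would also make a bottom-up pass and delete nodes left without incoming or outgoing arcs, the "clean up" step on the slides.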

Achieving J-Consistency
Constraint: how hard is it to project?
- among: easy and fast.
- sequence: more complicated but fast; since the polyhedron is integral, one can write a formula based on Fourier-Motzkin.
- regular: easy, and basically the same labor as domain consistency.
- alldiff: quite complicated, but practical for small domains.

Projection Using Benders Decomposition and Its Generalizations

Logic-Based Benders
Logic-based Benders decomposition is a generalization of classical Benders decomposition (JH 2000; JH & Ottosson 2003). It solves a problem of the form
min f(x, y) subject to (x, y) ∈ S, x ∈ D

Logic-Based Benders
Decompose the problem into a master problem and a subproblem; the subproblem is obtained by fixing x to the solution value found in the master problem.
Master problem: min z subject to z ≥ gk(x) (Benders cuts), x ∈ D. Minimize cost z subject to the bounds given by the Benders cuts obtained from the values of x attempted in previous iterations k. The master's solution becomes the trial value x̄ for the subproblem.
Subproblem: min f(x̄, y) subject to (x̄, y) ∈ S. Obtain a proof of optimality (a solution of the inference dual); use the same proof to deduce cost bounds for other assignments, yielding a Benders cut z ≥ gk(x).
Iterate until the master problem value equals the best subproblem value so far; this yields an optimal solution.
The Benders cuts define the projection of the feasible set onto (z, x), if all possible cuts are generated.

Logic-Based Benders
Fundamental concept: inference duality.
Primal problem (optimization): min f(x) subject to x ∈ S. Find the best feasible solution by searching over values of x.
Dual problem (inference): max v subject to the condition that proof P deduces f(x) ≥ v from x ∈ S, where P ranges over a family of proofs. Find a proof of the optimal value v* by searching over proofs P.

Logic-Based Benders
Popular optimization duals are special cases of the inference dual, resulting from different choices of inference method. For example:
- Linear programming dual (gives classical Benders cuts)
- Lagrangean dual
- Surrogate dual
- Subadditive dual

Classical Benders
The linear programming dual results in the classical Benders method (Benders 1962). The problem is
min cx + dy subject to Ax + By ≥ b
Master problem: min z subject to the Benders cuts, yielding a trial value x̄.
Subproblem: min cx̄ + dy subject to By ≥ b − Ax̄. Obtain a proof of optimality by solving the LP dual: max u(b − Ax̄) subject to uB ≤ d, u ≥ 0.
Benders cut: z ≥ cx + ū(b − Ax).

Application to Planning & Scheduling
Assign tasks in the master, schedule in the subproblem, combining mixed integer programming and constraint programming.
Master problem: assign tasks to resources to minimize cost; solve by mixed integer programming. Pass the trial assignment to the subproblem.
Subproblem: schedule the jobs on each machine, subject to time windows. Constraint programming obtains a proof of optimality (the dual solution); use the same proof to deduce the cost for some other assignments, yielding a Benders cut z ≥ gk(x).

Application to Planning & Scheduling
Objective function: cost is based on the task assignment only, cost = Σij cij xij, where xij = 1 if task j is assigned to resource i. So the cost appears only in the master problem, and the scheduling subproblem is a feasibility problem.
Benders cuts: they have the form Σ_{j ∈ Ji} (1 − xij) ≥ 1 for all i, where Ji is a set of tasks that create infeasibility when assigned to resource i.

Application to Planning & Scheduling
Resulting Benders decomposition.
Master problem: min Σij cij xij subject to the Benders cuts Σ_{j ∈ Ji} (1 − xij) ≥ 1 for the infeasible resources i. Pass the trial assignment to the subproblem.
Subproblem: schedule the jobs on each resource. Constraint programming may obtain a proof of infeasibility on some resources (the dual solution); use the same proof to deduce infeasibility for some other assignments, yielding the Benders cuts.
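The assign-and-schedule loop above can be sketched on a toy instance. Everything below is hypothetical data, and the CP scheduling check is replaced by a crude total-duration test, just to show the cut Σ_{j ∈ Ji} (1 − xij) ≥ 1 driving the iteration:

```python
from itertools import product

# Hypothetical data: 2 resources, 3 tasks, assignment costs and durations.
cost = {(i, j): c for (i, j), c in zip(product(range(2), range(3)),
                                       [3, 2, 2, 1, 3, 3])}
dur = [4, 3, 5]
horizon = 7
cuts = []          # each cut: (resource i, frozenset of tasks J_i)

def master():
    """Min-cost assignment satisfying all Benders cuts (brute force)."""
    best = None
    for assign in product(range(2), repeat=3):    # assign[j] = resource
        # A cut forbids assigning ALL of J_i to resource i again.
        if any(all(assign[j] == i for j in J) for i, J in cuts):
            continue
        c = sum(cost[assign[j], j] for j in range(3))
        if best is None or c < best[0]:
            best = (c, assign)
    return best

while True:
    z, assign = master()
    # "Scheduling" subproblem: each resource's total duration must fit.
    infeasible = [i for i in range(2)
                  if sum(dur[j] for j in range(3) if assign[j] == i) > horizon]
    if not infeasible:
        break                                     # optimal and schedulable
    for i in infeasible:
        cuts.append((i, frozenset(j for j in range(3) if assign[j] == i)))
print(z, assign, cuts)
```

On this instance the unconstrained master proposes an assignment that overloads a resource, two cuts are generated, and the loop converges to the cheapest schedulable assignment. Real implementations solve the master by MIP and the subproblem by CP, and strengthen the cuts to smaller sets Ji.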

[Performance profile on 50 instances: number of instances solved vs. computation time in seconds (log scale, 0.01 to 10000), comparing Relax + strong cuts, Relax + weak cuts, and MIP (CPLEX).]

Application to Probability Logic
The LP model has exponentially many variables; what to do? Apply classical Benders to the linear programming dual! This results in a column generation method that introduces variables into the LP only as needed to find the optimum. (The example and LP model are as in the probability logic slides above.)

Inference as Projection
Recall that logical inference is a projection problem: we wish to infer from a set of clauses everything we can about propositions x1, x2, x3. We can deduce x1 ∨ x3; this is a projection onto (x1, x2, x3).

Inference as Projection
Benders decomposition computes the projection: the Benders cuts describe the projection onto (x1, x2, x3). The master problem contains the clauses in x1, x2, x3, including the Benders cuts from previous iterations; the subproblem is obtained by plugging the master's solution into the remaining clauses.
- The solution of the master (x1, x2, x3) = (0, 1, 0) yields an infeasible subproblem; (x1, x3) = (0, 0) creates the infeasibility, giving the Benders cut (nogood) x1 ∨ x3.
- The new master solution (x1, x2, x3) = (0, 1, 1) yields a feasible subproblem; record the solution and add an enumerative Benders cut excluding this assignment.
- Continue until the master is infeasible. The non-enumerative Benders cuts describe the projection. JH (2000, 2012).

Inference as Projection
Benders cuts = conflict clauses in a SAT algorithm (JH 2012, 2016)! Branch on x1, x2, x3 first, backtracking to x3 at feasible leaf nodes. The conflict clauses containing only x1, x2, x3 describe the projection.

Accelerating Search
Logic-based Benders can speed up search in several domains, by several orders of magnitude relative to the state of the art. Some applications:
- Circuit verification
- Chemical batch processing (BASF, etc.)
- Steel production scheduling
- Auto assembly line management (Peugeot-Citroën)
- Automated guided vehicles in flexible manufacturing
- Allocation and scheduling of multicore processors (IBM, Toshiba, Sony)
- Facility location-allocation
- Stochastic facility location and fleet management
- Capacity- and distance-constrained plant location

Logic-Based Benders
Some applications:
- Transportation network design
- Traffic diversion around blocked routes
- Worker assignment in a queuing environment
- Single- and multiple-machine allocation and scheduling
- Permutation flow shop scheduling with time lags
- Resource-constrained scheduling
- Wireless local area network design
- Service restoration in a network
- Optimal control of dynamical systems
- Sports scheduling

First-Order Logic
Partial instantiation methods for first-order logic can be viewed as Benders methods (JH, Rago, Chandru, Shrivastava 2002).
- The master problem is a SAT problem for the current formula F. The solution of the master finds a satisfier mapping that makes one literal of each clause of F (the satisfier of the clause) true.
- The subproblem checks whether the satisfier mapping is blocked, meaning that atoms assigned true and false can be unified.
- In case of blockage, more complete instantiations of the blocked clauses are added to F as Benders cuts.

First-Order Logic
Resulting Benders decomposition.
Master problem: the current partially instantiated formula F. Solve a SAT problem for a satisfier mapping, which is passed to the subproblem.
Subproblem: check whether the satisfier mapping is blocked by unifying atoms that receive different truth values. The dual solution is the most general unifier. Use the same unifier to create the Benders cuts: fuller instantiations of the relevant clauses, appended to F.
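The subproblem's core operation is computing a most general unifier. A minimal sketch, with an illustrative term representation of my own choosing (uppercase strings are variables, tuples are compound terms, lowercase strings are constants):

```python
def unify(t1, t2, subst=None):
    """Most general unifier of two first-order terms, or None if they
    do not unify.  A compound term f(a, X) is the tuple ('f', 'a', 'X')."""
    if subst is None:
        subst = {}
    t1, t2 = walk(t1, subst), walk(t2, subst)
    if t1 == t2:
        return subst
    if is_var(t1):
        return bind(t1, t2, subst)
    if is_var(t2):
        return bind(t2, t1, subst)
    if isinstance(t1, tuple) and isinstance(t2, tuple) and len(t1) == len(t2):
        for a, b in zip(t1, t2):      # same functor, unify arguments
            subst = unify(a, b, subst)
            if subst is None:
                return None
        return subst
    return None

def is_var(t):
    return isinstance(t, str) and t[:1].isupper()

def walk(t, subst):                   # follow variable bindings
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def occurs(v, t, subst):              # occurs check: v inside t?
    t = walk(t, subst)
    return t == v or (isinstance(t, tuple) and any(occurs(v, a, subst) for a in t))

def bind(v, t, subst):
    return None if occurs(v, t, subst) else {**subst, v: t}

print(unify(('p', 'X', ('f', 'a')), ('p', 'b', ('f', 'Y'))))
# {'X': 'b', 'Y': 'a'}
```

In the Benders reading, applying such a unifier to the clauses containing the unified atoms produces the fuller instantiations that are appended to F as cuts.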

First-Order Logic
Example: solving the master problem for a formula F yields a satisfier mapping, but the mapping is blocked because two atoms assigned different truth values can be unified. Generate Benders cuts by applying the most general unifier of the atoms to the clauses containing them, and add the result to F. Solving the new master problem yields a satisfier mapping that is not blocked in the subproblem, and the procedure terminates with satisfiability.

First-Order Logic
We can accommodate full first-order logic with functions if we replace "blocked" with "M-blocked," meaning that the satisfier mapping is blocked within a nesting depth of M.
- The procedure always terminates if F is unsatisfiable.
- It may not terminate if F is satisfiable, since first-order logic is semidecidable.
- The master problem has infinitely many variables, because the Herbrand base is infinite.
