Scheduling with AND/OR Precedence Constraints


Scheduling with AND/OR Precedence Constraints. Seminar Mathematische Optimierung - SS 2007, 23 April 2007

Synthesis
synthesis: transfer from the behavioral domain (e.g. system specifications, algorithms) to the structural domain (e.g. processors, controllers)
methods: allocation, assignment, scheduling
scheduling: compute start times for operations, minimize costs
today: Scheduling with AND/OR Precedence Constraints (by me)
next week: Solving Project Scheduling Problems by Minimum Cut Computations (by Christian Rinder)


Table of Contents
1. Definitions and Graph Representation
2. Realizations
3. Implicit Constraints
4. Minimal Representation
5. Computing Earliest Job Start Times

AND/OR Precedence Constraints Definition
V: set of jobs
W: set of waiting conditions w = (X, j), where job j ∈ V cannot be started until at least one of the jobs in X ⊆ V \ {j} is done
represented by a directed graph D = (V', A) with node set V' = V ∪ W: every waiting condition w = (X, j) ∈ W is represented by arcs (i, w) ∈ A for every i ∈ X and an additional arc (w, j) ∈ A

AND/OR Precedence Constraints
V = {j1, j2, j3, j4}
W = {w1 = ({j1}, j2), w2 = ({j2, j3}, j1), w3 = ({j1, j4}, j3)}

Realizations Definitions
realization: partial order R = (V, ≺R) on V such that for each (X, j) ∈ W there exists an i ∈ X with i ≺R j
linear realization: realization that is a total order
W is feasible if and only if there exists a (linear) realization for W.
Note: Every extension R' = (V, ≺R') of a realization R (i.e. i ≺R j implies i ≺R' j) is also a realization.

Realizations
V = {j1, j2, j3, j4}, W = {w1 = ({j1}, j2), w2 = ({j2, j3}, j1), w3 = ({j1, j4}, j3)}
Example: j4 ≺R j3 ≺R j1 ≺R j2 is a linear realization.

Idea for Checking Feasibility
Is a set W of AND/OR precedence constraints feasible? Construct a linear realization L in a greedy way:
- If there is a job j ∈ V that is not the waiting job of any waiting condition in W, insert j at the end of L.
- If a waiting condition (X, j) is satisfied, delete (X, j) from W.
Iterate these steps until no further job can be planned (ideally, until all jobs are planned).

Algorithm for Checking Feasibility
Input: set V of jobs and set W of waiting conditions
Output: list L of jobs from V

Q := ∅, L := ()
for jobs j ∈ V do
    a(j) := |{(X, j) ∈ W}|
    if a(j) = 0 then add j to Q
end

Algorithm for Checking Feasibility (continued)
while Q ≠ ∅ do
    remove a job i from Q
    insert i at the end of L
    for waiting conditions (X, j) ∈ W with i ∈ X do
        decrease a(j) by 1
        if a(j) = 0 then add j to Q
        remove (X, j) from W
    end
end
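The greedy pseudocode above can be sketched in Python as follows (a minimal sketch; the name `feasible_order` and the data layout are my choices, not the slides'):

```python
from collections import deque

def feasible_order(jobs, waiting):
    """Greedy feasibility check for AND/OR precedence constraints.

    jobs:    iterable of job names (the set V)
    waiting: list of pairs (X, j) -- job j cannot start until at least
             one job in the set X is done.
    Returns a list L; the constraint set is feasible iff L contains all jobs.
    """
    # a(j): number of waiting conditions whose waiting job is j
    a = {j: 0 for j in jobs}
    for X, j in waiting:
        a[j] += 1
    Q = deque(j for j in jobs if a[j] == 0)  # jobs with no waiting condition
    L = []
    active = list(waiting)
    while Q:
        i = Q.popleft()
        L.append(i)                          # insert i at the end of L
        remaining = []
        for X, j in active:
            if i in X:                       # condition (X, j) is now satisfied
                a[j] -= 1
                if a[j] == 0:
                    Q.append(j)
            else:
                remaining.append((X, j))
        active = remaining                   # remove satisfied conditions
    return L

# Example from the slides:
V = ["j1", "j2", "j3", "j4"]
W = [({"j1"}, "j2"), ({"j2", "j3"}, "j1"), ({"j1", "j4"}, "j3")]
print(feasible_order(V, W))  # ['j4', 'j3', 'j1', 'j2']
```

The naive rescan of `active` gives O(|V|·|W|) time; indexing the waiting conditions by the jobs they contain recovers the linear-time behavior implied by the pseudocode.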

Algorithm for Checking Feasibility (Example Trace)

W               Q      L                 a(j1) a(j2) a(j3) a(j4)
{w1, w2, w3}    {j4}   ()                1     1     1     0
{w1, w2}        {j3}   (j4)              1     1     0     0
{w1}            {j1}   (j4, j3)          0     1     0     0
{}              {j2}   (j4, j3, j1)      0     0     0     0
{}              {}     (j4, j3, j1, j2)  0     0     0     0

Algorithm for Checking Feasibility (Proof) Theorem A set of AND/OR precedence constraints is feasible if and only if the list L obtained from the algorithm contains all jobs of V.

Definition of Implicit AND/OR Precedence Constraints
Goal: Derive new constraints from a given set W.
Definitions
Let U ⊆ V and j ∈ V \ U. The waiting condition (U, j) is implied by W if and only if for every realization R = (V, ≺R) of W there exists an i ∈ U with i ≺R j.
Let Y ⊆ V. Then W_Y = {(X ∩ Y, j) : (X, j) ∈ W, j ∈ Y} is the set of induced waiting conditions.
But: How can we compute such constraints?

Algorithm for Detecting Implicit Constraints
Theorem: For given U ⊆ V, let L be the output of the algorithm with input V \ U and W_{V\U}. The set of waiting conditions of the form (U, j) that are implied by W is precisely {(U, j) : j ∈ V \ (L ∪ U)}.
Example: V = {j1, j2, j3, j4}, W = {({j1}, j2), ({j2, j3}, j1), ({j1, j4}, j3)}
Choose U = {j3}. Then W_{V\U} = {({j1}, j2), ({j2}, j1)}.
Algorithm: L = (j4)
Implied waiting conditions (U, j): ({j3}, j1), ({j3}, j2)
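The theorem suggests a direct procedure: restrict to V \ U with the induced conditions, run the greedy feasibility algorithm, and read off the jobs that could not be planned. A self-contained Python sketch (helper names are mine, not the slides'):

```python
from collections import deque

def greedy_order(jobs, waiting):
    """Greedy linear-realization construction (feasibility check)."""
    a = {j: 0 for j in jobs}
    for X, j in waiting:
        a[j] += 1
    Q = deque(j for j in jobs if a[j] == 0)
    L, active = [], list(waiting)
    while Q:
        i = Q.popleft()
        L.append(i)
        remaining = []
        for X, j in active:
            if i in X:
                a[j] -= 1
                if a[j] == 0:
                    Q.append(j)
            else:
                remaining.append((X, j))
        active = remaining
    return L

def implied_conditions(jobs, waiting, U):
    """All waiting conditions (U, j) implied by `waiting`, via the theorem:
    run the greedy algorithm on V \\ U with the induced conditions W_{V\\U};
    the implied conditions are (U, j) for every j in V \\ (L ∪ U)."""
    rest = [j for j in jobs if j not in U]
    # X - U equals X ∩ (V \ U) since X ⊆ V
    induced = [(X - U, j) for X, j in waiting if j not in U]
    L = greedy_order(rest, induced)
    return [(set(U), j) for j in jobs if j not in U and j not in L]

# Example from the slides, U = {j3}:
V = ["j1", "j2", "j3", "j4"]
W = [({"j1"}, "j2"), ({"j2", "j3"}, "j1"), ({"j1", "j4"}, "j3")]
print(implied_conditions(V, W, {"j3"}))  # [({'j3'}, 'j1'), ({'j3'}, 'j2')]
```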

Minimal Representation of AND/OR Precedence Constraints
Definitions
W minimal:
1. no waiting condition (X, j) ∈ W is implied by W \ {(X, j)}, and
2. for each waiting condition (X, j) ∈ W the set X is minimal with respect to inclusion, i.e. for all i ∈ X the waiting condition (X \ {i}, j) is not implied by W.
W, W' equivalent: their sets of (linear) realizations coincide
W' minimal reduction of W: W, W' are equivalent and W' is minimal

Examples for Minimal Representation
V = {j1, j2, j3, j4}, W = {({j1}, j2), ({j2, j3}, j1), ({j1, j4}, j3)}
W is not minimal, since ({j3}, j1) is implied by W.
W' = {({j1}, j2), ({j3}, j1), ({j4}, j3)} is a minimal reduction of W.

Unique Minimal Reduction Theorem Each feasible set of waiting conditions has a unique minimal reduction.

Definitions Definition
d_iw ∈ Z: time lag for job i ∈ V and waiting condition w = (X, j) ∈ W
S = (S_1, ..., S_n) ∈ Z^V: vector of start times for all jobs in V
S ∈ (Z ∪ {∞})^V: partial schedule, i.e. S_i = ∞ means that job i ∈ V is not planned
Job processing times p_i can be modeled by setting d_iw := p_i for all w = (X, j) ∈ W with i ∈ X.
d_iw < 0 can be interpreted as a maximal time lag.

First Model
Find an (optimal) S such that S_j ≥ min_{i ∈ X} (S_i + d_iw) is satisfied for all waiting conditions w = (X, j) ∈ W.

Graph Representation and Example Definition
AND-nodes: jobs in V
OR-nodes: waiting conditions in W

Introducing Dummy Jobs
We want to simplify our model:
- interpret waiting conditions w ∈ W as dummy jobs
- add a dummy job s to V and additional waiting conditions ({s}, j) to W for every j ∈ V
- reformulate the previous constraints to S_w ≥ min_{i ∈ X} (S_i + d_iw) and S_j ≥ S_w for every waiting condition w = (X, j) ∈ W, and S_s ≥ 0 since s is our starting point.
It follows: S_j ≥ 0 for all AND-nodes j ∈ V.

Final Model for (ES)
Find a componentwise minimal schedule S ∈ Z^V such that
S_s ≥ 0
S_j ≥ max_{(w,j) ∈ A} (S_w + d_wj)   for all j ∈ V
S_w ≥ min_{(j,w) ∈ A} (S_j + d_jw)   for all w ∈ W
Without loss of generality d_wj = 0 (otherwise set d_iw to d_iw + d_wj for all i ∈ in(w)).
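The (ES) inequalities can be checked mechanically for a candidate (partial) schedule. A small sketch under the convention d_wj = 0; the function name and data layout are my assumptions:

```python
from math import inf

def satisfies_es(S_and, S_or, waiting, d):
    """Check the (ES) constraints for a (partial) schedule.

    S_and:   dict AND-node -> start time (may be inf), includes dummy job 's'
    S_or:    dict OR-node  -> start time (may be inf)
    waiting: dict w -> (X, j): OR-node w = (X, j)
    d:       dict (i, w) -> time lag d_iw on the arc i -> w
    """
    if S_and["s"] < 0:                 # S_s >= 0
        return False
    for w, (X, j) in waiting.items():
        # S_j >= S_w for the in-arc (w, j), using d_wj = 0
        if S_and[j] < S_or[w]:
            return False
        # S_w >= min over in-arcs (i, w) of S_i + d_iw
        if S_or[w] < min(S_and[i] + d[(i, w)] for i in X):
            return False
    return True

# Tiny example: one job waiting only on the dummy start job s.
S_and = {"s": 0, "j1": 0}
S_or = {"s1": 0}
waiting = {"s1": ({"s"}, "j1")}
d = {("s", "s1"): 0}
print(satisfies_es(S_and, S_or, waiting, d))  # True
```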

More about Schedules and (ES)
S = (∞, ..., ∞) is feasible for (ES).
If S and S' are feasible, then S'' = min(S, S') (componentwise) is feasible.
There exists a (unique) componentwise minimal partial schedule S.
In (ES) we can replace ≥ by = without changing the optimal solution.

Computing Earliest Job Start Times
Consider now only positive arc weights, i.e. d_iw > 0. How can we compute the earliest job start times?
Start with S_s = 0, S_w = 0 for all w ∈ out(s), and S_w = ∞ for all other OR-nodes w.
Iteration:
- Choose a not yet planned OR-node w0 = (X, j) with minimal start time S_w0.
- Plan w0 with start time S_w0.
- Plan AND-node j if all of its waiting conditions are now satisfied.
- Update S_w for all not yet planned OR-nodes w.

Computing Earliest Job Start Times (Algorithm, Part 1)
Input: digraph D = (V' = V ∪ W, A) with positive weights on the arcs in (V × W) ∩ A
Output: feasible (partial) schedule S ∈ (Z ∪ {∞})^{V'}

Heap := ∅
for AND-nodes j ∈ V do a(j) := |in(j)|
S_s := 0  // AND-node s is planned at time 0
for OR-nodes w ∈ W do
    if w ∈ out(s) then insert w in Heap with key S_w := 0
    else insert w in Heap with key S_w := ∞
end

Computing Earliest Job Start Times (Algorithm, Part 2)
while Heap ≠ ∅ do
    remove the next OR-node w0 = (X, j) from Heap  // OR-node is planned
    decrease a(j) by 1
    if a(j) = 0 then
        S_j := max_{w ∈ in(j)} S_w  // AND-node is planned
        for OR-nodes w ∈ out(j) do
            S_w := min{S_w, S_j + d_jw}
            decrease key of w in Heap to S_w
        end
    end
    delete node w0 and all incident arcs from D
end
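For positive arc weights, this Dijkstra-like algorithm can be sketched with Python's heapq, using lazy deletion in place of decrease-key. The concrete arc weights in the example instance are my assumption, chosen to reproduce the start times of the example trace:

```python
import heapq
from math import inf

def earliest_start_times(jobs, waiting, d):
    """Earliest start times for AND/OR constraints with positive time lags.

    jobs:    AND-node names, including the dummy start job 's'
    waiting: dict w -> (X, j): OR-node w waits for at least one i in X;
             AND-node j waits for all OR-nodes of the form (X, j)
    d:       dict (i, w) -> arc weight (0 on arcs out of 's')
    Returns (S_and, S_or); unreachable nodes keep the value inf.
    """
    a = {j: 0 for j in jobs}           # a(j) = |in(j)| for AND-node j
    for X, j in waiting.values():
        a[j] += 1
    S_and = {j: inf for j in jobs}
    S_and["s"] = 0                     # dummy start job planned at time 0
    key = {w: (d[("s", w)] if "s" in X else inf)
           for w, (X, j) in waiting.items()}
    S_or = {}
    heap = [(k, w) for w, k in key.items()]
    heapq.heapify(heap)
    while heap:
        k, w0 = heapq.heappop(heap)
        if w0 in S_or or k != key[w0] or k == inf:
            continue                   # stale entry or unreachable OR-node
        S_or[w0] = k                   # plan OR-node w0
        X, j = waiting[w0]
        a[j] -= 1
        if a[j] == 0:                  # all in-conditions of j are planned
            S_and[j] = k               # = max over in(j): keys pop in order
            for w, (Xw, _) in waiting.items():
                if j in Xw and w not in S_or and k + d[(j, w)] < key[w]:
                    key[w] = k + d[(j, w)]
                    heapq.heappush(heap, (key[w], w))
    return S_and, {w: S_or.get(w, inf) for w in waiting}

# Example instance (weights assumed): s_i = ({s}, j_i) plus w1, w2, w3.
jobs = ["s", "j1", "j2", "j3", "j4"]
waiting = {"s1": ({"s"}, "j1"), "s2": ({"s"}, "j2"),
           "s3": ({"s"}, "j3"), "s4": ({"s"}, "j4"),
           "w1": ({"j1"}, "j2"), "w2": ({"j2", "j3"}, "j1"),
           "w3": ({"j1", "j4"}, "j3")}
d = {("s", "s1"): 0, ("s", "s2"): 0, ("s", "s3"): 0, ("s", "s4"): 0,
     ("j1", "w1"): 2, ("j2", "w2"): 4, ("j3", "w2"): 1,
     ("j1", "w3"): 5, ("j4", "w3"): 3}
S_and, S_or = earliest_start_times(jobs, waiting, d)
print(S_and)  # {'s': 0, 'j1': 4, 'j2': 6, 'j3': 3, 'j4': 0}
```

The rescan of `waiting` on every planned AND-node keeps the sketch short; storing, for each job, the list of OR-nodes it feeds would match the stated O(|W| log |W| + |A| + |V|) bound.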

Computing Earliest Job Start Times (Example Trace)
Initially, AND-node s is planned with S_s = 0; Heap = {S_s1 = 0, S_s2 = 0, S_s3 = 0, S_s4 = 0, S_w1 = ∞, S_w2 = ∞, S_w3 = ∞}; a(j1) = 2, a(j2) = 2, a(j3) = 2, a(j4) = 1.

planned OR-node | key | a(j1) a(j2) a(j3) a(j4) | newly planned AND-node / key updates
s1              | 0   | 1     2     2     1     | -
s2              | 0   | 1     1     2     1     | -
s3              | 0   | 1     1     1     1     | -
s4              | 0   | 1     1     1     0     | j4 planned with S_j4 = 0; S_w3 → 3
w3              | 3   | 1     1     0     0     | j3 planned with S_j3 = 3; S_w2 → 4
w2              | 4   | 0     1     0     0     | j1 planned with S_j1 = 4; S_w1 → 6
w1              | 6   | 0     0     0     0     | j2 planned with S_j2 = 6

Computing Earliest Job Start Times (Example Result)
Start times: AND-nodes (S_j1, S_j2, S_j3, S_j4) = (4, 6, 3, 0); OR-nodes (S_w1, S_w2, S_w3) = (6, 4, 3)
Possible realization: j4 ≺ w3 ≺ j3 ≺ w2 ≺ j1 ≺ w1 ≺ j2


Computing Earliest Job Start Times
Theorem: For a given set of AND/OR precedence constraints represented by a digraph D = (V ∪ W, A) with nonnegative arc weights and without cycles of length 0, the algorithm computes an optimal partial schedule S. In particular, the instance is infeasible if and only if S_w = ∞ for some OR-node w.
Lemma: The algorithm can be implemented to run in O(|W| log |W| + |A| + |V|) time.

Generalization to Nonnegative Arc Weights
We also want to compute earliest job start times for d_jw ≥ 0 instead of only d_jw > 0. The previous algorithm gives us:
Theorem: For a given set of AND/OR precedence constraints represented by a digraph D = (V ∪ W, A) with nonnegative arc weights and without cycles of length 0, the algorithm computes an optimal partial schedule S. In particular, the instance is infeasible if and only if S_w = ∞ for some OR-node w.

Generalization to Nonnegative Arc Weights
We also want to compute earliest job start times for d_jw ≥ 0 instead of only d_jw > 0. There exists an algorithm with:
Theorem: For a given set of AND/OR precedence constraints represented by a digraph D = (V ∪ W, A) with nonnegative arc weights, the algorithm computes an optimal partial schedule S. In particular, the instance is infeasible if and only if S_w = ∞ for some OR-node w.

Arbitrary Arc Weights
Consider d_iw with −M < d_iw < M, M ≥ 0.
We cannot use the feasibility results from the previous sections, since they require d_iw = p_i > 0, i.e. that j ∈ V can be started if and only if all waiting conditions (X, j) ∈ W are satisfied.
Definition: A set W of waiting conditions is feasible if and only if there exists a feasible schedule S for (ES).

Thank you for listening! Questions?