Minimizing the Number of Tardy Jobs

Minimizing the Number of Tardy Jobs (1 || Σ U_j)

Example:

  j   p_j   d_j
  1   10    10
  2    2    11
  3    7    13
  4    4    15
  5    8    20

Ideas:
- We need to choose a subset of jobs S that meet their deadlines.
- Schedule the jobs in S in EDD (earliest due date) order. (Why?)
- Schedule the remaining jobs afterwards, in arbitrary order.

Question: How do you choose the subset?

Algorithm for 1 || Σ U_j

We give an incremental algorithm (Moore's algorithm): consider jobs in deadline order.

Invariant: maintain a maximum-cardinality set of jobs that meet their deadlines; among all such sets, keep the one with the smallest total processing time.

Algorithm:
  Sort jobs by deadline; S = ∅
  For each job j in deadline order:
      S = S ∪ {j}
      if j doesn't meet its deadline when S is scheduled in EDD order:
          S = S \ { the job in S with the largest processing time }
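The algorithm above can be sketched in Python. The max-heap of processing times is our choice of data structure (any structure supporting insert and delete-max works); function and variable names are ours, not from the slides:

```python
import heapq

def max_on_time(jobs):
    """Incremental algorithm for 1 || sum U_j.
    jobs is a list of (p_j, d_j) pairs; returns the maximum
    number of jobs that can finish by their deadlines."""
    on_time = []          # max-heap of processing times (stored negated)
    t = 0                 # total processing time of the jobs kept so far
    for p, d in sorted(jobs, key=lambda job: job[1]):   # deadline (EDD) order
        heapq.heappush(on_time, -p)
        t += p
        if t > d:         # the newly added job would finish late,
            t += heapq.heappop(on_time)   # so drop the largest p (pop returns -p_max)
    return len(on_time)

# The example instance from the first slide:
jobs = [(10, 10), (2, 11), (7, 13), (4, 15), (8, 20)]
print(max_on_time(jobs))   # 3 jobs on time, hence 2 tardy jobs
```

Each job is pushed at most once and popped at most once, so the heap work is O(log n) per job, matching the O(n log n) bound in the analysis.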

Analysis

Running time:
- Sorting the jobs: O(n log n).
- We must maintain the schedule for S and repeatedly delete the job with the largest processing time; that is, maintain a set of numbers under insert, delete, and delete-max operations.
- Using a priority queue, each operation takes O(log n) time, so the total running time is O(n log n).

Correctness: proof by induction. After each step k, let S_k denote S. Then:
- S_k schedules a maximum-sized subset of {1, ..., k}.
- Among all such subsets, S_k is the one with minimum total processing time.

Another Example

  j   p_j   d_j
  1    3     5
  2    4     7
  3    2     8
  4    6    10
  5    6    11
  6    1    14
  7    5    15
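As a check of the invariant, here is a small self-contained sketch that replays the incremental algorithm on this instance and reports which jobs are kept on time (the heap bookkeeping and names are our own, not prescribed by the slides):

```python
import heapq

def on_time_set(jobs):
    """Run the incremental algorithm; jobs is a list of (p_j, d_j).
    Returns the 1-based indices (in input order) of the jobs kept on time."""
    order = sorted(range(len(jobs)), key=lambda i: jobs[i][1])  # EDD order
    kept = []          # max-heap of (-p_j, index)
    t = 0
    for i in order:
        p, d = jobs[i]
        heapq.heappush(kept, (-p, i))
        t += p
        if t > d:                          # latest job finishes past its deadline:
            big_p, _ = heapq.heappop(kept) # evict the largest processing time
            t += big_p                     # big_p is negated, so this subtracts
    return sorted(i + 1 for _, i in kept)

jobs = [(3, 5), (4, 7), (2, 8), (6, 10), (6, 11), (1, 14), (5, 15)]
print(on_time_set(jobs))   # [1, 3, 6, 7]: jobs 2, 4, and 5 end up tardy
```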

Special Case of a Common Deadline

With a common deadline, 1 || Σ U_j is easy. What about 1 || Σ w_j U_j?

Example:

  j   p_j   w_j
  1   10    10
  2   20    50
  3   30    20

The common deadline D is 40.

We are choosing a minimum-weight subset of jobs that miss their deadline. Equivalently: we are choosing a maximum-weight subset of jobs that make their deadlines. Equivalently: we are choosing a maximum-weight set of jobs that fit in a bin of a given size.

Knapsack

  max  Σ_j w_j x_j
  s.t. Σ_j p_j x_j ≤ D
       0 ≤ x_j ≤ 1

A one-constraint LP: a knapsack problem. If you can take objects fractionally, the greedy algorithm (take objects in decreasing order of w_j / p_j) is optimal. What about the integral (non-preemptive) case?

Example:

  j   p_j   w_j
  1   11    12
  2    9     9
  3   90    89

D is 100.
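A minimal sketch of the fractional greedy rule on this instance (the function name is ours, not from the slides):

```python
def fractional_knapsack(jobs, D):
    """Greedy by density w_j / p_j: take whole objects while they fit,
    then a fraction of the next one. Optimal only in the fractional case.
    jobs is a list of (p_j, w_j) pairs."""
    total = 0.0
    capacity = D
    for p, w in sorted(jobs, key=lambda job: job[1] / job[0], reverse=True):
        if p <= capacity:
            total += w              # take the whole object
            capacity -= p
        else:
            total += w * capacity / p   # take a fraction of this object
            break
    return total

jobs = [(11, 12), (9, 9), (90, 89)]
# Fractional optimum: all of jobs 1 and 2, plus 80/90 of job 3.
print(fractional_knapsack(jobs, 100))
```

Integrally, the same greedy order takes only jobs 1 and 2 (weight 21), while the best integral solution is jobs 2 and 3 (weight 98); that gap is what motivates the dynamic program.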

Solving Knapsack via Dynamic Programming

1. The approach is non-polynomial: we will explicitly solve the problem for all possible values of either time or weight (in this example, time).
2. A polynomial running time would be polynomial in n, m, log W, and log D, where W = max_j w_j.
3. Our running time will instead be polynomial in n, m, W, and D. Such a running time is called pseudopolynomial.
4. This is a reasonable approach when W and/or D is not too large.

Main ideas: Parameterize the solution, and define optimal solutions for a given parameter value in terms of optimal solutions with smaller parameter values. Build up a table of solutions, eventually obtaining the solution for the desired parameter value.

DP for Knapsack: Maximum Weight Completing by the Deadline

Let f(j, t) be the best way to schedule jobs 1, ..., j using at most t total processing time, where "best" means maximum total weight.

What is f(n, D)? The maximum-weight way to schedule a subset of the jobs using at most D total processing time; this is exactly the problem we want to solve.

DP

To schedule jobs 1, ..., j using at most t total processing time, there are two cases: job j is not scheduled, or job j is scheduled.

If j is not scheduled, then the optimal solution for 1, ..., j is the same as the optimal solution for 1, ..., j-1, hence f(j, t) = f(j-1, t).

If j is scheduled, there are two subcases:
- j was also scheduled when only t-1 time units were available, hence f(j, t) = f(j, t-1).
- j was not scheduled when t-1 time units were available. We need to add j to the schedule, so we look at the optimal schedule using t - p_j units of processing time, hence f(j, t) = f(j-1, t - p_j) + w_j.

We don't know which case applies, so we try all of them and take the maximum:

  f(j, t) = max{ f(j-1, t), f(j, t-1), f(j-1, t - p_j) + w_j }

Example

  f(j, t) = max{ f(j-1, t), f(j, t-1), f(j-1, t - p_j) + w_j }

  j   p_j   w_j
  1   11    12
  2    9     9
  3   90    89

D is 100.
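The recurrence can be tabulated directly. A sketch in Python; the base cases f(0, t) = 0 and f(j, 0) are our assumptions about the boundary, handled here by zero-initialization:

```python
def knapsack(jobs, D):
    """f[j][t] = maximum total weight of a subset of jobs 1..j
    with total processing time at most t.  jobs = [(p_j, w_j), ...]"""
    n = len(jobs)
    f = [[0] * (D + 1) for _ in range(n + 1)]   # f[0][t] = 0: no jobs, no weight
    for j in range(1, n + 1):
        p, w = jobs[j - 1]
        for t in range(D + 1):
            best = f[j - 1][t]                        # case: j not scheduled
            if t > 0:
                best = max(best, f[j][t - 1])         # case: j fit within t-1 already
            if t >= p:
                best = max(best, f[j - 1][t - p] + w) # case: j newly added
            f[j][t] = best
    return f[n][D]

jobs = [(11, 12), (9, 9), (90, 89)]
print(knapsack(jobs, 100))   # 98: schedule jobs 2 and 3 (p = 99 <= 100)
```

The table has (n+1)(D+1) entries, each filled in O(1) time, giving the pseudopolynomial O(nD) bound discussed above.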