JOB SEQUENCING WITH DEADLINES


JOB SEQUENCING WITH DEADLINES
The problem is stated as follows. There are n jobs to be processed on a machine. Each job i has a deadline d_i ≥ 0 and a profit p_i ≥ 0. The profit p_i is earned iff the job is completed by its deadline. A job is completed if it is processed on the machine for one unit of time. Only one machine is available for processing jobs, and only one job is processed at a time on it.

JOB SEQUENCING WITH DEADLINES (Contd.)
A feasible solution is a subset of jobs J such that each job in J is completed by its deadline. An optimal solution is a feasible solution with maximum profit value.
Example: Let n = 4, (p_1, p_2, p_3, p_4) = (100, 10, 15, 27) and (d_1, d_2, d_3, d_4) = (2, 1, 2, 1).

JOB SEQUENCING WITH DEADLINES (Contd.)
Sr.No.   Feasible solution   Processing sequence   Profit value
(i)      (1,2)               (2,1)                 110
(ii)     (1,3)               (1,3) or (3,1)        115
(iii)    (1,4)               (4,1)                 127  (optimal)
(iv)     (2,3)               (2,3)                 25
(v)      (3,4)               (4,3)                 42
(vi)     (1)                 (1)                   100
(vii)    (2)                 (2)                   10
(viii)   (3)                 (3)                   15
(ix)     (4)                 (4)                   27
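
The table can be checked mechanically. Below is a minimal brute-force sketch in Python (the variable names are illustrative, not part of the original slides): it enumerates every subset of the four jobs, keeps the feasible ones, and reports the most profitable, confirming that (1,4) with profit 127 is optimal.

from itertools import combinations, permutations

profits   = {1: 100, 2: 10, 3: 15, 4: 27}
deadlines = {1: 2,   2: 1,  3: 2,  4: 1}

def feasible(subset):
    # A subset is feasible if some ordering meets every deadline,
    # each job taking one unit of time (the job in slot t finishes at time t).
    return any(
        all(deadlines[job] >= t for t, job in enumerate(order, start=1))
        for order in permutations(subset)
    )

best = max(
    (s for r in range(1, 5) for s in combinations(profits, r) if feasible(s)),
    key=lambda s: sum(profits[j] for j in s),
)
print(best, sum(profits[j] for j in best))   # (1, 4) 127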

GREEDY ALGORITHM TO OBTAIN AN OPTIMAL SOLUTION
Consider the jobs in nonincreasing order of profits, subject to the constraint that the resulting job set J is a feasible solution. In the example considered before, the nonincreasing profit vector is (p_1, p_4, p_3, p_2) = (100, 27, 15, 10) with corresponding deadlines (d_1, d_4, d_3, d_2) = (2, 1, 2, 1).

J = {1} is feasible.
J = {1, 4} is feasible with processing sequence (4, 1).
J = {1, 3, 4} is not feasible.
J = {1, 2, 4} is not feasible.
J = {1, 4} is optimal.

Theorem: Let J be a set of k jobs and let σ = (i_1, i_2, ..., i_k) be a permutation of the jobs in J such that d_i1 ≤ d_i2 ≤ ... ≤ d_ik. Then J is a feasible solution iff the jobs in J can be processed in the order σ without violating any deadline.

Proof: By the definition of a feasible solution, if the jobs in J can be processed in the order σ without violating any deadline, then J is a feasible solution. So we only have to prove that if J is feasible, then σ represents a possible order in which the jobs may be processed.

Suppose J is a feasible solution. Then there exists a permutation σ' = (r_1, r_2, ..., r_k) such that d_rj ≥ j for 1 ≤ j ≤ k, i.e. d_r1 ≥ 1, d_r2 ≥ 2, ..., d_rk ≥ k, each job requiring one unit of time.

Let σ = (i_1, i_2, ..., i_k) and σ' = (r_1, r_2, ..., r_k), and assume σ ≠ σ'. Let a be the least index at which σ and σ' differ, i.e. a is such that r_a ≠ i_a. Let b be such that r_b = i_a; then b > a (because r_j = i_j for all indices j less than a). In σ' interchange r_a and r_b.

σ  = (i_1, i_2, ..., i_a, ..., i_b, ..., i_k)
σ' = (r_1, r_2, ..., r_a, ..., r_b, ..., r_k)    [r_b occurs before r_a in i_1, i_2, ..., i_k]
i_1 = r_1, i_2 = r_2, ..., i_{a-1} = r_{a-1}, i_a ≠ r_a but i_a = r_b.

We know d_i1 ≤ d_i2 ≤ ... ≤ d_ia ≤ ... ≤ d_ib ≤ ... ≤ d_ik. Since i_a = r_b, and r_a occurs after position a in σ, we have d_rb ≤ d_ra. In the feasible solution σ', d_ra ≥ a and d_rb ≥ b. So if we interchange r_a and r_b, the resulting permutation σ'' = (s_1, ..., s_k) represents an order in which the least index at which σ'' and σ differ is increased by at least one.

Also, the jobs in σ'' may be processed without violating any deadline. Continuing in this way, σ' can be transformed into σ without violating any deadline. Hence the theorem is proved.
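
The interchange step can be illustrated concretely. The following sketch (using a hypothetical three-job set; all names are illustrative) starts from an arbitrary feasible order σ', repeatedly swaps at the first position where it disagrees with the deadline-sorted order σ, and asserts that feasibility is preserved after every swap, as the proof claims.

deadlines = {5: 3, 6: 1, 7: 3}                 # hypothetical job set J with deadlines d_5, d_6, d_7
sigma = sorted(deadlines, key=deadlines.get)   # (i_1, ..., i_k): nondecreasing deadlines

def feasible(order):
    return all(deadlines[j] >= t for t, j in enumerate(order, start=1))

order = [6, 7, 5]                              # some feasible schedule sigma'
assert feasible(order)
while order != sigma:
    a = next(t for t in range(len(order)) if order[t] != sigma[t])
    b = order.index(sigma[a])                  # position of i_a (= r_b) in sigma'
    order[a], order[b] = order[b], order[a]    # interchange r_a and r_b
    assert feasible(order)                     # feasibility is preserved by the swap
print(order)                                   # now equal to sigma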

Theorem 2: The greedy method obtains an optimal solution to the job sequencing problem.
Proof: Let (p_i, d_i), 1 ≤ i ≤ n, define any instance of the job sequencing problem. Let I be the set of jobs selected by the greedy method and let J be the set of jobs in an optimal solution. Assume I ≠ J.

If J ⊂ I then J cannot be optimal, because fewer jobs give less profit, which cannot hold for an optimal solution. Also, I ⊂ J is ruled out by the nature of the greedy method (the greedy method considers jobs in order of nonincreasing profit and includes every job that can still be finished by its deadline).

So there exist jobs a and b such that a ∈ I, a ∉ J and b ∈ J, b ∉ I. Let a be a highest-profit job such that a ∈ I and a ∉ J. It follows from the greedy method that p_a ≥ p_b for all jobs b ∈ J, b ∉ I. (If p_b > p_a, then the greedy method would consider job b before job a and include it in I.)

Let S_I and S_J be feasible schedules for the job sets I and J respectively. Let i be a job such that i ∈ I and i ∈ J (i.e. i belongs both to the schedule generated by the greedy method and to the optimal solution). Let i be scheduled from t to t+1 in S_I and from t' to t'+1 in S_J.

If t < t', we may interchange the job scheduled in [t', t'+1] in S_I with i; if no job is scheduled in [t', t'+1] in S_I, then i is simply moved to that interval. With this, i is scheduled at the same time in S_I and S_J, and the resulting schedule is still feasible. If t' < t, a similar transformation is made in S_J. In this way we obtain schedules S'_I and S'_J with the property that all jobs common to I and J are scheduled at the same time.

Consider the interval [t_a, t_a+1] in S'_I in which job a is scheduled, and let b be the job scheduled in S'_J in this interval. As a is the highest-profit job in I - J, p_a ≥ p_b. Scheduling job a from t_a to t_a+1 in S'_J and discarding job b gives a feasible schedule for the job set J' = (J - {b}) ∪ {a}. Clearly J' has a profit value no less than that of J, and differs from I in one less job than J does.

That is, J' and I differ in m-1 jobs if J and I differ in m jobs. By repeatedly applying this transformation, J can be transformed into I with no decrease in profit value. Hence I must also be optimal.

GREEDY ALGORITHM FOR JOB SEQUENCING WITH DEADLINES

procedure GREEDY_JOB(D, J, n)
// J is the set of jobs that can be completed by their deadlines //
  J ← {1}
  for i ← 2 to n do
    if all jobs in J ∪ {i} can be completed by their deadlines
      then J ← J ∪ {i}
    endif
  repeat
end GREEDY_JOB

J may be represented by a one-dimensional array J(1:k), with the jobs ordered by their deadlines so that D(J(1)) ≤ D(J(2)) ≤ ... ≤ D(J(k)). To test whether J ∪ {i} is feasible, we insert i into J by deadline and verify that D(J(r)) ≥ r for 1 ≤ r ≤ k+1.
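
A direct Python rendering of this high-level procedure is sketched below (function and variable names are illustrative). Jobs are considered in nonincreasing order of profit, and a job is kept only if the enlarged set is still feasible, exactly as in GREEDY_JOB above; here the input jobs need not be pre-sorted, since the sketch sorts them by profit itself.

def greedy_job(profits, deadlines):
    # Sketch of the high-level greedy: consider jobs in nonincreasing order
    # of profit and keep a job only if the enlarged set is still feasible.
    def feasible(jobs):
        ordered = sorted(jobs, key=lambda j: deadlines[j])   # nondecreasing deadlines
        return all(deadlines[j] >= t for t, j in enumerate(ordered, start=1))

    J = []
    for i in sorted(profits, key=profits.get, reverse=True):
        if feasible(J + [i]):
            J.append(i)
    return J

# On the running example this returns [1, 4], the optimal set with profit 127.
print(greedy_job({1: 100, 2: 10, 3: 15, 4: 27}, {1: 2, 2: 1, 3: 2, 4: 1}))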

GREEDY ALGORITHM FOR SEQUENCING UNIT TIME JOBS

procedure JS(D, J, n, k)
// D(i) ≥ 1, 1 ≤ i ≤ n are the deadlines; the jobs are ordered so that p_1 ≥ p_2 ≥ ... ≥ p_n //
// J(i) is the i-th job in the optimal solution, and at termination D(J(i)) ≤ D(J(i+1)), 1 ≤ i < k //
  integer D(0:n), J(0:n), i, k, n, r
  D(0) ← J(0) ← 0           // J(0) is a fictitious job with D(0) = 0 //
  k ← 1; J(1) ← 1           // job one is inserted into J //
  for i ← 2 to n do          // consider jobs in nonincreasing order of p_i //
    // find the position of i and check feasibility of insertion //
    r ← k                    // r and k are indices of existing jobs in J //
    // find r such that i can be inserted after r //
    while D(J(r)) > D(i) and D(J(r)) ≠ r do
      // job J(r) can be processed after i, and the deadline of job J(r) is not exactly r //
      r ← r - 1              // consider whether job J(r-1) can be processed after i //
    repeat
    if D(J(r)) ≤ D(i) and D(i) > r then
      // the new job i can come after existing job J(r); insert i into J at position r+1 //
      for l ← k to r+1 by -1 do
        J(l+1) ← J(l)        // shift jobs at positions r+1 to k right by one position //
      repeat
      J(r+1) ← i; k ← k+1    // i is inserted at position r+1, and the number of jobs in J grows by one //
    endif
  repeat
end JS
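
For concreteness, here is a sketch of procedure JS in Python (0-based lists stand in for the 1-based arrays of the pseudocode; the names are illustrative). The profits are assumed to be already sorted in nonincreasing order, and d[i-1] is the deadline of the i-th job in that order.

def js(d):
    # d[i-1] is the deadline of job i; jobs are already ordered so that
    # p_1 >= p_2 >= ... >= p_n.  Returns the selected jobs, ordered by deadline.
    n = len(d)
    D = [0] + list(d)          # D(0) = 0 is the fictitious job's deadline
    J = [0, 1]                 # J(0) is the fictitious job; job 1 is always included
    k = 1
    for i in range(2, n + 1):  # consider jobs in nonincreasing order of p_i
        r = k
        while D[J[r]] > D[i] and D[J[r]] != r:
            r -= 1             # find the position after which i could be inserted
        if D[J[r]] <= D[i] and D[i] > r:
            J.insert(r + 1, i) # shifts jobs r+1..k right by one position
            k += 1
    return J[1:k + 1]

# Running example, jobs listed by nonincreasing profit (p_1, p_4, p_3, p_2)
# with deadlines (2, 1, 2, 1).  The result [2, 1] means "second job, then
# first", i.e. original job 4 followed by original job 1.
print(js([2, 1, 2, 1]))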

COMPLEXITY ANALYSIS OF THE JS ALGORITHM
Let n be the number of jobs and s the number of jobs included in the solution. The for-loop is iterated n-1 times, and each iteration takes O(k) time, where k is the number of jobs currently in J. So the time needed by the algorithm is O(sn); since s ≤ n, the worst-case time is O(n²). If d_i = n - i + 1 for 1 ≤ i ≤ n, JS takes Θ(n²) time. In addition to the space needed for D, the algorithm needs Θ(s) space for J.

A FASTER IMPLEMENTATION OF JS
A faster implementation can be obtained by using the set UNION and FIND algorithms and a better method to determine the feasibility of a partial solution. If J is a feasible subset of jobs, we can determine the processing slot for each job using the following rule.

A FASTER IMPLEMENTATION OF JS (Contd.)
If job i has not yet been assigned a processing slot, assign it to the slot [α-1, α], where α is the largest integer r such that 1 ≤ r ≤ d_i and the slot [α-1, α] is free. This rule delays the processing of job i as much as possible, so the existing jobs never have to be moved to accommodate the new job. If no such α exists, the new job is not included.
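
A sketch of this faster slot-assignment idea in Python is given below (the disjoint-set is kept as a simple parent array with path halving; all names are illustrative). parent[t] points to the latest free slot at or before t, so a FIND on d_i returns the α of the rule above, and the "union" performed after an assignment makes later searches skip the occupied slot.

def fast_js(jobs):
    # jobs: list of (profit, deadline) pairs, indexed 1..n in the given order.
    max_d = max(d for _, d in jobs)
    parent = list(range(max_d + 1))   # slot t is the interval [t-1, t]; 0 is a sentinel

    def find(t):                      # FIND: latest free slot <= t (0 if none)
        while parent[t] != t:
            parent[t] = parent[parent[t]]   # path halving
            t = parent[t]
        return t

    selected, total = [], 0
    for idx, (p, d) in sorted(enumerate(jobs, start=1),
                              key=lambda x: x[1][0], reverse=True):
        alpha = find(d)               # largest free r with 1 <= r <= d_i
        if alpha > 0:
            parent[alpha] = alpha - 1 # UNION: slot alpha is now occupied
            selected.append(idx)
            total += p
    return total, selected

# Running example: (p, d) pairs for jobs 1..4; returns (127, [1, 4]).
print(fast_js([(100, 2), (10, 1), (15, 2), (27, 1)]))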

Assignment
Q.1) Explain job sequencing with deadlines. Give one example of it.
Q.2) Explain the faster implementation of JS, giving an example.