Algorithm Design and Analysis. Lecture 6: Greedy Algorithms (Interval Scheduling, Interval Partitioning, Scheduling to Minimize Lateness). Sofya Raskhodnikova; based on slides by E. Demaine, C. Leiserson, A. Smith, K. Wayne.

Optimization Problems. Coming up: three design paradigms (Greedy, Divide and Conquer, Dynamic Programming), illustrated on optimization problems. An optimization problem has a set of feasible solutions; the goal is to find the best solution according to some objective function.

Design Technique #1: Greedy Algorithms

Greedy Algorithms. Build up a solution to an optimization problem step by step, at each step shortsightedly choosing the option that currently seems best. Sometimes this works well; often it does not. The key to designing greedy algorithms is to find structure ensuring that you don't leave behind better options.

Interval Scheduling Problem. Job j starts at s_j and finishes at f_j. Two jobs are compatible if they do not overlap. Goal: find a maximum subset of mutually compatible jobs. [Figure: eight jobs a through h on a timeline from 0 to 11.]

Possible Greedy Strategies. Consider jobs in some natural order; take the next job if it is compatible with the ones already taken.
- Earliest start time: ascending order of s_j.
- Earliest finish time: ascending order of f_j.
- Shortest interval: ascending order of f_j - s_j.
- Fewest conflicts: for each job j, count the number of conflicting jobs c_j; schedule in ascending order of c_j.

Greedy: Counterexamples. [Figures: counterexamples for earliest start time, shortest interval, and fewest conflicts.]
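The earliest-start-time counterexample can be checked with a short script. The instance below is illustrative (not taken from the slides): one long early job blocks two short compatible jobs.

```python
# Hypothetical instance where the earliest-start-time rule fails.
jobs = [(0, 10), (1, 2), (3, 4)]  # (start, finish) pairs

def greedy(jobs, key):
    """Take jobs in the order given by key, keeping each compatible job."""
    selected, last_finish = [], float("-inf")
    for s, f in sorted(jobs, key=key):
        if s >= last_finish:          # compatible with last selected job
            selected.append((s, f))
            last_finish = f
    return selected

by_start = greedy(jobs, key=lambda j: j[0])    # earliest start time
by_finish = greedy(jobs, key=lambda j: j[1])   # earliest finish time
print(len(by_start), len(by_finish))  # 1 2
```

Earliest start time picks only the long job (0, 10), while earliest finish time picks the two short jobs, so the first rule is suboptimal here.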

Formulating an Algorithm. Input: arrays of start and finish times s_1, s_2, ..., s_n and f_1, f_2, ..., f_n. Input length: 2n = Θ(n).

Greedy Algorithm (Earliest Finish Time). Take jobs in ascending order of f_j.

    Sort jobs by finish times so that f_1 <= f_2 <= ... <= f_n.
    A = {}                          // set of jobs selected so far
    for j = 1 to n
        if (job j is compatible with A)
            A = A ∪ {j}
    return A

Implementation: how do we quickly test whether j is compatible with A? Store the job j* most recently added to A; job j is compatible with A if and only if s_j >= f_{j*}.

Time and Space Analysis.

    Sort jobs by finish times so that f_1 <= f_2 <= ... <= f_n.   // O(n log n)
    A = empty queue of selected jobs                              // O(1)
    j* = 0
    for j = 1 to n                                                // n iterations
        if (f_{j*} <= s_j)                                        // O(1) each
            enqueue j onto A
            j* = j
    return A

Total: O(n log n) time; O(n) space.
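As a concrete sketch (assuming jobs are given as (start, finish) pairs; the function name is illustrative), the algorithm above fits in a few lines of Python:

```python
def interval_schedule(jobs):
    """Return a maximum set of mutually compatible (start, finish) jobs."""
    selected = []
    last_finish = float("-inf")
    for s, f in sorted(jobs, key=lambda job: job[1]):  # sort by finish time
        if s >= last_finish:       # compatible with the last selected job
            selected.append((s, f))
            last_finish = f
    return selected

jobs = [(1, 4), (3, 5), (0, 6), (5, 7), (3, 8), (5, 9), (6, 10), (8, 11)]
print(interval_schedule(jobs))  # [(1, 4), (5, 7), (8, 11)]
```

The sort dominates the running time, matching the O(n log n) bound; the loop itself is linear.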

Analysis: Greedy Stays Ahead. Theorem: the greedy algorithm's solution is optimal. Proof strategy (by contradiction): suppose greedy is not optimal. Consider an optimal solution. Which one? The optimal solution that agrees with the greedy solution on as many initial jobs as possible. Look at the first place in the list where the optimal solution differs from the greedy solution, and show a new optimal solution that agrees more with greedy. Contradiction!

Analysis: Greedy Stays Ahead. Theorem: the greedy algorithm's solution is optimal. Proof (by contradiction): suppose greedy is not optimal. Let i_1, i_2, ..., i_k denote the jobs selected by greedy. Let j_1, j_2, ..., j_m be the optimal solution with i_1 = j_1, i_2 = j_2, ..., i_r = j_r for the largest possible value of r. If r < k, then? Job i_{r+1} finishes before j_{r+1}. [Diagram: Greedy: i_1 i_2 ... i_r i_{r+1} i_{r+2} ...; OPT: j_1 j_2 ... j_r j_{r+1} ...] Why not replace job j_{r+1} with job i_{r+1}?

Analysis: Greedy Stays Ahead. Theorem: the greedy algorithm's solution is optimal. Proof (by contradiction): suppose greedy is not optimal. Let i_1, i_2, ..., i_k denote the jobs selected by greedy. Let j_1, j_2, ..., j_m be the optimal solution with i_1 = j_1, i_2 = j_2, ..., i_r = j_r for the largest possible value of r. If r < k, then? Job i_{r+1} finishes before j_{r+1}, so after the replacement the solution is still feasible and optimal, but this contradicts the maximality of r. [Diagram: Greedy: i_1 i_2 ... i_r i_{r+1}; OPT: j_1 j_2 ... j_r i_{r+1} ...]

Analysis: Greedy Stays Ahead. Theorem: the greedy algorithm's solution is optimal. Proof (by contradiction): suppose greedy is not optimal. Let i_1, i_2, ..., i_k denote the jobs selected by greedy. Let j_1, j_2, ..., j_m be the optimal solution with i_1 = j_1, i_2 = j_2, ..., i_r = j_r for the largest possible value of r. If r < k, then we get a contradiction. Could it be that r = k but k < m? [Diagram: greedy and OPT schedules agreeing on the first r jobs.]

Analysis: Greedy Stays Ahead. Theorem: the greedy algorithm's solution is optimal. Proof (by contradiction): suppose greedy is not optimal. Let i_1, i_2, ..., i_k denote the jobs selected by greedy. Let j_1, j_2, ..., j_m be the optimal solution with i_1 = j_1, i_2 = j_2, ..., i_r = j_r for the largest possible value of r. If r < k, we get a contradiction by replacing j_{r+1} with i_{r+1} (job i_{r+1} finishes before j_{r+1}), because we obtain an optimal solution that agrees with greedy for a larger r. If r = k but m > k, we also get a contradiction: job j_{k+1} is compatible with the first k jobs, so the greedy algorithm, which considers every job, would have selected it rather than stopping at k jobs.

Alternate Way to See the Proof. Induction statement P(k): there is an optimal solution that agrees with the greedy solution on the first k jobs. P(n) is what we want to prove. Base case: P(0) holds trivially. We essentially proved the induction step above.

Interval Partitioning

Interval Partitioning. Lecture j starts at s_j and finishes at f_j. Input: s_1, ..., s_n and f_1, ..., f_n. Goal: find the minimum number of classrooms to schedule all lectures so that no two occur at the same time in the same room. E.g.: [Figure: 10 lectures a through j scheduled in 4 classrooms between 9:00 and 4:30.]

Interval Partitioning. Lecture j starts at s_j and finishes at f_j. Input: s_1, ..., s_n and f_1, ..., f_n. Goal: find the minimum number of classrooms to schedule all lectures so that no two occur at the same time in the same room. E.g.: [Figure: the same 10 lectures scheduled in 3 classrooms.]

Lower Bound. Definition: the depth of a set of open intervals is the maximum number that contain any single point in time. Key lemma: number of classrooms needed >= depth. E.g.: the depth of this schedule is 3 (a, b, c all contain 9:30), so the 3-classroom schedule is optimal. [Figure: the same 10 lectures in 3 classrooms.] Q: Is it always sufficient to have number of classrooms = depth?
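The depth itself is easy to compute with a line sweep. This helper is a sketch (not from the slides), under the assumption that intervals are open, so an interval ending at time t is compatible with one starting at t:

```python
def depth(intervals):
    """Maximum number of open intervals containing any single time."""
    events = []
    for s, f in intervals:
        events.append((s, +1))   # a lecture starts: depth goes up
        events.append((f, -1))   # a lecture ends: depth goes down
    # For open intervals, process an end before a start at the same time.
    events.sort(key=lambda e: (e[0], e[1]))
    best = cur = 0
    for _, delta in events:
        cur += delta
        best = max(best, cur)
    return best

print(depth([(9.0, 10.5), (9.0, 12.5), (9.0, 10.5)]))  # 3
```

Sorting the 2n events costs O(n log n); the sweep is linear.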

Greedy Algorithm. Consider lectures in increasing order of start time; assign each lecture to any compatible classroom.

    Sort intervals by start time so that s_1 <= s_2 <= ... <= s_n.
    d = 0                          // number of allocated classrooms
    for j = 1 to n
        if (lecture j is compatible with some classroom k)
            schedule lecture j in classroom k
        else
            allocate a new classroom d + 1
            schedule lecture j in classroom d + 1
            d = d + 1

Implementation: O(n log n) time; O(n) space. For each classroom, maintain the finish time of the last lecture added, and keep the classrooms in a priority queue keyed by that time. Using a heap, the main loop takes O(n log d) time.
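The priority-queue implementation sketched above might look like this in Python (names are illustrative; `heapq` provides the min-heap):

```python
import heapq

def partition(lectures):
    """Greedily assign (start, finish) lectures; return classrooms used."""
    rooms = []  # min-heap of finish times, one entry per classroom
    for s, f in sorted(lectures):            # increasing start time
        if rooms and rooms[0] <= s:          # some room is free by time s
            heapq.heapreplace(rooms, f)      # reuse that classroom
        else:
            heapq.heappush(rooms, f)         # allocate a new classroom
    return len(rooms)

lectures = [(9, 11), (9, 13), (9, 11), (11, 13), (13, 15), (11, 15)]
print(partition(lectures))  # 3
```

Each lecture does one O(log d) heap operation, so the loop is O(n log d) on top of the O(n log n) sort, matching the bound on the slide.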

Analysis: Structural Argument. Observation: the greedy algorithm never schedules two incompatible lectures in the same classroom. Theorem: the greedy algorithm is optimal. Proof: let d be the number of classrooms allocated by greedy. Classroom d was opened because we needed to schedule a lecture, say j, incompatible with the last lectures in all d - 1 other classrooms. These d - 1 lectures each end after s_j and, since we sorted by start time, start no later than s_j. Thus, together with lecture j, we have d lectures overlapping at time s_j + ε. By the key lemma, all schedules use >= d classrooms.

Duality. Our first example of duality! High-level overview of the proof of correctness: identify obstacles to scheduling in few classrooms (sets of overlapping lectures); show that our algorithm's solution matches some obstacle (if our solution uses d classrooms, then there is a set of d overlapping lectures); conclude that our solution cannot be improved.

Scheduling to minimize lateness

Scheduling to Minimize Lateness. Minimizing lateness problem: a single resource processes one job at a time. Job j requires t_j units of processing time and is due at time d_j. If j starts at time s_j, it finishes at time f_j = s_j + t_j. Lateness: ℓ_j = max{0, f_j - d_j}. Goal: schedule all jobs to minimize the maximum lateness L = max_j ℓ_j.

Example:
    job j:  1   2   3   4   5   6
    t_j:    3   2   1   4   3   2
    d_j:    6   8   9   9   14  15

[Figure: the schedule 3, 2, 6, 1, 5, 4 on the timeline 0 to 15; job 1 has lateness 2 and job 4 has lateness 6, so max lateness = 6.]

Greedy strategies?

Minimizing Lateness: Greedy Strategies. Greedy template: consider jobs in some order.
- [Shortest processing time first] Consider jobs in ascending order of processing time t_j.
- [Earliest deadline first] Consider jobs in ascending order of deadline d_j.
- [Smallest slack] Consider jobs in ascending order of slack d_j - t_j.

Minimizing Lateness: Greedy Strategies. Greedy template: consider jobs in some order.

[Shortest processing time first] Counterexample:
    job j:  1    2
    t_j:    1    10
    d_j:    100  10

[Smallest slack] Counterexample:
    job j:  1   2
    t_j:    1   10
    d_j:    2   10

Minimizing Lateness: Greedy Algorithm. [Earliest deadline first]

    Sort the n jobs by deadline so that d_1 <= d_2 <= ... <= d_n.
    t = 0
    for j = 1 to n
        Assign job j to the interval [t, t + t_j]:
        s_j = t; f_j = t + t_j
        t = t + t_j
    output intervals [s_j, f_j]

[Figure: on the example, the earliest-deadline-first schedule 1, 2, 3, 4, 5, 6 has max lateness = 1.]
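A runnable sketch of the earliest-deadline-first rule on the six-job example (t = 3, 2, 1, 4, 3, 2 and d = 6, 8, 9, 9, 14, 15; the function name is illustrative):

```python
def edf_schedule(jobs):
    """jobs: list of (t, d) pairs. Returns (intervals, max lateness)."""
    intervals, time, worst = [], 0, 0
    for t, d in sorted(jobs, key=lambda j: j[1]):  # ascending deadline
        intervals.append((time, time + t))         # job runs in [t, t + t_j]
        time += t
        worst = max(worst, time - d)               # lateness of this job
    return intervals, max(worst, 0)

jobs = [(3, 6), (2, 8), (1, 9), (4, 9), (3, 14), (2, 15)]
intervals, L = edf_schedule(jobs)
print(L)  # 1  (the job with d = 9 and t = 4 finishes at time 10)
```

This reproduces the max lateness of 1 shown on the slide, versus 6 for the earlier ad hoc schedule.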

Minimizing Lateness: No Idle Time. Observation: there exists an optimal schedule with no idle time. [Figure: a schedule of jobs with deadlines 4, 6, 12 containing idle time, and the same jobs compacted into a schedule with no idle time.] Observation: the greedy schedule has no idle time.

Minimizing Lateness: Inversions. An inversion in schedule S is a pair of jobs i and j such that d_i < d_j but j is scheduled before i. Observation: the greedy schedule has no inversions. Observation: if a schedule (with no idle time) has an inversion, it has one with the pair of inverted jobs scheduled consecutively.

Minimizing Lateness: Inversions. An inversion in schedule S is a pair of jobs i and j such that d_i < d_j but j is scheduled before i. [Figure: the pair j, i before the swap and i, j after the swap.] Claim: swapping two adjacent, inverted jobs reduces the number of inversions by one and does not increase the max lateness. Proof: let ℓ_k be the lateness of job k before the swap, and ℓ'_k its lateness afterwards.
- ℓ'_k = ℓ_k for all k ≠ i, j.
- ℓ'_i <= ℓ_i (job i moves earlier).
- If job j is late after the swap:
      ℓ'_j = f'_j - d_j    (definition)
           = f_i - d_j     (j now finishes at the time i finished before the swap)
           <= f_i - d_i    (since d_i < d_j)
           <= ℓ_i          (definition).

Minimizing Lateness: Analysis. Theorem: the greedy schedule S is optimal. Proof: define S* to be an optimal schedule with the fewest inversions; we can assume S* has no idle time. If S* has no inversions, then S = S*, so S is optimal. If S* has an inversion, let i-j be an adjacent inversion. Swapping i and j does not increase the maximum lateness and strictly decreases the number of inversions, contradicting the definition of S*.

Summary: Greedy Analysis Strategies.
- Greedy algorithm stays ahead: show that after each step of the greedy algorithm, its solution is at least as good as any other algorithm's.
- Structural: discover a simple structural bound asserting that every possible solution must have a certain value; then show that your algorithm always achieves this bound.
- Exchange argument: gradually transform any solution into the one found by the greedy algorithm without hurting its quality.