Greedy Algorithms. Kleinberg and Tardos, Chapter 4


Selecting breakpoints
Road trip from Fort Collins to Durango on a given route. Fuel capacity = C.
Goal: make as few refueling stops as possible.
Greedy algorithm. Go as far as you can before refueling.
[Figure: the route from Fort Collins to Durango, with refueling stops spaced at most C apart]

Selecting Breakpoints: Greedy Algorithm
The road trip algorithm.

    Sort breakpoints so that 0 = b_0 < b_1 < b_2 < ... < b_n = L
    S ← {0}                      (breakpoints selected)
    x ← 0                        (current distance)
    while (x ≠ b_n)
        let p be largest integer such that b_p ≤ x + C
        if (b_p = x)
            return "no solution"
        x ← b_p
        S ← S ∪ {p}
    return S
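The pseudocode above translates directly into Python; this is a minimal sketch (function and variable names are my own), where `b` is the sorted list of breakpoint distances and `C` is the fuel capacity expressed as the maximum distance per tank:

```python
def select_breakpoints(b, C):
    """Greedy refueling: go as far as possible before stopping.
    b: sorted distances with b[0] = 0 and b[-1] = L (the destination).
    Returns indices of the chosen breakpoints, or None if infeasible."""
    n = len(b) - 1
    S = [0]              # breakpoints selected (the start counts as one)
    x = b[0]             # current position
    while x != b[n]:
        # farthest breakpoint reachable on a full tank from x
        p = max(i for i in range(n + 1) if b[i] <= x + C)
        if b[p] == x:
            return None  # stuck: the next breakpoint is out of range
        x = b[p]
        S.append(p)
    return S
```

For example, with breakpoints [0, 40, 90, 130, 200] and C = 100, greedy stops at distances 90 and 130 before reaching 200.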

Coin Changing
Goal. Given currency denominations 1, 5, 10, 25, 100, devise a method to pay an amount to a customer using the fewest number of coins. Example: 34 = 25 + 5 + 1 + 1 + 1 + 1.
Cashier's algorithm. At each iteration, add the coin of the largest value that does not take us past the amount to be paid. Example: $2.89.

Coin-Changing: Greedy Algorithm
Cashier's algorithm. Use the maximal number of the largest denomination first.

    Sort coin denominations by value: c_1 < c_2 < ... < c_n
    S ← ∅                        (coins selected)
    x ← amount to be changed
    while (x > 0)
        let k be largest integer such that c_k ≤ x
        if (k = 0)
            return "no solution found"
        x ← x − c_k
        S ← S ∪ {k}
    return S

Can be proven optimal for US coin denominations!

Coin-Changing: Greedy doesn't always work
The greedy algorithm works for US coins, but fails when changing 40 with denominations 1, 5, 10, 20, 25: greedy picks 25 + 10 + 5 (three coins), while 20 + 20 uses only two.
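A small Python sketch of the cashier's algorithm makes the failure easy to check (the function name is my own; coins are taken largest first, which is equivalent to the index-based pseudocode above):

```python
def cashier(x, coins):
    """Cashier's algorithm: repeatedly take the largest coin that fits.
    Returns the list of coins used, or None if some amount is unreachable."""
    S = []
    for c in sorted(coins, reverse=True):   # largest denomination first
        while x >= c:
            x -= c
            S.append(c)
    return S if x == 0 else None
```

With US coins, `cashier(34, [1, 5, 10, 25])` returns the optimal six coins 25 + 5 + 1 + 1 + 1 + 1; with denominations 1, 5, 10, 20, 25, `cashier(40, ...)` returns the three coins 25 + 10 + 5 even though 20 + 20 is better.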

Interval Scheduling
Job j starts at s_j and finishes at f_j. Two jobs are compatible if they don't overlap.
Goal: find a maximum-size subset of compatible jobs.
[Figure: eight jobs a–h laid out on a timeline from 0 to 11]

Interval Scheduling: Greedy Algorithms
Greedy template. Consider jobs in some natural order. Take each job provided it's compatible with the ones already taken.
[Earliest start time] Consider jobs in ascending order of s_j.
[Earliest finish time] Consider jobs in ascending order of f_j.
[Shortest interval] Consider jobs in ascending order of f_j − s_j.
[Fewest conflicts] For each job j, count the number of conflicting jobs c_j. Schedule in ascending order of c_j.

Interval Scheduling: Greedy Algorithms
Greedy template. Consider jobs in some natural order. Take each job provided it's compatible with the ones already taken.
[Figure: counterexamples for earliest start time, shortest interval, and fewest conflicts]

Interval Scheduling: Greedy Algorithm
Greedy algorithm. Consider jobs in increasing order of finish time. Take each job provided it's compatible with the ones already taken.

    Sort jobs by finish time so that f_1 ≤ f_2 ≤ ... ≤ f_n
    A ← ∅                        (set of jobs selected)
    for j = 1 to n
        if (job j compatible with A)
            A ← A ∪ {j}
    return A

Implementation. O(n log n). Remember the job j* that was added last to A; job j is compatible with A iff s_j ≥ f_j*.
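The O(n log n) implementation sketched above fits in a few lines of Python (names are my own; jobs are (start, finish) pairs, and only the finish time of the last selected job is remembered):

```python
def interval_schedule(jobs):
    """Earliest-finish-time greedy: returns a maximum-size set of
    mutually compatible (start, finish) jobs."""
    A = []
    last_finish = float("-inf")
    for s, f in sorted(jobs, key=lambda job: job[1]):  # sort by finish time
        if s >= last_finish:        # compatible with the job added last
            A.append((s, f))
            last_finish = f
    return A
```

On the instance [(0,6), (1,4), (3,5), (3,8), (4,7), (5,9), (6,10), (8,11)] the algorithm selects the three jobs (1,4), (4,7), (8,11).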

Interval Scheduling: Analysis
Theorem. The greedy algorithm is optimal.
Proof. (by contradiction) Assume greedy is not optimal, and let's see what happens.
Let i_1, i_2, ..., i_k be the jobs selected by greedy.
Let j_1, j_2, ..., j_m be the jobs in the optimal solution with i_1 = j_1, i_2 = j_2, ..., i_r = j_r for the largest possible value of r.
Job i_{r+1} finishes before j_{r+1}.
Greedy: i_1 i_2 ... i_r i_{r+1} ...
OPT:    j_1 j_2 ... j_r j_{r+1} ...
Why not replace job j_{r+1} with job i_{r+1}?

Interval Scheduling: Analysis
Theorem. The greedy algorithm is optimal.
Proof. (by contradiction) Assume greedy is not optimal.
Let i_1, i_2, ..., i_k be the jobs selected by greedy.
Let j_1, j_2, ..., j_m be the jobs in the optimal solution with i_1 = j_1, i_2 = j_2, ..., i_r = j_r for the largest possible value of r.
Replace j_{r+1} with i_{r+1}:
Greedy: i_1 i_2 ... i_r i_{r+1} ...
OPT:    j_1 j_2 ... j_r i_{r+1} ...
The solution is still feasible and optimal.

Interval Scheduling: Analysis
Theorem. The greedy algorithm is optimal.
Proof. (by contradiction) Assume greedy is not optimal.
Let i_1, i_2, ..., i_k be the jobs selected by greedy.
Let j_1, j_2, ..., j_m be the jobs in the optimal solution with i_1 = j_1, i_2 = j_2, ..., i_r = j_r for the largest possible value of r.
By assumption, the optimal solution contains a larger number of jobs.
Greedy: i_1 i_2 ... i_r i_{r+1}
OPT:    j_1 j_2 ... j_r i_{r+1} ...
But after the swap the two solutions agree on the first r + 1 jobs, contradicting the maximality of r: the job with the earliest finish time would have been chosen by the greedy algorithm. Contradiction.

Greedy algorithms
Greedy algorithms construct a globally optimal solution through a series of locally optimal choices.

Interval Partitioning
Lecture j starts at s_j and finishes at f_j.
Goal: find the minimum number of classrooms to schedule all lectures so that no two occur at the same time in the same room.
[Figure: a schedule using 4 classrooms for 10 lectures a–j between 9:00 and 4:30]

Interval Partitioning
Lecture j starts at s_j and finishes at f_j.
Goal: find the minimum number of classrooms to schedule all lectures so that no two occur at the same time in the same room.
[Figure: the same 10 lectures scheduled in only 3 classrooms]

Interval Partitioning: Lower Bound
Key observation. Number of classrooms needed ≥ depth (the maximum number of intervals overlapping at any single point in time).
Example: the depth of the schedule below is 3, so the 3-classroom schedule is optimal.
Q. Does there always exist a schedule whose number of classrooms equals the depth of the intervals?
[Figure: the 3-classroom schedule]
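The depth itself is easy to compute with a sweep over interval endpoints; this is a small Python sketch (names are my own) that treats intervals as half-open [s, f), so an interval ending at time t does not overlap one starting at t:

```python
def depth(intervals):
    """Maximum number of half-open intervals [s, f) alive at one point."""
    events = []
    for s, f in intervals:
        events.append((s, 1))    # interval opens
        events.append((f, -1))   # interval closes
    # at equal times, process closings (-1) before openings (+1)
    events.sort(key=lambda e: (e[0], e[1]))
    best = cur = 0
    for _, delta in events:
        cur += delta
        best = max(best, cur)
    return best
```

For instance, [(9, 11), (9, 10.5), (10, 12), (13, 14)] has depth 3 (three lectures overlap between 10 and 10.5), and two back-to-back intervals like [(0, 1), (1, 2)] have depth 1.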

Interval Partitioning: Greedy Algorithm
Greedy algorithm. Consider lectures in increasing order of start time and assign each lecture to any compatible classroom.

    allocate d labels (d = depth)
    sort the intervals by start time: I_1, I_2, ..., I_n
    for j = 1 to n
        for each interval I_i that precedes and overlaps with I_j
            exclude its label for I_j
        pick a remaining label for I_j

Greedy works

    allocate d labels (d = depth)
    sort the intervals by start time: I_1, I_2, ..., I_n
    for j = 1 to n
        for each interval I_i that precedes and overlaps with I_j
            exclude its label for I_j
        pick a remaining label for I_j

Observations:
There is always a label available for I_j.
No two overlapping intervals get the same label.
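An efficient way to realize this labeling, keeping a min-heap of classroom finish times so a freed room can be reused, can be sketched as follows (names are my own; this is one standard implementation, not the only one):

```python
import heapq

def partition_intervals(intervals):
    """Greedy interval partitioning: scan (start, finish) intervals in
    start-time order and reuse any classroom that has become free.
    Returns (labels, number_of_rooms), labels[i] being the room of the
    i-th interval in start-time order."""
    rooms = []                    # min-heap of (finish_time, room_label)
    labels = []
    n_rooms = 0
    for s, f in sorted(intervals):
        if rooms and rooms[0][0] <= s:      # some room frees up by time s
            _, label = heapq.heappop(rooms)
        else:
            n_rooms += 1                    # open a new classroom
            label = n_rooms
        labels.append(label)
        heapq.heappush(rooms, (f, label))
    return labels, n_rooms
```

On [(0, 3), (1, 4), (2, 5), (3, 6)] three rooms are opened and the fourth interval reuses room 1, matching the depth of 3.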

Scheduling to Minimize Lateness
Minimizing lateness problem. A single resource processes one job at a time. Job j requires t_j units of processing time and is due at time d_j. If j starts at time s_j, it finishes at time f_j = s_j + t_j.
Lateness: ℓ_j = max { 0, f_j − d_j }.
Goal: schedule all jobs to minimize the maximum lateness, max_j ℓ_j.
Example:

    job  |  1   2   3   4   5   6
    t_j  |  3   2   1   4   3   2
    d_j  |  6   8   9   9  14  15

[Figure: the schedule 3, 2, 6, 1, 5, 4 has lateness 2 for job 1 and lateness 6 for job 4, so max lateness = 6]

Minimizing Lateness: Greedy Strategies
Greedy template. Consider jobs in some order.
[Shortest processing time first] Consider jobs in ascending order of processing time t_j.
[Earliest deadline first] Consider jobs in ascending order of deadline d_j.
[Smallest slack] Consider jobs in ascending order of slack d_j − t_j.

Minimizing Lateness: Greedy Strategies
Greedy template. Consider jobs in some order.
[Shortest processing time first] Consider jobs in ascending order of processing time t_j.

    counterexample:
    job  |   1    2
    t_j  |   1   10
    d_j  | 100   10

[Smallest slack] Consider jobs in ascending order of slack d_j − t_j.

    counterexample:
    job  |   1    2
    t_j  |   1   10
    d_j  |   2   10

Minimizing Lateness: Greedy Algorithm
Greedy algorithm. Earliest deadline first.

    sort jobs by deadline so that d_1 ≤ d_2 ≤ ... ≤ d_n
    t ← 0
    for j = 1 to n
        assign job j to interval [t, t + t_j]
        t ← t + t_j

[Figure: the earliest-deadline-first schedule 1, 2, 3, 4, 5, 6 for the example above; max lateness = 1]
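The earliest-deadline-first schedule and its maximum lateness can be computed in a few lines of Python (names are my own; jobs are (t_j, d_j) pairs):

```python
def edf_max_lateness(jobs):
    """Earliest-deadline-first schedule for (processing_time, deadline)
    jobs on a single machine; returns the maximum lateness."""
    t = 0
    max_lateness = 0
    for tj, dj in sorted(jobs, key=lambda job: job[1]):  # sort by deadline
        t += tj                           # job j runs in [t - tj, t]
        max_lateness = max(max_lateness, t - dj)
    return max_lateness
```

On the six-job example above, jobs = [(3, 6), (2, 8), (1, 9), (4, 9), (3, 14), (2, 15)], the schedule finishes the jobs at times 3, 5, 6, 10, 13, 15 and achieves max lateness 1 (job 4 finishes at 10 with deadline 9).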

Minimizing Lateness: No Idle Time
Observation. There exists an optimal schedule with no idle time.
[Figure: a schedule with idle gaps, and the same jobs compacted to remove them]
Observation. The greedy schedule has no idle time.

Minimizing Lateness: Inversions
Def. Given a schedule A, an inversion is a pair of jobs i and j such that d_i < d_j but j is scheduled before i.
Observation. The greedy schedule has no inversions.

Minimizing Lateness: Inversions
Def. Given a schedule A, an inversion is a pair of jobs i and j such that d_i < d_j but j is scheduled before i.
Observation. All schedules with no inversions and no idle time have the same maximum lateness.
How can they differ? Only in the order in which jobs with the same deadline are scheduled.

Minimizing Lateness: Inversions
Def. Given a schedule A, an inversion is a pair of jobs i and j such that d_i < d_j but j is scheduled before i.
Observation. All schedules with no inversions and no idle time have the same maximum lateness.
So if we show that there is an optimal schedule with no inversions, then the greedy schedule is optimal.

Minimizing Lateness: Inversions
Def. Given a schedule A, an inversion is a pair of jobs i and j such that d_i < d_j but j is scheduled before i.
Observation. If a schedule (with no idle time) has an inversion, it has one with a pair of inverted jobs scheduled consecutively.

Minimizing Lateness: Inversions
[Figure: before the swap, j runs immediately before i; after the swap, i runs immediately before j, and j finishes at time f_i]
Claim. Swapping two consecutive, inverted jobs reduces the number of inversions by one and does not increase the max lateness.
Proof. Let ℓ be the lateness before the swap, and let ℓ' be the lateness afterwards.

    ℓ'_k = ℓ_k for all k ≠ i, j
    ℓ'_i ≤ ℓ_i
    ℓ'_j = f'_j − d_j        (definition)
         = f_i − d_j         (j now finishes at time f_i)
         ≤ f_i − d_i         (since d_i < d_j)
         ≤ ℓ_i               (definition)

Minimizing Lateness: Analysis of Greedy Algorithm
Theorem. The greedy schedule S is optimal.
Proof. Let S* be an optimal schedule with the fewest number of inversions. We can assume S* has no idle time.
If S* has no inversions, then ℓ(S) = ℓ(S*).
If S* has an inversion, let i-j be an adjacent inversion. Swapping i and j does not increase the maximum lateness and strictly decreases the number of inversions; continue until no inversions are left.
At each step the lateness is not increased, i.e. ℓ(S) ≤ ℓ(S*), and since S* is optimal, ℓ(S) = ℓ(S*).

Wordle
A wordle is a word collage. Here is a wordle constructed from one of the instructor's papers, using the Java applet at wordle.net.
Wordle uses a randomized greedy algorithm to solve the packing problem: words are placed randomly and ordered by frequency.

Greedy Analysis Strategies
Greedy algorithm stays ahead. Show that after each step of the greedy algorithm, its solution is at least as good as any other.
Structural. Discover a simple "structural" bound asserting that every possible solution must have a certain value. Then show that your algorithm always achieves this bound.
Exchange argument. Incrementally transform any solution to the greedy one without hurting its quality.
Other greedy algorithms. Kruskal, Prim, Dijkstra, Huffman, ...

Selecting Breakpoints: Correctness
Theorem. The greedy algorithm is optimal.
Pf. (by contradiction) Assume greedy is not optimal, and let's see what happens.
Let 0 = g_0 < g_1 < ... < g_p = L denote the set of breakpoints chosen by greedy.
Let 0 = f_0 < f_1 < ... < f_q = L denote the set of breakpoints in an optimal solution with f_0 = g_0, f_1 = g_1, ..., f_r = g_r for the largest possible value of r.
Note: g_{r+1} > f_{r+1} by the greedy choice of the algorithm.
Greedy: g_0 g_1 g_2 ... g_r g_{r+1}
OPT:    f_0 f_1 f_2 ... f_r f_{r+1} ... f_q
Why doesn't the optimal solution drive a little further?

Selecting Breakpoints: Correctness
Theorem. The greedy algorithm is optimal.
Pf. (by contradiction) Assume greedy is not optimal.
Let 0 = g_0 < g_1 < ... < g_p = L denote the set of breakpoints chosen by greedy.
Let 0 = f_0 < f_1 < ... < f_q = L denote the set of breakpoints in an optimal solution with f_0 = g_0, f_1 = g_1, ..., f_r = g_r for the largest possible value of r.
Note: g_{r+1} > f_{r+1} by the greedy choice of the algorithm.
Greedy: g_0 g_1 g_2 ... g_r g_{r+1}
OPT:    f_0 f_1 f_2 ... f_r g_{r+1} ... f_q
Replacing f_{r+1} with g_{r+1} yields another optimal solution with one more breakpoint in common with greedy, contradicting the maximality of r.