CSE 417. Chapter 4: Greedy Algorithms. Many Slides by Kevin Wayne. Copyright 2005 Pearson-Addison Wesley. All rights reserved.


Greed is good. Greed is right. Greed works. Greed clarifies, cuts through, and captures the essence of the evolutionary spirit. - Gordon Gekko (Michael Douglas)

Intro: Coin Changing

Coin Changing. Goal. Given currency denominations 1, 5, 10, 25, 100, give change to a customer using the fewest number of coins. Ex: 34¢. Cashier's algorithm. At each step, give the largest coin valued ≤ the amount remaining to be paid. Ex: $2.89.
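A minimal sketch of the cashier's algorithm in Python (the function name and test values are my own; the slides only describe the rule):

```python
def cashiers_algorithm(amount, denominations):
    """Greedy change-making: at each step give the largest coin <= amount owed."""
    coins = []
    for d in sorted(denominations, reverse=True):
        while amount >= d:
            coins.append(d)
            amount -= d
    return coins if amount == 0 else None   # None if exact change is impossible

# US coins: greedy happens to be optimal here.
print(cashiers_algorithm(34, [1, 5, 10, 25, 100]))   # [25, 5, 1, 1, 1, 1]

# US postal denominations (next slide): greedy is suboptimal.
print(cashiers_algorithm(140, [1, 10, 21, 34, 70, 100, 350, 1225, 1500]))
# [100, 34, 1, 1, 1, 1, 1, 1] -- 8 coins, though 70 + 70 uses only 2
```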

Coin-Changing: Does Greedy Always Work? Observation. Greedy is sub-optimal for US postal denominations: 1, 10, 21, 34, 70, 100, 350, 1225, 1500. Counterexample: 140¢. Greedy: 100, 34, 1, 1, 1, 1, 1, 1. Optimal: 70, 70.

Outline & Goals. Greedy algorithms: what they are. Pros: intuitive, often simple, often fast. Cons: often incorrect! Proofs are crucial. 3 (of many) techniques: greedy stays ahead, structural, exchange arguments.

4.1 Interval Scheduling. Proof Technique 1: greedy stays ahead

Interval Scheduling. Interval scheduling. Job j starts at s_j and finishes at f_j. Two jobs are compatible if they don't overlap. Goal: find a max-size subset of mutually compatible jobs. [Figure: jobs a-h drawn on a timeline from 0 to 11.]

Interval Scheduling: Greedy Algorithms. Greedy template. Consider jobs in some order. Take the next job provided it's compatible with the ones already taken. What order? Does that give the best answer? Why or why not? Does it help to be greedy about order?

Interval Scheduling: Greedy Algorithms. Greedy template. Consider jobs in some order. Take each job provided it's compatible with the ones already taken. [Earliest start time] Order jobs by ascending start time s_j. [Earliest finish time] Order jobs by ascending finish time f_j. [Shortest interval] Order jobs by ascending interval length f_j − s_j. [Longest interval] Reverse of the above. [Fewest conflicts] For each job j, let c_j count the number of jobs in conflict with j; order jobs by ascending c_j.

Can You Find Counterexamples? E.g., longest interval. Others?

Interval Scheduling: Greedy Algorithms. Greedy template. Consider jobs in some order. Take each job provided it's compatible with the ones already taken. [Figure: three counterexample instances, one breaking earliest start time, one breaking shortest interval, one breaking fewest conflicts.]
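One way to hunt for such counterexamples is to compare an ordering rule against a brute-force optimum on tiny instances. A sketch under my own conventions (jobs are (start, finish) pairs; the exhaustive search is exponential, so keep n small):

```python
from itertools import combinations

def compatible(jobs):
    """True iff no two intervals in jobs overlap."""
    jobs = sorted(jobs)                               # sort by start time
    return all(a[1] <= b[0] for a, b in zip(jobs, jobs[1:]))

def optimum_size(jobs):
    """Size of the largest mutually compatible subset, by exhaustive search."""
    return max(k for k in range(len(jobs) + 1)
               for sub in combinations(jobs, k) if compatible(sub))

def greedy_size(jobs, key):
    """Greedy template: consider jobs in sorted(key) order, take if compatible."""
    taken = []
    for s, f in sorted(jobs, key=key):
        if all(f <= s2 or f2 <= s for s2, f2 in taken):
            taken.append((s, f))
    return len(taken)

jobs = [(0, 10), (1, 3), (4, 7)]
print(greedy_size(jobs, key=lambda j: j[0] - j[1]))   # longest interval first: 1
print(optimum_size(jobs))                             # optimum: 2 -- counterexample!
```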

Interval Scheduling: Earliest Finish First Greedy Algorithm. Greedy algorithm. Consider jobs in increasing order of finish time. Take each job provided it's compatible with the ones already taken. Sort jobs by finish times so that f_1 ≤ f_2 ≤ ... ≤ f_n. A ← ∅ (the set of jobs selected). for j = 1 to n { if (job j compatible with A) A ← A ∪ {j} } return A. Implementation. O(n log n). Remember the job j* that was added last to A. Job j is compatible with A if s_j ≥ f_{j*}.
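The same algorithm as runnable Python (a sketch with my own names; jobs are (start, finish) pairs):

```python
def earliest_finish_first(jobs):
    """Greedy interval scheduling: O(n log n) after sorting by finish time."""
    selected = []
    last_finish = float("-inf")       # finish time f_{j*} of the job added last
    for s, f in sorted(jobs, key=lambda job: job[1]):   # f_1 <= f_2 <= ... <= f_n
        if s >= last_finish:          # compatible with everything selected so far
            selected.append((s, f))
            last_finish = f
    return selected

# A made-up instance: 8 jobs, of which at most 3 can run compatibly.
jobs = [(1, 4), (3, 5), (0, 6), (4, 7), (3, 9), (5, 9), (6, 10), (8, 11)]
print(earliest_finish_first(jobs))    # [(1, 4), (4, 7), (8, 11)]
```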

Interval Scheduling. [Animated example: earliest finish first considers jobs in finish-time order B, C, A, E, D, F, G, H on the timeline 0-11; it selects B, rejects C and A (incompatible with B), selects E, rejects D, F, and G (incompatible with E), and selects H. Final schedule: {B, E, H}.]

Interval Scheduling: Correctness. Theorem. The Earliest Finish First greedy algorithm is optimal. Pf ("greedy stays ahead"). Let g_1, ..., g_k be greedy's job picks and j_1, ..., j_m those in some optimal solution. Show f(g_r) ≤ f(j_r) by induction on r. Basis: g_1 is chosen to have min finish time, so f(g_1) ≤ f(j_1). Induction: f(g_r) ≤ f(j_r) ≤ s(j_{r+1}) (job j_{r+1} starts after g_r ends), so j_{r+1} is among the candidates considered by greedy when it picked g_{r+1}, and greedy picks the min finish, so f(g_{r+1}) ≤ f(j_{r+1}). Similarly, k ≥ m, else j_{k+1} is among the (nonempty) set of candidates for g_{k+1}. [Figure: Greedy's g_1, g_2, ..., g_r, g_{r+1} aligned above OPT's j_1, j_2, ..., j_r, j_{r+1}, ...]

4.1 Interval Partitioning. Proof Technique 2: structural

Interval Partitioning. Interval partitioning. Lecture j starts at s_j and finishes at f_j. Goal: find the minimum number of classrooms to schedule all lectures so that no two occur at the same time in the same room. Ex: This schedule uses 4 classrooms to schedule 10 lectures. [Figure: lectures a-j spread over rooms 1-4 from 9:00 to 4:30.]

Interval Partitioning as Interval Graph Coloring. Vertices = classes; edges = conflicting class pairs; different colors = different assigned rooms. Note: graph coloring is very hard in general, but graphs corresponding to interval intersections are a much simpler special case. [Figure: the 4-room schedule redrawn as a colored conflict graph on vertices A-J.]

Interval Partitioning. Interval partitioning. Lecture j starts at s_j and finishes at f_j. Goal: find the minimum number of classrooms to schedule all lectures so that no two occur at the same time in the same room. Ex: Same classes, but this schedule uses only 3 rooms. [Figure: lectures a-j packed into rooms 1-3.]

Interval Partitioning: A Structural Lower Bound on the Optimal Solution. Def. The depth of a set of open intervals is the maximum number that contain any given time (open, so there are no collisions at interval ends). Key observation. Number of classrooms needed ≥ depth. Ex: Depth of the schedule below = 3 (e.g., a, b, c all contain 9:30) ⇒ the 3-room schedule is optimal. Q. Does a schedule equal to the depth of the intervals always exist? [Figure: the 3-room schedule from the previous slide.]
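The depth lower bound itself is easy to compute with a sweep over sorted endpoints. A sketch (names are mine; because the intervals are open, a finish and a start at the same instant do not collide, so closings are processed first):

```python
def depth(intervals):
    """Maximum number of open intervals containing any single point in time."""
    events = []
    for s, f in intervals:
        events.append((s, 1))         # interval opens
        events.append((f, -1))        # interval closes
    events.sort(key=lambda e: (e[0], e[1]))   # ties: close (-1) before open (+1)
    best = cur = 0
    for _, delta in events:
        cur += delta
        best = max(best, cur)
    return best

# Three lectures that all contain 9:30 (times as fractional hours): depth 3.
print(depth([(9.0, 10.5), (9.0, 12.5), (9.0, 10.5)]))   # 3
```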

Interval Partitioning: Earliest Start First Greedy Algorithm. Greedy algorithm. Consider lectures in increasing order of start time: assign each lecture to any compatible classroom. Sort intervals by start time so that s_1 ≤ s_2 ≤ ... ≤ s_n. d ← 0 (number of allocated classrooms). for j = 1 to n { if (lecture j is compatible with some room k, 1 ≤ k ≤ d) schedule lecture j in classroom k; else allocate a new classroom d + 1, schedule lecture j in classroom d + 1, d ← d + 1 }. Implementation? Run-time? (Exercises.) Answer: O(n log n). For each classroom k, maintain the finish time of the last job added; keep the classrooms in a priority queue.
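The priority-queue implementation suggested above, sketched in Python (the bookkeeping names are mine):

```python
import heapq

def interval_partition(lectures):
    """Earliest-start-first partitioning: returns {lecture index: room number}.
    Heap entries are (finish time of the last lecture in the room, room number)."""
    assignment = {}
    rooms = []                # min-heap: the room that frees up earliest is on top
    d = 0                     # number of allocated classrooms
    for j in sorted(range(len(lectures)), key=lambda j: lectures[j][0]):
        s, f = lectures[j]
        if rooms and rooms[0][0] <= s:      # some room is free by time s: reuse it
            _, k = heapq.heappop(rooms)
        else:                               # lecture j conflicts with all d rooms
            d += 1
            k = d
        assignment[j] = k
        heapq.heappush(rooms, (f, k))       # room k is now busy until time f
    return assignment

lectures = [(9, 11), (9, 13), (9, 11), (11, 13)]   # times in hours
print(interval_partition(lectures))   # {0: 1, 1: 2, 2: 3, 3: 1} -- 3 rooms = depth
```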

Interval Partitioning: Greedy Analysis. Observation. The Earliest Start First greedy algorithm never schedules two incompatible lectures in the same classroom. Theorem. The Earliest Start First greedy algorithm is optimal. Pf (exploit structural property). Let d = number of rooms the greedy algorithm allocates. Classroom d was opened when we needed to schedule a job, say j, incompatible with all d − 1 previously used classrooms. We sorted by start time, so all incompatibilities are with lectures starting no later than s_j. So, d lectures overlap at time s_j + ε, i.e., depth ≥ d. The key observation on the earlier slide ⇒ all schedules use ≥ depth rooms, so d = depth and greedy is optimal. [Figure: d overlapping lectures at time s_j + ε.]

4.2 Scheduling to Minimize Lateness. Proof Technique 3: exchange arguments

Scheduling to Minimize Lateness. Minimizing lateness problem. A single resource processes one job at a time. Job j requires t_j units of processing time and is due at time d_j. If j starts at time s_j, it finishes at time f_j = s_j + t_j. Lateness: ℓ_j = max { 0, f_j − d_j }. Goal: schedule all jobs to minimize the max lateness L = max ℓ_j. Ex:

j:   1  2  3  4  5   6
t_j: 3  2  1  4  3   2
d_j: 6  8  9  9  14  15

[Figure: the schedule 3, 2, 6, 1, 5, 4 on the timeline 0-15; job 1 has lateness 2, job 6 has lateness 0, and job 4 has the max lateness, 6.]

Minimizing Lateness: Greedy Algorithms. Greedy template. Consider jobs in some order. [Shortest job first] Consider jobs in ascending order of processing time t_j. [Earliest deadline first] Consider jobs in ascending order of deadline d_j. [Smallest slack first] Consider jobs in ascending order of slack d_j − t_j.

Minimizing Lateness: Greedy Algorithms. Greedy template. Consider jobs in some order.

[Shortest job first] Consider jobs in ascending order of processing time t_j. Counterexample:

j:   1    2
t_j: 1   10
d_j: 100 10

[Smallest slack] Consider jobs in ascending order of slack d_j − t_j. Counterexample:

j:   1   2
t_j: 1  10
d_j: 2  10

Minimizing Lateness: Greedy Algorithm. Greedy algorithm. Earliest deadline first. Sort the n jobs by deadline so that d_1 ≤ d_2 ≤ ... ≤ d_n. t ← 0. for j = 1 to n { // assign job j to the interval [t, t + t_j]: s_j ← t, f_j ← t + t_j; t ← t + t_j }. output intervals [s_j, f_j]. Ex:

j:   1  2  3  4  5   6
t_j: 3  2  1  4  3   2
d_j: 6  8  9  9  14  15

[Figure: jobs scheduled in deadline order d_1 = 6, d_2 = 8, d_3 = 9, d_4 = 9, d_5 = 14, d_6 = 15 on the timeline 0-15.] Max lateness = 1 (also true if jobs 3 & 4 are flipped).
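The pseudocode as runnable Python, also computing the max lateness (a sketch; the names are mine):

```python
def edf_schedule(jobs):
    """jobs[j] = (t_j, d_j). Returns ({j: (s_j, f_j)}, max lateness L)."""
    order = sorted(range(len(jobs)), key=lambda j: jobs[j][1])  # by deadline d_j
    t, intervals, L = 0, {}, 0
    for j in order:
        tj, dj = jobs[j]
        intervals[j] = (t, t + tj)          # assign job j to [t, t + t_j]
        t += tj
        L = max(L, t - dj)                  # lateness l_j = max(0, f_j - d_j)
    return intervals, L

jobs = [(3, 6), (2, 8), (1, 9), (4, 9), (3, 14), (2, 15)]   # the slide's example
print(edf_schedule(jobs)[1])   # max lateness = 1
```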

Proof Strategy. A schedule is an ordered list of jobs. Suppose S_1 is any schedule and let G be the greedy algorithm's schedule. To show: Lateness(S_1) ≥ Lateness(G). Idea: find simple changes that successively transform S_1 into other schedules increasingly like G, each better (or at least no worse) than the last, until we reach G. I.e., Lateness(S_1) ≥ Lateness(S_2) ≥ Lateness(S_3) ≥ ... ≥ Lateness(G). If this works for any S_1, it works for an optimal S_1, so G is optimal. HOW? Exchange pairs of jobs.

Minimizing Lateness: No Idle Time. Notes: 1. There is an optimal schedule with no idle time. [Figure: a schedule with gaps (deadlines 4, 6, 12) vs. the same jobs shifted left to fill the gaps.] 2. The greedy schedule has no idle time.

Minimizing Lateness: Inversions. Def. An inversion in schedule S is a pair of jobs i and j such that deadline of i < deadline of j, but j is scheduled before i. [Figure: a timeline ... j ... i ...; j has the later deadline but runs earlier.] The greedy schedule has no inversions. Claim: If a schedule has an inversion, it has an adjacent inversion, i.e., a pair of inverted jobs scheduled consecutively. (Pf: If j and i aren't consecutive, look at the job k scheduled right after j. If d_k < d_j, then (j, k) is an adjacent inversion; if not, then (k, i) is an inversion, and the pair is nearer to each other; repeat.)

Minimizing Lateness: Inversions. Def. An inversion in schedule S is a pair of jobs i and j such that deadline of i < deadline of j, but j is scheduled before i. Claim: Swapping an adjacent inversion reduces the total number of inversions by exactly 1. Pf: Let i, j be an adjacent inversion. For any pair (p, q), the inversion status of (p, q) is unchanged by the i↔j swap unless {p, q} = {i, j}, and the (i, j) inversion is removed by that swap.

Minimizing Lateness: Inversions. Def. An inversion in schedule S is a pair of jobs i and j such that deadline of i < deadline of j, but j is scheduled before i. Claim. Swapping two adjacent, inverted jobs does not increase the max lateness. Pf. Let ℓ / ℓ' be the lateness before / after the swap, respectively. ℓ'_k = ℓ_k for all k ≠ i, j, and ℓ'_i ≤ ℓ_i. If job j is now late:

ℓ'_j = f'_j − d_j   (definition)
     = f_i − d_j    (j now finishes at time f_i)
     ≤ f_i − d_i    (since d_i ≤ d_j)
     = ℓ_i          (definition)

Only j moves later, but it is no later than i was, so the max is not increased. [Figure: before the swap ... j, i ...; after the swap ... i, j ..., with f'_j = f_i; j had the later deadline, so it is less tardy than i was.]

Minimizing Lateness: No Inversions. Claim. All idle-free, inversion-free schedules S have the same max lateness. Pf. If S has no inversions, then the deadlines of the scheduled jobs are monotonically nondecreasing (i.e., they increase or stay the same) as we walk through the schedule from left to right. Two such schedules can differ only in the order of jobs with the same deadlines. Within a group of jobs with the same deadline, the max lateness is the lateness of the last job in the group; the order within the group doesn't matter. [Figure: groups with deadlines 5, 10, 18; the middle group in order A B C vs. B C A ends at t = 10 either way, with the same lateness.]

Minimizing Lateness: Correctness of Greedy Algorithm. Theorem. The greedy schedule G is optimal. Pf. Let S* be an optimal schedule with the fewest inversions among all optimal schedules. We can assume S* has no idle time. If S* has an inversion, let i-j be an adjacent inversion. Swapping i and j does not increase the maximum lateness and strictly decreases the number of inversions. This contradicts the definition of S*. So S* has no inversions, hence Lateness(G) = Lateness(S*).

Minimizing Lateness: A New (Simpler?) Proof. The 7 slides above summarize the correctness proof for the earliest deadline first greedy algorithm for minimizing lateness as presented in our text and in my lecture on 1/25/19. The 6 slides below outline an alternative proof that I think is a little simpler. It uses the same core exchange-argument idea, while avoiding the correct but slightly tangential discussion of multiple jobs with the same deadline. It also feels a little more algorithm-oriented, in that it shows how to turn an arbitrary schedule into exactly the greedy schedule. Students are not required to study this, but may find it useful.

Minimizing Lateness: No Idle Time. Claim 1: There is an optimal schedule with no idle time. [Figure: a schedule S with gaps (deadlines 4, 6, 12) vs. S' with the same jobs shifted left to fill the gaps.] No job ends later in S' than in S, so the max lateness is not increased. Henceforth, assume all schedules are idle-free.

Proof Strategy. A schedule is an ordered list of jobs. (No idle time; only the order matters.) Suppose S_1 is any schedule and let G be the greedy algorithm's schedule. To show: Lateness(S_1) ≥ Lateness(G). Idea: find simple changes that successively transform S_1 into other schedules increasingly like G, each better (or at least no worse) than the last, until we reach G. I.e., Lateness(S_1) ≥ Lateness(S_2) ≥ Lateness(S_3) ≥ ... ≥ Lateness(G). If this works for any S_1, it works for an optimal S_1, so G is optimal. HOW? Exchange pairs of jobs.

Minimizing Lateness: Inversions. (Re-)number the jobs in the order that greedy schedules them. Then a schedule is just a permutation of 1..n. E.g.: G: 1 2 3 4 5; S: 4 5 1 2 3. Def. An inversion in schedule S is a pair of jobs i and j such that greedy did i before j (i.e., i < j), but S does j before i. E.g., (4, 2) in S above; also (4, 1), (5, 3), ... Claim 2: If schedule S has an inversion, it has an adjacent inversion, i.e., a pair of inverted jobs scheduled consecutively. Ex: (4, 2) are not adjacent, but (5, 1) is an adjacent inversion. Pf: An adjacent pair x, y in S is an adjacent inversion if y is smaller than x. The sublist of S from j to i must have such an adjacent pair, since i is smaller than j: a walk from high to low must have a first step down. Ex: in 4 5 1 2 3, the step from 5 to 1 is the first step down.
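The overall strategy is essentially bubble sort: repeatedly swapping adjacent inversions turns any schedule into greedy's, and Claims 3 & 4 (next slides) guarantee each swap shrinks the inversion count without increasing lateness. A sketch that demonstrates this numerically (names are mine; jobs are numbered in greedy order, so order [0, 1, ..., n-1] is G):

```python
def max_lateness(order, jobs):
    """jobs[j] = (t_j, d_j); order is a permutation of job indices."""
    t, worst = 0, 0
    for j in order:
        t += jobs[j][0]
        worst = max(worst, t - jobs[j][1])
    return worst

def transform_to_greedy(order, jobs):
    """Swap adjacent inversions until none remain (i.e., until order = G),
    printing the max lateness after each swap: it never increases."""
    order, swapped = list(order), True
    while swapped:
        swapped = False
        for k in range(len(order) - 1):
            if order[k] > order[k + 1]:          # adjacent inversion
                order[k], order[k + 1] = order[k + 1], order[k]
                print(order, max_lateness(order, jobs))
                swapped = True
    return order

jobs = [(3, 6), (2, 8), (1, 9), (4, 9), (3, 14), (2, 15)]
transform_to_greedy([3, 4, 0, 1, 2, 5], jobs)    # ends at [0, 1, 2, 3, 4, 5]
```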

Minimizing Lateness: Inversions. Claim 3: Swapping an adjacent inversion reduces the total number of inversions by exactly 1. Pf: To be clear about the definition: since S is just a list of the numbers 1..n in some order, for any p ≠ q in 1..n, (p, q) is an inversion ⇔ the larger precedes the smaller in list S. Let i, j be an adjacent inversion. The inversion status of any pair p, q is unchanged by the i↔j swap unless {p, q} = {i, j}, and the (i, j) inversion is removed by that swap. In more detail: if neither p nor q is i or j, then neither p nor q moves, so the status is unchanged. If exactly one of p, q is i or j, say p ∉ {i, j} and q = j, then since j moves only one position in the list (i and j are adjacent), it can't move to the other side of p, and again the status is unchanged.

Minimizing Lateness: Inversions. Def. An inversion in schedule S is a pair of jobs i and j such that greedy did i before j (i.e., i < j), but S does j before i. Claim 4. Swapping two adjacent, inverted jobs does not increase the max lateness. [Figure: before the swap ... j, i ...; after the swap ... i, j ..., with f'_j = f_i; j had a later or equal deadline, so it is not tardier after the swap than i was before the swap.] Pf. Let ℓ / ℓ' be the lateness before / after the swap, respectively. ℓ'_k = ℓ_k for all k ≠ i, j, and ℓ'_i ≤ ℓ_i. If job j is now late:

ℓ'_j = f'_j − d_j   (definition)
     = f_i − d_j    (j now finishes at time f_i)
     ≤ f_i − d_i    (since d_i ≤ d_j)
     = ℓ_i          (definition)

Only j moves later, but it is no later than i was, so the max is not increased.

Minimizing Lateness: Correctness of Greedy Algorithm. Theorem. The greedy schedule G is optimal. Pf. Let S_1 be an optimal schedule. If S_1 has idle time, by Claim 1 we can remove it to form S_2 without increasing lateness. If S_2 has any inversions, by Claim 2 it has an adjacent inversion, and by Claims 3 & 4 we can swap to form S_3, which has fewer inversions and no greater maximum lateness. Repeating this produces an idle-free, inversion-free schedule, which is exactly the greedy schedule G, without ever having increased lateness. Hence Lateness(G) ≤ Lateness(S_1), so G is optimal. A slightly tidier way to say this: Among all optimal schedules, let S* be one with the fewest inversions and, by Claim 1, no idle time. If S* has inversions, it has an adjacent inversion (Claim 2); swapping one decreases the number of inversions (Claim 3) without increasing the maximum lateness (Claim 4), contradicting the choice of S*. So S* has neither inversions nor idle time. But that's exactly schedule G, hence G is optimal.

Optional Exercise. Here's an outline for a third proof that is, in my opinion, even simpler. You might enjoy fleshing it out as an exercise. Defn: two vectors (u_1, u_2, ..., u_n) and (v_1, v_2, ..., v_n) are lexicographically ordered, u ≺ v, if for some i, u_1 = v_1, u_2 = v_2, ..., u_{i−1} = v_{i−1}, and u_i < v_i. I.e., they are identical in their first i − 1 positions, and u is smaller in the i-th, the first position where they differ. Ex: the 6 permutations of 1, 2, 3 in lex order: 123 132 213 231 312 321. Proof outline: Let S* be the lexicographically first idle-free optimal schedule. Argue by contradiction that S* = G, since otherwise, letting i be the first position where they differ, S* looks like (1, 2, 3, ..., i−2, i−1, x, y, ..., z, i, ...) where x ≠ i. But z must be larger than i (why?), so (z, i) is an adjacent inversion; flipping it gives a lexicographically smaller sequence of no larger max lateness, contradicting the choice of S*. (This uses Claims 1 & 4 above; Claims 2 & 3 are no longer needed.)

Greedy Analysis Strategies. Greedy algorithm stays ahead. Show that after each step of the greedy algorithm, its solution is at least as good as any other algorithm's. (Part of the cleverness is deciding what "good" means.) Structural. Discover a simple "structural" bound asserting that every possible solution must have a certain value. Then show that your algorithm always achieves this bound. (The cleverness here is usually in finding a useful structural characteristic.) Exchange argument. Gradually transform any solution to the one found by the greedy algorithm without hurting its quality. (The cleverness is usually in choosing which pair to swap.) (In all 3 cases, proving these claims may require cleverness, too.)

4.4 Shortest Paths in a Graph. You've seen this in prerequisite courses, so this section and the next two on minimum spanning trees are review. I won't lecture on them, but you should review the material. Both, but especially shortest paths, are common problems with many applications. (And, hint, hint, very frequent fodder for job interview questions ...)

Shortest Path Problem. Shortest path network. Directed graph G = (V, E). Source s, destination t. Length ℓ_e = length of edge e. Shortest path problem: find the shortest directed path from s to t (cost of path = sum of edge costs in path). [Figure: a weighted digraph on nodes s, 2, ..., 7, t.] Ex: Cost of path s-2-3-5-t = 9 + 23 + 2 + 16 = 50.

Dijkstra's Algorithm. Dijkstra's algorithm. Maintain a set of explored nodes S for which we have determined the shortest path distance d(u) from s to u. Initialize S = { s }, d(s) = 0. Repeatedly choose the unexplored node v which minimizes

π(v) = min over edges e = (u, v) with u ∈ S of ( d(u) + ℓ_e ),

i.e., the shortest path to some u in the explored part, followed by a single edge (u, v). Add v to S, and set d(v) = π(v). [Figure: the explored set S containing s and u, with an edge (u, v) of length ℓ_e leaving S.]

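A compact heap-based Dijkstra sketch in Python (this is the standard lazy-deletion formulation, not the book's code; the toy graph below is my own, echoing the slide's s-2-3-5-t path):

```python
import heapq

def dijkstra(graph, s):
    """graph[u] = list of (v, l_e) with l_e >= 0. Returns d(u) for reachable u."""
    d = {s: 0}
    explored = set()                       # the explored set S
    pq = [(0, s)]                          # candidate nodes keyed by pi(v)
    while pq:
        dist, u = heapq.heappop(pq)
        if u in explored:
            continue                       # stale entry: d(u) is already final
        explored.add(u)
        for v, length in graph.get(u, ()):
            if v not in explored and dist + length < d.get(v, float("inf")):
                d[v] = dist + length       # a better pi(v) was found
                heapq.heappush(pq, (d[v], v))
    return d

graph = {"s": [("2", 9), ("6", 14)], "2": [("3", 23)], "3": [("5", 2)],
         "5": [("t", 16)], "6": [("t", 44)]}
print(dijkstra(graph, "s")["t"])   # 50, via s-2-3-5-t
```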

Summary. Greedy algorithms are natural, often intuitive, and tend to be simple and efficient. But they are seductive, and often incorrect! E.g., change making, depending on the available denominations. So we looked at a few examples, each useful in its own right, but emphasized correctness and the various approaches to reasoning about these algorithms. Interval scheduling: greedy stays ahead. Interval partitioning: greedy matches a structural lower bound. Minimizing lateness: exchange arguments. Next: Huffman codes and another exchange argument. Also: this is a good time to review shortest paths and minimum spanning trees.