Greedy Algorithms My T. UF


Introduction to Algorithms: Greedy Algorithms @ UF

Overview. A greedy algorithm always makes the choice that looks best at the moment: it makes a locally optimal choice in the hope of reaching a globally optimal solution. Examples: playing cards, investing in stocks, etc. Greedy algorithms do not always yield optimal solutions, but they do for some problems with optimal substructure (as in dynamic programming), and they are usually easier to code than DP.

An Activity-Selection Problem. Input: a set S of n activities a_1, a_2, ..., a_n, where activity a_i starts at time s_i and finishes at time f_i. Two activities are compatible if their intervals do not overlap. Output: a maximum-size subset of mutually compatible activities.

An Activity-Selection Problem. Assume the activities are sorted by finish time: f_1 <= f_2 <= ... <= f_n. The example on the slide lists several possible sets of mutually compatible activities and highlights the largest such subsets.

(Figure: the example activities drawn as intervals on a time line from 0 to 15.)

Optimal Substructure. Suppose an optimal solution includes activity a_k. This splits the instance into two subproblems, a_1, ..., a_{k-1} | a_k | a_{k+1}, ..., a_n: 1. select compatible activities that finish before a_k starts; 2. select compatible activities that start after a_k finishes. The solutions to the two subproblems must themselves be optimal (prove using a cut-and-paste argument).

Recursive Solution. Let S_ij be the subset of activities that start after a_i finishes and finish before a_j starts. Subproblem: find c[i, j], the maximum number of mutually compatible activities in S_ij. Recurrence:
c[i, j] = 0                                              if S_ij is empty
c[i, j] = max { c[i, k] + c[k, j] + 1 : a_k in S_ij }    otherwise

Activity Selection: Dynamic Programming. We can use dynamic programming, either by memoizing the recurrence or by filling the table bottom-up (a sketch of the memoized version follows). But a simpler approach exists.
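For concreteness, here is a minimal memoized sketch of the recurrence above in Python. This is my own illustration, not the slides' code: the sentinel activities, the function names, and the sample data in the comment (which is the CLRS example and may differ from the slide's figure) are all assumptions.

from functools import lru_cache

def max_compatible(s, f):
    # c(i, j) = max number of mutually compatible activities in S_ij,
    # where s and f are start/finish times sorted by finish time.
    n = len(s)
    # Sentinels: a_0 "finishes" at -infinity and a_{n+1} "starts" at +infinity,
    # so c(0, n+1) covers the whole instance.
    S = [float('-inf')] + list(s) + [float('inf')]
    F = [float('-inf')] + list(f) + [float('inf')]

    @lru_cache(maxsize=None)
    def c(i, j):
        best = 0
        for k in range(i + 1, j):
            if S[k] >= F[i] and F[k] <= S[j]:   # a_k lies inside S_ij
                best = max(best, c(i, k) + c(k, j) + 1)
        return best

    return c(0, n + 1)

# e.g. max_compatible([1, 3, 0, 5, 3, 5, 6, 8, 8, 2, 12],
#                     [4, 5, 6, 7, 9, 9, 10, 11, 12, 14, 16]) == 4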

Early-Finish Greedy.
1. while there are remaining activities
2.     select the activity with the earliest finish time
3.     remove the activities that are not compatible with it
4. end while
Greedy in the sense that it leaves as much opportunity as possible for the remaining activities to be scheduled: it maximizes the amount of unscheduled time remaining.

(Figures: the earliest-finish greedy applied step by step to the example activities on a time line from 0 to 15.)

Greedy-Choice Property. A locally optimal choice leads to a globally optimal solution. Theorem 16.1: if S is a non-empty activity-selection subproblem, then there is an optimal solution A of S with a_{s1} in A, where a_{s1} is the activity in S with the earliest finish time. Sketch of proof: if an optimal solution B does not contain a_{s1}, we can always replace the first activity in B with a_{s1} (it finishes no later, so compatibility is preserved). The result has the same number of activities, hence is also optimal.

Recursive Algorithm (pseudocode, initial call, and complexity shown on the slide). Given activities sorted by finish time, the recursive selection examines each activity once, so it runs in Theta(n); a sketch follows. It is straightforward to convert it to an iterative algorithm.
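A minimal recursive sketch of the earliest-finish greedy, in the spirit of CLRS's RECURSIVE-ACTIVITY-SELECTOR. The Python rendering, 0-based indexing, and names are my own assumptions, not the slides' code.

def recursive_activity_selector(s, f, k, chosen):
    # s, f: start/finish times sorted by finish time.
    # k: index of the most recently chosen activity (-1 means "none yet").
    n = len(s)
    last_finish = f[k] if k >= 0 else float('-inf')
    m = k + 1
    while m < n and s[m] < last_finish:   # skip activities that overlap a_k
        m += 1
    if m < n:
        chosen.append(m)
        return recursive_activity_selector(s, f, m, chosen)
    return chosen

# Initial call: recursive_activity_selector(s, f, -1, [])
# Each activity is examined once, so the selection is Theta(n) after sorting.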

Iterative Algorithm (pseudocode, initial call, and complexity shown on the slide). Again Theta(n) once the activities are sorted by finish time; a sketch follows.
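And the corresponding iterative version, again a sketch under the same assumptions (sorted input, names of my own choosing):

def greedy_activity_selector(s, f):
    # Earliest-finish-time greedy; Theta(n) once activities are sorted by finish time.
    chosen = []
    last_finish = float('-inf')
    for m in range(len(s)):
        if s[m] >= last_finish:          # compatible with everything chosen so far
            chosen.append(m)
            last_finish = f[m]
    return chosen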

Elements of the greedy strategy:
1. Determine the optimal substructure.
2. Develop a recursive solution.
3. Prove that if we make the greedy choice, only one subproblem remains.
4. Prove that it is safe to make the greedy choice.
5. Develop a recursive algorithm that implements the greedy strategy.
6. Convert the recursive algorithm to an iterative one.

Not All Greedy are Equal. Candidate selection rules:
Earliest finish time: select the activity with the earliest finish time (optimal).
Earliest start time: select the activity with the earliest start time.
Shortest duration: select the activity with the shortest duration d_i = f_i - s_i.
Fewest conflicts: select first the activity that conflicts with the fewest other activities.
Latest start time: select the activity with the latest start time.

Not All Greedy are Equal. Earliest start time: select the activity with the earliest start time. (The slide's figure gives a counterexample: e.g., one long activity with the earliest start can overlap several shorter, mutually compatible activities.)

Not All Greedy are Equal. Shortest duration: select the activity with the shortest duration d_i = f_i - s_i. (The slide's figure gives a counterexample: e.g., a short activity in the middle can conflict with two longer activities that are compatible with each other. Two concrete instances for both rules are sketched below.)
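Two tiny instances, my own and purely illustrative, that defeat these two rules; the (start, finish) pairs are small enough to check each claim by hand.

# Earliest start time fails: (0, 10) starts first but blocks three activities.
# Optimal = 3, earliest-start greedy = 1.
acts_earliest_start = [(0, 10), (1, 2), (3, 4), (5, 6)]

# Shortest duration fails: (4, 7) is shortest but overlaps both other activities.
# Optimal = 2 (namely (0, 5) and (6, 11)), shortest-duration greedy = 1.
acts_shortest = [(0, 5), (4, 7), (6, 11)]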

Not All Greedy are Equal. Fewest conflicts: select first the activity that conflicts with the fewest other activities (also not optimal; the slide shows a counterexample).

Not All Greedy are Equal. Latest start time: select the activity with the latest start time. Optimal? (It is, by symmetry with the earliest-finish-time rule.)

Greedy vs. Dynamic Programming. Knapsack problem: a thief robbing a store finds n items. The i-th item is worth v_i dollars and weighs w_i pounds; the thief can carry at most W pounds; v_i, w_i, and W are integers. Which items should he take to obtain the maximum total value? 0-1 knapsack: each item is either taken or not taken. Fractional knapsack: fractions of items can be taken.

Knapsack Problem. Both versions exhibit the optimal-substructure property. 0-1: if item j is removed from an optimal packing, the remaining packing is an optimal packing of weight at most W - w_j chosen from the other n - 1 items. Fractional: if w pounds of item j are removed from an optimal packing, the remaining packing is an optimal packing of weight at most W - w chosen from the other n - 1 items plus the remaining w_j - w pounds of item j.

Fractional Knapsack Problem. Solvable by the greedy strategy: compute the value per pound v_i / w_i for each item, then take as much as possible of the item with the greatest value per pound; when that item is exhausted and there is still room, take as much as possible of the item with the next greatest value per pound, and so on until there is no more room. O(n lg n), since we need to sort the items by value per pound. Greedy algorithm? Correctness? (A sketch follows.)
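A sketch of this greedy in Python; the function and variable names are mine, not the slides'.

def fractional_knapsack(values, weights, W):
    # Take items in decreasing order of value per pound;
    # O(n log n) overall because of the sort.
    order = sorted(range(len(values)),
                   key=lambda i: values[i] / weights[i],
                   reverse=True)
    total, remaining = 0.0, W
    for i in order:
        if remaining <= 0:
            break
        take = min(weights[i], remaining)   # whole item if possible, else a fraction
        total += values[i] * take / weights[i]
        remaining -= take
    return total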

0-1 Knapsack. Much harder! It cannot be solved by the greedy strategy (counterexample on the next slide). Before we can make a choice, we must compare the solution to the subproblem in which the item is included with the solution to the subproblem in which the item is excluded: dynamic programming (sketch below).
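For contrast, a sketch of the standard 0-1 knapsack dynamic program, assuming integer weights; this is an illustration, not the slides' code.

def knapsack_01(values, weights, W):
    # dp[w] = best value achievable with capacity w; O(nW) time.
    dp = [0] * (W + 1)
    for v, wt in zip(values, weights):
        # Scan capacities downward so each item is used at most once.
        for w in range(W, wt - 1, -1):
            dp[w] = max(dp[w], dp[w - wt] + v)
    return dp[W]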

Counter Example (figure on the slide).
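A standard instance of this counterexample (the one in CLRS; the numbers are quoted from there and may differ from the slide's figure): W = 50 and three items with (weight, value) = (10 lb, $60), (20 lb, $100), (30 lb, $120). Greedy by value per pound takes item 1 and then item 2 for $160 and cannot fit item 3, while the 0-1 optimum is items 2 and 3 for $220. The fractional version, by contrast, takes items 1 and 2 plus 20/30 of item 3 for $240, which is why the greedy strategy does work there.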

Matroids. Hassler Whitney coined the term matroid while studying the rows of a matrix and sets of linearly independent row vectors. A matroid M = (S, I) satisfies the following conditions:
1. S is a finite ground set.
2. I is a nonempty family of subsets of S, called the independent subsets of S, such that if B is in I and A is a subset of B, then A is in I (hereditary property).
3. If A and B are in I and |A| < |B|, then there exists x in B \ A such that A U {x} is in I (exchange property).

Example: Matrix Matroid. Ground set S: the set of rows (or columns) of a matrix. Independent sets I: the sets of linearly independent rows (columns). For the 4 x 6 matrix with columns e_1, ..., e_6
A =
  3  1  3  0  2  1
  0  2  1  1  0  0
  1  1  2  0  1  0
  2  0  0 -1  0  2
I_1 = {e_1, e_2, e_3, e_4} is linearly independent, so I_1 is an independent set; I_2 = {e_1, e_4, e_5, e_6} is not linearly independent, so I_2 is a dependent set.

Example: Graphic Matroid. Ground set S: the set of edges of a graph G = (V, E). Independent sets I: the sets of edges that do not contain a cycle. In the figure, V = {1, 2, 3, 4}; the edge set E_1 = {(1,2), (2,4), (3,4)} is acyclic, hence an independent set, while the second edge set shown, which contains a cycle, is a dependent set.

Partition matroid. The ground set S is the union of r finite disjoint sets E_i, 1 <= i <= r. The independent sets are the sets formed by taking at most one element from each E_i.

Matroid Properties. Extension: for A in I, if x is not in A and A U {x} is in I, then x is an extension of A. Maximal: if A is in I and A has no extension, then A is maximal. All maximal independent subsets in a matroid have the same size. Proof: suppose A and B are maximal independent sets with |A| < |B|. The exchange property gives an x in B \ A such that A U {x} is in I, contradicting the assumption that A is maximal.

Weighted Matroid. Assign a weight w(x) > 0 to every x in S. The weight of an independent set A in I is w(A) = sum over x in A of w(x). Problem: find an independent set A in I of maximum possible weight. The optimal independent set is always maximal. Why? (Weights are strictly positive, so extending an independent set can only increase its weight.)

Minimum Spanning Tree. Given a connected undirected graph G = (V, E) in which each edge e in E has length w(e), the minimum-spanning-tree problem asks for a subset of edges that connects all of the vertices and has minimum total length. How does this relate to weighted matroids? We must bridge "minimize total length" with "maximize the weight of an independent set".

Minimum Spanning Tree. Recall the graphic matroid M_G = (S_G, I_G): S_G is the set of edges E of G, and I_G is the set of acyclic subsets of edges. Use the weight function w'(e) = w_0 - w(e), where w_0 is larger than the maximum length of any edge. A maximal independent set then has |V| - 1 edges, i.e., it is a spanning tree.

Minimum Spanning Tree. For a maximal independent set A of |V| - 1 elements,
w'(A) = sum over e in A of (w_0 - w(e)) = (|V| - 1) w_0 - w(A),
and (|V| - 1) w_0 is a constant, so maximizing w'(A) is the same as minimizing w(A).

Weighted Matroids: Greedy Algorithm. For a matroid M = (S, I): consider the elements x in S in order of monotonically decreasing weight, and add x to the set A if A U {x} is still an independent set. Time complexity: O(n log n + n f(n)), where n log n is the sorting time and f(n) is the time to check whether A U {x} is in I. (A generic sketch follows.)
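A generic sketch of this algorithm, parameterized by an independence oracle; the oracle interface and names are my own framing, not the slides'.

def matroid_greedy(S, weight, is_independent):
    # Scan elements in order of decreasing weight and keep x whenever
    # A U {x} is still independent.  Cost: O(n log n) for the sort plus
    # n calls to is_independent, i.e. O(n log n + n f(n)).
    A = set()
    for x in sorted(S, key=weight, reverse=True):
        if is_independent(A | {x}):
            A.add(x)
    return A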

Matroid Greedy-Choice Property. Lemma 16.7: sort S into monotonically decreasing order by weight w, and let x be the first element of S such that {x} is independent. If such an x exists, then there exists an optimal subset A of S that contains x. Proof: if no such x exists, the empty set is the only independent set and hence the optimal solution. Otherwise, let B be an optimal set and assume x is not in B (if x is in B we are done). Construct A starting from {x}, and repeatedly add a new element of B \ A to A using the exchange property, until |A| = |B|.

Greedy-Choice Property. Lemma 16.7, proof continued: at the end, A = (B \ {y}) U {x} for some y in B. Since {y} is in I by heredity and x is the first element in the sorted order whose singleton is independent, w(x) >= w(y), so w(A) >= w(B). B is optimal, hence A is optimal as well.

Greedy-Choice Property. If an element x cannot be used immediately by GREEDY, it can never be used later. Proof: assume x is an extension of some independent set A in I (i.e., x can be used later), so A U {x} is in I. Since {x} is a subset of A U {x}, the hereditary property gives {x} in I, i.e., x could be used immediately. Hence, if {x} is not in I, x can never become an extension of any A.

Optimal-Substructure Property. Let x be the first element chosen by GREEDY. The remaining subproblem is to find an optimal (maximum-weight) independent subset of a new weighted matroid M' = (S', I'), the contraction of M by x, where S' = {y in S : {x, y} in I} and I' = {B, a subset of S \ {x} : B U {x} in I}. A' is an optimal independent set of M' if and only if A = A' U {x} is an optimal independent set of M.

Correctness of GREEDY. By the greedy-choice property: any element that GREEDY passes over initially will never be useful, and if GREEDY selects x, then there exists an optimal solution containing x. By the optimal-substructure property: the remaining problem after selecting x is to find an optimal solution on the new matroid M' = (S', I'), so GREEDY's subsequent choices are also correct (optimal).

Minimum spanning tree (again). Three greedy algorithms: Kruskal's (1956; Boruvka 1926), Prim's (1957; Jarnik 1930), and Sollin's (Boruvka 1926, a parallel version of Prim's algorithm). Kruskal's algorithm is exactly the matroid GREEDY algorithm: select edges in non-decreasing order of weight and add an edge if it does not create a cycle. GREEDY's correctness therefore implies Kruskal's correctness. (A sketch of Kruskal's follows.)
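A sketch of Kruskal's algorithm in this style, with the "does it create a cycle?" check done by union-find; the edge representation and names are my own assumptions.

def kruskal(n, edges):
    # n: number of vertices (0..n-1); edges: list of (weight, u, v) tuples.
    # Returns the edges of a minimum spanning tree (assuming G is connected).
    parent = list(range(n))

    def find(x):                          # union-find with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    tree = []
    for w, u, v in sorted(edges):         # non-decreasing edge weight
        ru, rv = find(u), find(v)
        if ru != rv:                      # adding (u, v) creates no cycle
            parent[ru] = rv
            tree.append((u, v, w))
    return tree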

A task-scheduling problem. Scheduling unit-time tasks with deadlines and penalties on a single processor. Given n unit-time tasks S = {a_1, a_2, ..., a_n}, deadlines d_1, ..., d_n (positive integers), and penalties w_1, ..., w_n >= 0: we incur the penalty w_i if task a_i is not finished by time d_i, and no penalty otherwise. Find a schedule that minimizes the total penalty.

A task-scheduling problem. Example: see the table on the slide. A set A of tasks is independent if we can schedule A so that no task is late. For t = 1, ..., n, let N_t(A) = the number of tasks in A with deadline t or earlier.

A task-scheduling problem. The following statements are equivalent: 1. A is independent. 2. N_t(A) <= t for all t = 1, ..., n. 3. If the tasks in A are scheduled in order of nondecreasing deadlines, then no task is late. Define M = (S, I), where S is the set of tasks and I is the family of independent sets of tasks.

A task-scheduling problem. Theorem: M = (S, I) is a matroid. Proof: I is nonempty (the empty set is independent). If A is an independent set, then clearly every subset of A can be scheduled with no late tasks (heredity). For the exchange property, assume A and B are independent sets with |A| < |B|. Let k be the largest t such that N_t(B) <= N_t(A) (such a t exists, since N_0(B) = N_0(A) = 0); then N_j(B) > N_j(A) for all j with k < j <= n. Hence B contains more tasks with deadline k + 1 than A does, so there is a task x in B \ A with deadline k + 1. Let A' = A U {x}. For t <= k, N_t(A') = N_t(A) <= t; for t > k, N_t(A') <= N_t(A) + 1 <= N_t(B) <= t. So A' is independent (exchange).

A task-scheduling problem. Minimizing the total penalty of the late tasks is the same as maximizing the total weight of an independent subset, so the GREEDY algorithm for weighted matroids finds a maximum-weight independent set of tasks. Complexity: O(n^2), since each of the O(n) independence checks takes O(n) time. (A sketch follows.)
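A direct sketch of this greedy; the code and names are mine, and the independence check below sorts, so it costs O(n log n) per check rather than the O(n) check behind the slide's O(n^2) bound.

def schedule_min_penalty(deadlines, penalties):
    # Greedily build a maximum-weight set of tasks that can all be on time;
    # every rejected task is scheduled late and pays its penalty.
    n = len(deadlines)

    def independent(tasks):
        # All tasks can be on time iff, sorted by deadline, the t-th task
        # (1-based) has deadline >= t, i.e. N_t(A) <= t for all t.
        ds = sorted(deadlines[i] for i in tasks)
        return all(d >= t for t, d in enumerate(ds, start=1))

    accepted = []
    for i in sorted(range(n), key=lambda i: penalties[i], reverse=True):
        if independent(accepted + [i]):   # matroid GREEDY step
            accepted.append(i)
    penalty = sum(penalties[i] for i in range(n) if i not in accepted)
    return accepted, penalty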

A task-scheduling problem. Example: GREEDY selects a_1, a_2, a_3, and a_4, rejects a_5 and a_6, and then selects a_7. Final optimal schedule: <a_2, a_4, a_1, a_3, a_7, a_5, a_6>, with penalty incurred w_5 + w_6 = 50.