Today's Outline. CS 561, Lecture 15. Greedy Algorithms. Activity Selection. Greedy Algorithm Intro, Activity Selection, Knapsack


Today's Outline
CS 561, Lecture 15
Jared Saia, University of New Mexico

- Greedy Algorithm Intro
- Activity Selection
- Knapsack

Greedy Algorithms

"Greed is good." - Michael Douglas in Wall Street

A greedy algorithm always makes the choice that looks best at the moment. Greedy algorithms do not always lead to optimal solutions, but for many problems they do. In the next week, we will see several problems for which greedy algorithms produce optimal solutions, including activity selection and fractional knapsack. When we study graph theory, we will also see that greedy algorithms can work well for computing shortest paths and finding minimum spanning trees.

Activity Selection

You are given a list of programs to run on a single processor. Each program has a start time and a finish time. However, the processor can only run one program at any given time, and there is no preemption (i.e., once a program is running, it must be completed).

Another Motivating Problem

Suppose you are at a film fest, all movies look equally good, and you want to see as many complete movies as possible. This problem is exactly the activity selection problem.

Example

Imagine you are given the following set of start and stop times for activities. (Timeline figure omitted.)

Ideas

There are many ways to try to optimally schedule these activities.

Brute force: examine every possible subset of the activities and find the largest subset of non-overlapping activities. Q: If there are n activities, how many subsets are there? (A: 2^n.)

The book also gives a DP solution to the problem.

Greedy Activity Selector

1. Sort the activities by their finish times.
2. Schedule the first activity in this list.
3. Now go through the rest of the sorted list in order, scheduling each activity whose start time is at or after the finish time of the last scheduled activity.

(Note: code for this algorithm is in Section 16.1.)
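The three steps above can be sketched in Python; the interval endpoints in the usage example are hypothetical:

```python
def select_activities(activities):
    """Greedy activity selection: sort by finish time, then take each
    activity whose start is at or after the finish of the last one taken."""
    schedule = []
    last_finish = float("-inf")
    for start, finish in sorted(activities, key=lambda a: a[1]):
        if start >= last_finish:
            schedule.append((start, finish))
            last_finish = finish
    return schedule

movies = [(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (5, 9),
          (6, 10), (8, 11), (8, 12), (2, 14), (12, 16)]
print(select_activities(movies))  # [(1, 4), (5, 7), (8, 11), (12, 16)]
```

Note that the test `start >= last_finish` allows two activities to share an endpoint, matching step 3 above.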

Greedy Scheduling of Activities

(Timeline figures omitted: the activities sorted by their finish times, and the resulting greedy schedule.)

Analysis

Let n be the total number of activities. The algorithm first sorts the activities by finish time, taking O(n log n). Then the algorithm visits each activity exactly once, doing a constant amount of work each time; this takes O(n). Thus the total time is O(n log n).

Optimality

The big question here is: does the greedy algorithm give us an optimal solution? Perhaps surprisingly, the answer turns out to be yes. We can prove this with an exchange argument.

Proof of Optimality

Let A be the set of activities selected by the greedy algorithm. Consider any non-overlapping set of activities B. We will show that |A| >= |B| by showing that we can replace each activity in B with an activity in A. This shows that A has at least as many activities as any other non-overlapping schedule, and thus that A is optimal.

Let a_x be the first activity in A that differs from the corresponding activity in B. Then A = a_1, a_2, ..., a_x, a_{x+1}, ... and B = a_1, a_2, ..., b_x, b_{x+1}, .... Since A was chosen by the greedy algorithm, a_x must have a finish time no later than the finish time of b_x. Thus B' = a_1, a_2, ..., a_x, b_{x+1}, ... (that is, B' = B - {b_x} + {a_x}) is also a valid non-overlapping schedule. Continuing this process, we can replace each activity in B with an activity in A. QED

What?

We wanted to show that the schedule A chosen by greedy is optimal. To do this, we showed that the number of activities in A is at least as large as the number of activities in any other non-overlapping set of activities. To show this, we considered an arbitrary non-overlapping set of activities B and showed that we could replace each activity in B with an activity in A.

Greedy Pattern

- The problem has a solution that can be given some numerical value; the best (optimal) solution has the highest/lowest value.
- The solution can be broken down into steps. The steps have some order, and at each step there is a choice that makes up the solution.
- The choice is based on what's best at a given moment, so we need a criterion that distinguishes one choice from another.
- Finally, we need to prove that the solution obtained by making these local choices is indeed optimal.
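The exchange argument can also be sanity-checked empirically: on a small instance (the intervals below are hypothetical), the greedy schedule is as large as the best subset found by brute force. This check is not part of the proof, just a sketch of how one might test the claim:

```python
from itertools import combinations

def select_activities(activities):
    """Greedy: sort by finish time; take each activity that starts
    at or after the finish of the last one taken."""
    schedule, last_finish = [], float("-inf")
    for start, finish in sorted(activities, key=lambda a: a[1]):
        if start >= last_finish:
            schedule.append((start, finish))
            last_finish = finish
    return schedule

def compatible(subset):
    """True if no two intervals in the subset overlap (shared endpoints allowed)."""
    ordered = sorted(subset, key=lambda a: a[1])
    return all(ordered[i][1] <= ordered[i + 1][0] for i in range(len(ordered) - 1))

def brute_force_size(activities):
    """Size of the largest non-overlapping subset, found by exhaustive search."""
    return max(len(s) for k in range(len(activities) + 1)
               for s in combinations(activities, k) if compatible(s))

acts = [(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (6, 10), (8, 11), (2, 14)]
assert brute_force_size(acts) == len(select_activities(acts))
```

The brute-force routine is exponential in n (it examines all 2^n subsets), so it is usable only on small instances; the greedy routine handles the same instances in O(n log n).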

Activity Selection Pattern

- The value of a solution is the number of non-overlapping activities; the best solution has the highest number.
- The sorting gives the order in which to consider the activities.
- Each step examines the next activity in order and decides whether to include it.
- At each step, the greedy algorithm chooses the activity that extends the end of the schedule as little as possible, i.e., the compatible activity with the earliest finish time.

Knapsack Problem

The problems for which greedy algorithms work are a subset of those for which dynamic programming works. So it's easy to mistakenly write a dynamic program for a problem for which a greedy algorithm suffices, or to try to use a greedy algorithm when, in fact, dynamic programming is required. The knapsack problem illustrates this difference: the 0-1 knapsack problem requires dynamic programming, whereas for the fractional knapsack problem a greedy algorithm suffices.

0-1 Knapsack

The problem: a thief robbing a store finds n items; the i-th item is worth v_i dollars and weighs w_i pounds, where v_i and w_i are integers. The thief has a knapsack which can hold only W pounds, for some integer W. The thief's goal is to take as valuable a load as possible. Which items should the thief take? (This is called the 0-1 knapsack problem because each item is either taken or not taken; the thief cannot take a fractional amount.)

Fractional Knapsack

In this variant of the problem, the thief can take fractions of items rather than whole items. An item in the 0-1 knapsack is like a gold ingot, whereas an item in the fractional knapsack is like gold dust.
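The 0-1 variant can be solved with a standard dynamic program over capacities; a minimal sketch (the item values and weights in the usage line are hypothetical):

```python
def knapsack_01(items, capacity):
    """0-1 knapsack DP: best[c] is the maximum value achievable with
    capacity c. Capacities are scanned downward so that each item is
    taken at most once."""
    best = [0] * (capacity + 1)
    for value, weight in items:
        for c in range(capacity, weight - 1, -1):
            best[c] = max(best[c], best[c - weight] + value)
    return best[capacity]

# (value, weight) pairs; knapsack holds 5 pounds
print(knapsack_01([(3, 1), (5, 2), (6, 3)], 5))  # 11
```

This runs in O(nW) time and O(W) space, where n is the number of items and W the capacity.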

Greedy

We can solve the fractional knapsack problem with a greedy algorithm:

1. Compute the value per pound (v_i / w_i) for each item.
2. Sort the items by value per pound.
3. The thief then follows the greedy strategy of always taking as much as possible of the remaining item with the highest value per pound.

Analysis

If there are n items, this greedy algorithm takes O(n log n) time. We'll show in the in-class exercise that it returns the correct solution. Note, however, that this greedy algorithm does not work on the 0-1 knapsack.

Failure on 0-1 Knapsack

Say the knapsack holds weight 5, and there are three items. Let item 1 have weight 1 and value 3, let item 2 have weight 2 and value 5, and let item 3 have weight 3 and value 6. The values per pound of the items are then 3, 5/2, and 2 respectively. The greedy algorithm will choose item 1 and then item 2, for a total value of 8 (item 3 no longer fits). However, the optimal solution is to choose items 2 and 3, for a total value of 11.

Optimality of Greedy on Fractional

Greedy is not optimal on the 0-1 knapsack, but it is optimal on the fractional knapsack. To show this, we compare the value of the greedy solution directly against that of an arbitrary solution.
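The fractional greedy strategy, and its failure when restricted to whole items, can be sketched as follows, using the three items from the example above:

```python
def fractional_knapsack(items, capacity):
    """Greedy fractional knapsack: take items (fractionally, if needed)
    in decreasing order of value per pound until the knapsack is full."""
    total, remaining = 0.0, capacity
    for value, weight in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
        if remaining <= 0:
            break
        take = min(weight, remaining)   # whole item, or the fraction that fits
        total += value * take / weight
        remaining -= take
    return total

def greedy_01_knapsack(items, capacity):
    """The same greedy order applied to the 0-1 problem: take an item only
    if the whole item fits. As the lecture's example shows, this is NOT optimal."""
    total, remaining = 0, capacity
    for value, weight in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
        if weight <= remaining:
            total += value
            remaining -= weight
    return total

items = [(3, 1), (5, 2), (6, 3)]        # (value, weight) pairs
print(greedy_01_knapsack(items, 5))      # 8, but the 0-1 optimum is 11
print(fractional_knapsack(items, 5))     # 12.0 (items 1 and 2 whole, 2/3 of item 3)
```

On the 0-1 problem the greedy order yields value 8 while the optimum is 11; on the fractional problem the same order is optimal, yielding 12.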

Proof

Assume the items are sorted in decreasing order of value per pound. Let v_i be the value of item i and let w_i be its weight. Let x_i be the fraction of item i selected by greedy, and let V be the total value obtained by greedy. Consider some arbitrary solution B; let x'_i be the fraction of item i taken in B, and let V' be the total value obtained by B. We want to show that V >= V', or equivalently that V - V' >= 0.

Let k be the smallest index with x_k < 1. Note that for i < k, x_i = 1, and for i > k, x_i = 0. You will show (in the in-class exercise) that for all i,

    (x_i - x'_i) (v_i / w_i) >= (x_i - x'_i) (v_k / w_k).

Then:

    V - V' = sum_{i=1}^n x_i v_i - sum_{i=1}^n x'_i v_i        (1)
           = sum_{i=1}^n (x_i - x'_i) v_i                      (2)
           = sum_{i=1}^n (x_i - x'_i) (v_i / w_i) w_i          (3)
          >= sum_{i=1}^n (x_i - x'_i) (v_k / w_k) w_i          (4)
           = (v_k / w_k) sum_{i=1}^n (x_i - x'_i) w_i          (5)
          >= 0                                                 (6)

The last step follows because v_k / w_k is positive and because

    sum_{i=1}^n (x_i - x'_i) w_i = sum_{i=1}^n x_i w_i - sum_{i=1}^n x'_i w_i   (7)
                                 = W - W'                                       (8)
                                >= 0,                                           (9)

where W is the total weight taken by greedy and W' is the total weight taken by strategy B. We know W >= W' because greedy fills the knapsack as full as possible.

In-Class Exercise

Consider the inequality: (x_i - x'_i) (v_i / w_i) >= (x_i - x'_i) (v_k / w_k).

Q1: Show this inequality is true for i < k.
Q2: Show it's true for i = k.
Q3: Show it's true for i > k.