

IV-0 Definitions
Optimization Problem: Given an optimization function and a set of constraints, find an optimal solution.
Optimal Solution: A feasible solution for which the optimization function has the best possible value.
Feasible Solution: A solution that satisfies the constraints.
Examples
Printer problem: the constraint is to print all the jobs nonpreemptively (one at a time), and the objective is to minimize the average finish time.
Container Loading problem: the constraint is that the containers loaded have total weight at most the cargo weight capacity, and the objective is to load a largest set of containers.

IV-1 Greedy Method (Chapter 10.1) Coin Changing: give change using the least number of coins. The greedy method attempts to construct an optimal solution in stages. At each stage we make the decision that appears to be the best (under some criterion) at the time (a local optimum). A decision made in one stage is not changed in a later stage, so each decision should assure feasibility. Greedy criterion: the criterion used to make the greedy decision at each stage.
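
The stage-by-stage construction can be illustrated with a minimal sketch of greedy coin changing (the function name and the default denominations are my own; for the US coin system this greedy choice yields the fewest coins, though for arbitrary denominations it can fail to be optimal):

```python
def make_change(amount, denominations=(25, 10, 5, 1)):
    """Return {coin: count}, choosing the largest coin that still fits
    at each stage (the greedy criterion)."""
    change = {}
    for coin in sorted(denominations, reverse=True):
        count, amount = divmod(amount, coin)  # take as many of this coin as fit
        if count:
            change[coin] = count
    return change

print(make_change(67))  # -> {25: 2, 10: 1, 5: 1, 1: 2}, i.e., 6 coins
```

Note that each stage's decision (how many coins of the current denomination) is never revisited later, which is exactly the greedy discipline described above.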

IV-2 Container Loading A large ship is to be loaded with cargo. The cargo is in equal-size containers; container i has weight w_i. The cargo weight capacity is c (and every w_i ≤ c). Load the ship with the maximum number of containers without exceeding the cargo weight capacity: find values x_i ∈ {0, 1} such that Σ_{i=1}^n w_i x_i ≤ c and the objective function Σ_{i=1}^n x_i is maximized.

IV-3 Example: n = 8

w_1  w_2  w_3  w_4  w_5  w_6  w_7  w_8    c
100  200   50   90  150   50   20   80  400
  1    1    1    0    0    1    0    0  400

Is it an optimal solution? No! Why? One can take out object 2 and add objects 7 and 8; that would be a better solution. Is a feasible solution optimal if one cannot trade two objects for one (in the solution) while maintaining feasibility? Answer: no; see the following non-optimal solution.

w_1  w_2  w_3  w_4  w_5  w_6  w_7    c
 60   60   60   60   80   80   80  240
  0    0    0    0    1    1    1  240

IV-4 Algorithm Load the ship in stages, one container per stage. At each stage we need to decide which container to load. Greedy criterion: from the remaining containers, select the one with least weight. Example:

w_1  w_2  w_3  w_4  w_5  w_6  w_7  w_8    c
 20   50   50   80   90  100  150  200  400
  1    1    1    1    1    1    0    0  390
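
The greedy rule above can be sketched as follows (a minimal sketch; the function name is mine, and sorting stands in for repeatedly selecting the lightest remaining container):

```python
def load_ship(weights, c):
    """Return a 0/1 list (in the original container order) produced by the
    greedy rule: load the lightest remaining container while it fits."""
    x = [0] * len(weights)
    remaining = c
    # Visit containers from lightest to heaviest (the greedy criterion).
    for i in sorted(range(len(weights)), key=lambda i: weights[i]):
        if weights[i] <= remaining:
            x[i] = 1
            remaining -= weights[i]
    return x

# The IV-4 example: weights already sorted, c = 400.
print(load_ship([20, 50, 50, 80, 90, 100, 150, 200], 400))
# -> [1, 1, 1, 1, 1, 1, 0, 0], total loaded weight 390
```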

IV-5 Correctness Proof Theorem: The above greedy algorithm generates an optimal set of containers to load. Proof idea: no matter which feasible solution Y you start with, it is possible to transform it into the solution generated by the algorithm without decreasing the objective function value. Assume without loss of generality (wlog) that w_1 ≤ w_2 ≤ ... ≤ w_n. Let X = (x_1, x_2, ..., x_n) be the solution generated by the algorithm, and let Y = (y_1, y_2, ..., y_n) be any feasible solution, i.e., Σ w_i y_i ≤ c. We transform Y into X in several steps without decreasing the objective function value.

IV-6 From the way the algorithm works, there is a k ∈ [0, n] such that x_i = 1 for i ≤ k and x_i = 0 for i > k (i.e., X = 1, 1, ..., 1, 0, 0, ..., 0). Transformation: let j be the smallest integer in [1, n] such that x_j ≠ y_j. Either (1) no such j exists, in which case Y = X, or (2) j ≤ k (as otherwise Y would not be feasible), so x_j = 1 and y_j = 0. Change y_j to 1. If Y becomes infeasible, then there is an l in [j + 1, n] with y_l = 1; change y_l to 0, and the new Y is feasible (because w_j ≤ w_l). In either case, the new Y has at least as many 1s as before. Apply the transformation repeatedly until Y = X.

IV-7 Example for the Proof Solution generated by our algorithm (X):

w_1  w_2  w_3  w_4  w_5  w_6  w_7  w_8    c
 20   50   50   80   90  100  150  200  400
  1    1    1    1    1    1    0    0  390

Consider the following feasible solution (Y):

w_1  w_2  w_3  w_4  w_5  w_6  w_7  w_8    c
 20   50   50   80   90  100  150  200  400
  0    1    0    0    0    0    1    1  400

Transformations as in the proof of the previous theorem (j in the proof is 1, then 3, 4, 5, 6).

IV-8
w_1  w_2  w_3  w_4  w_5  w_6  w_7  w_8    c
 20   50   50   80   90  100  150  200  400
  1    1    0    0    0    0    0    1  270
  1    1    1    0    0    0    0    1  320
  1    1    1    1    0    0    0    1  400
  1    1    1    1    1    0    0    0  290
  1    1    1    1    1    1    0    0  390

Implementation: the remaining objects are stored in a heap ordered by weight (smallest on top of the heap). The algorithm takes O(n log n) time (n deletions from a heap that initially holds n objects; creating the heap takes O(n) time (CMPSC 130A)). The algorithm takes O(n) time if the optimal solution has very few (O(n / log n)) objects.
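
The heap-based implementation can be sketched with Python's standard `heapq` module (the function name is mine; `heapify` is the O(n) bottom-up heap construction, and each `heappop` costs O(log n)):

```python
import heapq

def load_ship_heap(weights, c):
    """Greedy loading via a min-heap: O(n) heapify + O(log n) per pop,
    O(n log n) overall."""
    heap = list(weights)
    heapq.heapify(heap)            # O(n) heap construction
    loaded = []
    remaining = c
    while heap and heap[0] <= remaining:
        w = heapq.heappop(heap)    # lightest remaining container
        loaded.append(w)
        remaining -= w
    return loaded

print(load_ship_heap([100, 200, 50, 90, 150, 50, 20, 80], 400))
# -> [20, 50, 50, 80, 90, 100]
```

The loop stops as soon as the lightest remaining container no longer fits; since every other remaining container is at least as heavy, none of them fits either.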

IV-9 Linear-Time Implementation!! Assume all weights are different; if there are repeated weights, a similar algorithm solves the problem. We use an algorithm (which we will describe when we cover divide-and-conquer algorithms) that finds the middle object of n objects (i.e., the element such that exactly n/2 objects are smaller than or equal to it) in O(n) time.

IV-10 Our algorithm works by performing a binary-search-like search on the unsorted weights W. Let S be the set of the smallest n/2 objects in W and let t be their total weight. There are three cases: (1) if t > c, then search for a solution in S only, with the same capacity c; (2) if t = c, then add all the objects in S to the solution and stop; (3) otherwise, add all the objects in S to the solution, set the remaining capacity to c − t, and try to add as many objects as possible from W − S. Repeat the above step until no objects are left. The time complexity is c_1 n + c_1 n/2 + c_1 n/4 + ... ≤ 2 c_1 n = O(n).

IV-11 Example 1 Suppose that W = {6, 10, 8, 15, 22, 19, 5, 9} and c = 25. The middle object is 9, S = {6, 8, 5, 9}, and t = 28. The objects in S do not fit. The new W is {6, 8, 9, 5} and c = 25. The middle object is 6, S = {6, 5}, and t = 11. The objects in S fit and are added to the solution. The new W is {8, 9} and c = 14. The middle object is 8, S = {8}, and t = 8. The object in S fits and is added to the solution. The new W is {9} and c = 6. The middle object is 9, S = {9}, and t = 9. The object in S does not fit. There are no objects left, so we are done. The solution consists of the objects with weights 5, 6, and 8.

IV-12 Example 2 Suppose W = {6, 10, 8, 15, 22, 19, 5, 9} and c = 53. The middle object is 9, S = {6, 8, 5, 9}, and t = 28. The objects in S fit and are added to the solution. The new W is {10, 15, 22, 19} and c = 25. The middle object is 15, S = {10, 15}, and t = 25. The objects in S fit exactly and the algorithm finishes. The solution consists of the objects with weights 6, 8, 5, 9, 10, and 15.
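
The procedure traced in the two examples can be sketched as follows. For brevity this sketch finds the lighter half S by sorting, so it only illustrates the recursion structure; substituting a true linear-time selection (median-finding) algorithm for the `sorted(...)` step yields the O(n) bound claimed above (function name mine; all weights assumed distinct):

```python
def load_linear(W, c):
    """Binary-search-style container loading on the unsorted weights W."""
    solution = []
    W = list(W)
    while W:
        if len(W) == 1:                  # single object: take it if it fits
            if W[0] <= c:
                solution.append(W[0])
            break
        S = sorted(W)[:len(W) // 2]      # lighter half; stand-in for O(n) selection
        t = sum(S)
        if t > c:
            W = S                        # (1) the answer lies entirely inside S
        elif t == c:
            solution.extend(S)           # (2) S fits exactly: done
            break
        else:
            solution.extend(S)           # (3) S fits: continue with W - S
            c -= t
            W = sorted(W)[len(W) // 2:]
    return solution

print(sorted(load_linear([6, 10, 8, 15, 22, 19, 5, 9], 25)))  # -> [5, 6, 8]
```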

IV-13 Deterministic Scheduling Printer Scheduling with Complete Information The problem is identical to the one in Section 10.1.1. At time zero there are n tasks to be printed. The tasks are denoted T_1, T_2, ..., T_n, with execution time requirements t_1, t_2, ..., t_n. Once a task starts printing it continues printing until it terminates (i.e., preemptions are not allowed).

IV-14 Example: t_1 = 2, t_2 = 1, t_3 = 4, t_4 = 9. Two schedules:

S1:  T1 | T3 | T4 | T2
     0    2    6    15   16

S2:  T3 | T2 | T1 | T4
     0    4    5    7    16

Let f_i(S) be the finish time of task T_i in S. The Average Finish Time (AFT) of S is (Σ f_i(S)) / n. The AFT of S1 is (2 + 16 + 6 + 15)/4 = 9.75, and the AFT of S2 is (7 + 5 + 4 + 16)/4 = 8.00. Objective function: find a schedule with minimum AFT.

T3 | T2 | T1 | T4     (0, 4, 5, 7, 16)
T2 | T3 | T1 | T4     (0, 1, 5, 7, 16)
T2 | T1 | T3 | T4     (0, 1, 3, 7, 16)

Shortest Processing Time first (SPT): assign the tasks to the printer from smallest to largest execution time.
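
The AFT computations above can be reproduced with a short sketch (the function name is mine; task orders are given as 0-based indices into the list of execution times):

```python
def aft(times, order):
    """Average finish time when the tasks run in the given 0-based order."""
    clock = total = 0
    for i in order:
        clock += times[i]   # task i finishes at the current clock value
        total += clock
    return total / len(order)

t = [2, 1, 4, 9]                                  # t_1..t_4 from the example
print(aft(t, [0, 2, 3, 1]))                       # S1 = T1,T3,T4,T2 -> 9.75
print(aft(t, [2, 1, 0, 3]))                       # S2 = T3,T2,T1,T4 -> 8.0
spt = sorted(range(len(t)), key=lambda i: t[i])   # SPT order: T2,T1,T3,T4
print(aft(t, spt))                                # -> 6.75
```

The SPT order matches the last schedule listed above (T2, T1, T3, T4), whose AFT of 6.75 is the smallest of the schedules shown.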

IV-15 SPT is Optimal Theorem: SPT schedules are optimal with respect to the AFT. Proof: By contradiction. Suppose that there is a problem instance I and a schedule S* (which is not an SPT schedule) such that S* is optimal with respect to the AFT, i.e., (Σ f_i(S*)) / n < (Σ f_i(SPT)) / n. Since S* is not an SPT schedule, there exist two tasks T_i and T_j scheduled one immediately after the other in S* (first T_i, then T_j) such that t_i > t_j.

IV-16 Construct a schedule S' from S* by interchanging tasks T_i and T_j:

S*:  ... | T_i | T_j | ...
S':  ... | T_j | T_i | ...

Since t_i > t_j, we know that f_i(S*) > f_j(S'), f_j(S*) = f_i(S'), and the finish times of the remaining tasks are identical in both schedules. Therefore (Σ f_i(S*)) / n > (Σ f_i(S')) / n. This contradicts the assumption that S* is an optimal schedule.
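
The exchange argument can be checked numerically on the IV-14 instance: for every schedule that places a longer task immediately before a shorter one, interchanging the pair strictly lowers the AFT (a sketch; the helper name is mine, and the strict inequality relies on the execution times being distinct):

```python
from itertools import permutations

def aft(times, order):
    """Average finish time when the tasks run in the given 0-based order."""
    clock = total = 0
    for i in order:
        clock += times[i]   # task i finishes at the current clock value
        total += clock
    return total / len(order)

t = [2, 1, 4, 9]            # t_1..t_4 from slide IV-14 (all distinct)
for order in permutations(range(4)):
    for p in range(3):
        if t[order[p]] > t[order[p + 1]]:      # adjacent inversion: t_i > t_j
            swapped = list(order)
            swapped[p], swapped[p + 1] = swapped[p + 1], swapped[p]
            # the interchange strictly lowers the AFT, as in the proof
            assert aft(t, swapped) < aft(t, order)
print("every interchange of an adjacent inversion lowers the AFT")
```

Only the two swapped tasks change finish times: the pair's total finish time drops by exactly t_i − t_j > 0, which is the inequality used in the proof.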