PTAS for Bin-Packing


CS 663: Pattern Matching Algorithms
Scribe: Chen Jiang, /9/00

1. Introduction

The Bin-Packing problem is NP-hard. Using approximation algorithms, however, the Bin-Packing problem can be solved in polynomial time. For example, the simplest approximation algorithm is the First-fit algorithm, which solves the Bin-Packing problem in time O(n log n). We use the approximation factor to measure how good an approximation algorithm is. Let A(I) be the number of bins required by the approximation algorithm, and let OPT(I) be the optimal number of required bins for input I. We say that algorithm A has approximation factor C if for every input I:

    A(I) ≤ C·OPT(I)    (C ≥ 1)

This inequality means that the approximation algorithm will not use more than C times the optimal number of bins; C is an upper bound on the quality of the approximation algorithm. Obviously, the closer C is to 1, the better the approximation.

Claim: The Bin-Packing problem has a PTAS (Polynomial Time Approximation Scheme). That is, given ε > 0, one can always produce an approximation algorithm (i) whose running time is polynomial in n, and (ii) whose approximation factor is 1 + ε, i.e., A(I) ≤ (1 + ε)·OPT(I) + 1.
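As a point of reference, here is a minimal sketch of the First-fit heuristic mentioned above (the function name and the use of Python are my own choices, not part of the original notes): each item is placed in the first open bin that still has room, and a new bin is opened only when no open bin fits.

```python
def first_fit(items):
    """First-fit heuristic for Bin-Packing.

    items: iterable of sizes in (0, 1]. Returns the list of bins,
    each bin being the list of item sizes placed in it.
    (With a balanced search structure over bin capacities this can run in
    O(n log n); this sketch uses a simple O(n^2) scan for clarity.)
    """
    bins = []      # contents of each open bin
    space = []     # remaining capacity of each open bin
    for u in items:
        for j, free in enumerate(space):
            if u <= free:              # first open bin with enough room
                bins[j].append(u)
                space[j] -= u
                break
        else:                          # no open bin fits: open a new one
            bins.append([u])
            space.append(1.0 - u)
    return bins

# Example: number of bins FF(I) for a small instance.
print(len(first_fit([0.5, 0.7, 0.5, 0.2, 0.4, 0.2, 0.5, 0.1])))
```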

2. Special Cases for Bin-Packing

Before proving that the Bin-Packing problem has a PTAS, we first consider two special cases of Bin-Packing to get some interesting conclusions.

2.1 Case 1: All item sizes are less than δ

Claim 1: If the item sizes satisfy uᵢ < δ for i = 1 to n, then FF(I) ≤ (1 + 2δ)·OPT(I) + 1, where FF(I) is the number of bins used by the First-fit algorithm.

Proof
If δ ≥ 1/2: FF(I) ≤ 2·OPT(I) + 1 (a known bound for First-fit), and δ ≥ 1/2 implies 2 ≤ 1 + 2δ, so FF(I) ≤ (1 + 2δ)·OPT(I) + 1.

If δ < 1/2: Consider the situation at the end of running First-fit. Since all item sizes are less than δ, there is less than δ empty space in each bin (except, possibly, the last bin); otherwise First-fit would have placed a later item into that space. So the filled part satisfies

    Σᵢ₌₁ⁿ uᵢ ≥ (FF(I) − 1)(1 − δ),

where FF(I) is the number of bins and δ bounds the empty space within any bin. Another way of thinking about Σᵢ uᵢ: suppose we could smash all the items, blend them, and pour them into bins without leaving any spare space; even that would need ⌈Σᵢ uᵢ⌉ bins, and no packing can do better, so Σᵢ uᵢ ≤ OPT(I). Therefore

    (FF(I) − 1)(1 − δ) ≤ Σᵢ uᵢ ≤ OPT(I),  and hence  FF(I) ≤ OPT(I)/(1 − δ) + 1.

But 1/(1 − δ) ≤ 1 + 2δ, because (1 + 2δ)(1 − δ) = 1 + δ(1 − 2δ) ≥ 1 when 0 < δ < 1/2. Hence

    FF(I) ≤ OPT(I)/(1 − δ) + 1 ≤ (1 + 2δ)·OPT(I) + 1.
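As an illustrative check of the counting argument above (not part of the original notes; δ, the instance sizes, and the trial counts are my own choices), the following sketch generates random instances with all sizes below δ, runs First-fit, and verifies FF(I) ≤ Σᵢ uᵢ/(1 − δ) + 1, which together with Σᵢ uᵢ ≤ OPT(I) gives Claim 1.

```python
import random

def first_fit_count(items):
    """Number of bins used by First-fit (simple O(n^2) variant)."""
    space = []                        # remaining capacity per open bin
    for u in items:
        for j, free in enumerate(space):
            if u <= free:
                space[j] -= u
                break
        else:
            space.append(1.0 - u)
    return len(space)

delta = 0.1
for trial in range(1000):
    n = random.randint(1, 200)
    items = [random.uniform(0.0, delta) for _ in range(n)]
    ff = first_fit_count(items)
    # Filled-space argument: all bins but possibly the last are filled above 1 - delta.
    assert ff <= sum(items) / (1.0 - delta) + 1
print("bound held on all trials")
```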

Claim 1 proved.

2.2 Case 2: There are only k different sizes of items

Claim 2: If there are only k different sizes of items, OPT(I) can be found deterministically in time O(n^(2k+1)), without using an approximation algorithm.

Proof
Observation: The number of subsets of n elements is exponential (2ⁿ) in the general case, while the number of distinguishable subsets is polynomial, O(nᵏ), in the case where k is fixed.

Why? Sort the items by size, then divide the items into k groups, one per size. Denote a subset by a k-tuple giving the number of elements of each size in the subset.

Example: Suppose the items are {1/2, 1/4, 1/3, 1/7, 1/9, 1/2, 1/3, 1/4, 1/9}, so there are k (= 5) distinct sizes {1/2, 1/4, 1/3, 1/7, 1/9}. We can denote the subset {1/2, 1/4, 1/3, 1/9, 1/9} by ⟨1, 1, 1, 0, 2⟩, where the 2 represents the number of times 1/9 appears. Obviously, none of these numbers can be greater than n. So we have O(nᵏ) subsets.
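A small sketch of this k-tuple encoding (the helper name is illustrative, not from the notes): it counts, for each of the k distinct sizes, how many copies a given subset contains.

```python
from collections import Counter
from fractions import Fraction as F

def as_count_tuple(subset, distinct_sizes):
    """Encode a multiset of item sizes as a k-tuple of counts,
    one entry per distinct size (the representation used in Claim 2)."""
    c = Counter(subset)
    return tuple(c[s] for s in distinct_sizes)

sizes = [F(1, 2), F(1, 4), F(1, 3), F(1, 7), F(1, 9)]    # the k = 5 distinct sizes
subset = [F(1, 2), F(1, 4), F(1, 3), F(1, 9), F(1, 9)]   # the example subset
print(as_count_tuple(subset, sizes))                      # -> (1, 1, 1, 0, 2)
```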

Computing OPT(I), using dynamic programming:

Find the subsets that can be packed in 1 bin.
Find the subsets that can be packed in 2 bins. (Observation: each such subset must be composed of two subsets, each of which can be packed in 1 bin.)
...
Find the subsets that can be packed in OPT(I) bins.

Since OPT(I) ≤ n, there are at most n iterations, and each iteration involves O(nᵏ) subsets.

Implementation
Step 1. Sort the subsets by the sum of their sizes. Those with sum not exceeding 1 can be packed in a single bin; call them 1-bin subsets.
Step 2. For all pairs of 1-bin subsets, consider their union. Every union that is not a 1-bin subset is a 2-bin subset.
...
Step i+1. For all pairs consisting of an i-bin subset and a 1-bin subset, consider their union. If it is not already packable in i or fewer bins, then it is an (i+1)-bin subset. If the entire set is packed, we are done; otherwise continue to the next step.

Time
We consider all pairs of subsets in each iteration, so each iteration takes O((nᵏ)²) = O(n^(2k)) time. Since there are no more than n iterations, the total time is O(n^(2k+1)).

Claim 2 proved.
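The following is a minimal sketch of this dynamic program (function and variable names are mine; it assumes every item size is at most 1). Subsets are represented as count tuples, the 1-bin subsets are found first, and each further layer is built from unions of an i-bin subset with a 1-bin subset, exactly as in the steps above.

```python
from itertools import product
from fractions import Fraction as F

def opt_bins_few_sizes(sizes, counts):
    """Exact Bin-Packing when the items take only k distinct sizes.

    sizes:  the k distinct item sizes (each assumed to be at most 1)
    counts: how many items of each size the instance contains
    Returns OPT(I). A 'subset' is a k-tuple of per-size counts, as in Claim 2.
    """
    k = len(sizes)
    full = tuple(counts)
    if not any(full):
        return 0

    def weight(t):
        return sum(t[i] * sizes[i] for i in range(k))

    # All count tuples (c_1, ..., c_k) with 0 <= c_i <= counts[i]: O(n^k) of them.
    all_tuples = product(*(range(c + 1) for c in counts))
    # Step 1: nonempty subsets that fit in a single bin.
    one_bin = {t for t in all_tuples if any(t) and weight(t) <= 1}
    reachable = set(one_bin)   # subsets packable in at most `bins` bins
    frontier = set(one_bin)    # subsets whose optimum is exactly `bins`
    bins = 1
    # Step i+1: unions of an i-bin subset with a 1-bin subset.
    while full not in reachable:
        new_frontier = set()
        for a in frontier:
            for b in one_bin:
                u = tuple(a[i] + b[i] for i in range(k))
                if all(u[i] <= counts[i] for i in range(k)) and u not in reachable:
                    new_frontier.add(u)
        if not new_frontier:
            raise ValueError("some item does not fit in a single bin")
        reachable |= new_frontier
        frontier = new_frontier
        bins += 1
    return bins

# The example instance from above: {1/2, 1/4, 1/3, 1/7, 1/9, 1/2, 1/3, 1/4, 1/9}.
sizes = [F(1, 2), F(1, 4), F(1, 3), F(1, 7), F(1, 9)]
counts = [2, 2, 2, 1, 2]
print(opt_bins_few_sizes(sizes, counts))   # -> 3
```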

3. Proof of the PTAS for Bin-Packing

3.1 Special cases

To prove that the Bin-Packing problem has a PTAS, first consider two special cases. We are given I = {s₁, s₂, ..., sₙ} and ε > 0.

3.1.1 Case 1: all item sizes are no more than ε/2

Claim 3: If the item sizes satisfy uᵢ ≤ ε/2 for i = 1 to n, then A(I) ≤ (1 + ε)·OPT(I) + 1.

Proof: Use Claim 1 directly, with δ = ε/2.

3.1.2 Case 2: all item sizes are more than ε/2

Claim 4: If the item sizes satisfy uᵢ > ε/2 for i = 1 to n, then there exists a packing algorithm PA with approximation factor PA(I) ≤ (1 + ε)·OPT(I) + 1.

Proof
Fix k (we will see later how to choose it), then consider the following approximation algorithm.

Step 1. Sort I in non-increasing order. Split it into n/k groups of k elements each (the last group may have fewer).
Step 2. Pack the first group G₁ in at most k bins (G₁ has only k elements).
Step 3. Construct the set I′: discard G₁ first, and then change all numbers in each remaining group to the largest number in that group.
Step 4. Find OPT(I′) using the method of Section 2.2.

Time
1. Sorting: O(n log n)
2. Packing the first group: O(n)
3. Constructing I′: O(n)
4. Finding OPT(I′): O(n^(2n/k + 1)). Since we divide I into n/k groups, there are no more than n/k − 1 different sizes of items in I′, so we can use Claim 2.

Conclusion: the time is O(n^(2n/k + 1)).

Approximation factor
Since we use at most k bins to pack the first group and OPT(I′) bins to pack the rest of the groups, we have

    PA(I) ≤ OPT(I′) + k.
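Here is a minimal sketch of Steps 1-3, the grouping and rounding that produce I′ (the names and the illustrative sizes are mine, not from the notes); Step 4 would then hand I′, which has at most n/k − 1 distinct sizes, to an exact solver such as the Claim 2 dynamic program.

```python
def round_up_instance(items, k):
    """Steps 1-3 of Claim 4: sort in non-increasing order, split into
    groups of k items, set aside the first (largest) group G1, and round
    every remaining item up to the largest size in its group.

    Returns (G1, I_prime): the set-aside group and the rounded instance.
    """
    s = sorted(items, reverse=True)                 # Step 1: non-increasing order
    groups = [s[i:i + k] for i in range(0, len(s), k)]
    g1, rest = groups[0], groups[1:]                # Step 2 packs g1 into <= k bins
    i_prime = [g[0] for g in rest for _ in g]       # Step 3: round up to the group max
    return g1, i_prime

# Example with k = 3 (illustrative sizes, all larger than eps/2 for eps = 1):
g1, i_prime = round_up_instance(
    [0.9, 0.8, 0.7, 0.7, 0.6, 0.6, 0.6, 0.55, 0.52], 3)
print(g1)       # [0.9, 0.8, 0.7]  -> packed separately, at most one bin each
print(i_prime)  # [0.7, 0.7, 0.7, 0.6, 0.6, 0.6]  -> at most n/k - 1 distinct sizes
```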

Lemma: OPT(I′) ≤ OPT(I).

Proof
Construct a set I′′: discard the last group G_{n/k} first, and then change all numbers in each remaining group to the smallest number in that group.

Example (groups of k = 3 elements):

            G1         G2        G3        G4
    I       10 9 8     7 7 6     6 6 5     3 2 1
    I′′     8 8 8      6 6 6     5 5 5     discarded
    I′      discarded  7 7 7     6 6 6     3 3 3

Clearly OPT(I′′) ≤ OPT(I), because I′′ has fewer elements and the sizes of its elements are no larger. Also OPT(I′) ≤ OPT(I′′), because if we compare group Gᵢ of I′′ with group Gᵢ₊₁ of I′, we see that (1) every element size in I′'s group is no more than the element size in I′′'s group; and (2) the number of elements in I′'s group is no more than the number in I′′'s group. We conclude

    OPT(I′) ≤ OPT(I′′) ≤ OPT(I).
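A tiny check of the group-wise comparison used in this lemma, on the example above (purely illustrative, not in the original notes): every group of I′ is dominated, element by element, by the preceding group of I′′.

```python
# The example instance, split into groups of k = 3 (non-increasing order).
groups = [[10, 9, 8], [7, 7, 6], [6, 6, 5], [3, 2, 1]]

i_pp = [[min(g)] * len(g) for g in groups[:-1]]   # I'': round down, drop last group
i_p  = [[max(g)] * len(g) for g in groups[1:]]    # I' : round up, drop first group

# Group G_{i+1} of I' is dominated by group G_i of I'':
for gp, gpp in zip(i_p, i_pp):
    assert len(gp) <= len(gpp) and all(x <= y for x, y in zip(sorted(gp), sorted(gpp)))
print(i_pp)  # [[8, 8, 8], [6, 6, 6], [5, 5, 5]]
print(i_p)   # [[7, 7, 7], [6, 6, 6], [3, 3, 3]]
```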

3.1.3 Choose k

Approximation factor
Since PA(I) ≤ OPT(I′) + k and OPT(I′) ≤ OPT(I), we have PA(I) ≤ OPT(I) + k. Now choose

    k = ⌈ε·S⌉, where S = Σᵢ₌₁ⁿ uᵢ.

Since Σᵢ uᵢ ≤ OPT(I) (as shown in the proof of Claim 1), k ≤ ε·OPT(I) + 1, and therefore

    PA(I) ≤ OPT(I) + k ≤ (1 + ε)·OPT(I) + 1.

Time
The number of groups is n/k = n/⌈ε·S⌉ ≤ n/(ε·S). But sᵢ > ε/2 for i = 1 to n, so S = Σᵢ sᵢ > n·ε/2, and therefore

    n/k ≤ n/(ε·S) < n/(ε·(n·ε/2)) = 2/ε².

Hence the running time is O(n^(2n/k + 1)) = O(n^(4/ε² + 1)). Since ε is given (fixed), this packing algorithm is polynomial.

Claim 4 proved.

3.2 General case

Implementation
Step 1. Split the items into two sets: small (sᵢ ≤ ε/2) and large (sᵢ > ε/2).
Step 2. Handle the large items as in Claim 4.
Step 3. Use First-fit to pack the small items into the remaining space of the large-item bins. When all such space is used, open new bins, still using First-fit.

Time
Obviously polynomial: O(n^(4/ε² + 1)).

Approximation factor
Case 1: All large-item bins are filled, and new bins were opened. This is the same situation as in Claim 1, because we are using First-fit to pack the small elements (sᵢ ≤ ε/2): every bin except possibly the last is left with less than ε/2 empty space. So A(I) ≤ (1 + ε)·OPT(I) + 1.
Case 2: Not all large-item bins are filled, i.e., no new bins were opened. Then A(I) is the number of bins used for the large items, and by the conclusion of Claim 4 applied to the large items (whose optimum is at most OPT(I)), A(I) = PA(I) ≤ (1 + ε)·OPT(I) + 1.
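A minimal sketch of Step 3 of the general case (the names and the list-of-capacities interface are my own): given the remaining free space of the bins already holding large items, First-fit places each small item into the first bin with room and opens new bins only when no room is left.

```python
def pack_small_into_leftover(small_items, leftover):
    """Step 3 of the general case: First-fit the small items into the
    free space left in the large-item bins, opening new bins if needed.

    small_items: sizes of the small items (each <= eps/2)
    leftover:    remaining capacity of each bin opened for large items
    Returns the total number of bins used (old + newly opened).
    """
    space = list(leftover)             # capacities of the already-open bins
    for u in small_items:
        for j, free in enumerate(space):
            if u <= free:
                space[j] -= u
                break
        else:
            space.append(1.0 - u)      # no room anywhere: open a new bin
    return len(space)

# Example: three large-item bins with a little room left, plus five small items.
print(pack_small_into_leftover([0.1, 0.1, 0.1, 0.1, 0.1], [0.15, 0.05, 0.2]))  # -> 4
```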

3.3 Conclusion

For the Bin-Packing problem, given ε, we can always construct a Polynomial Time Approximation Scheme that uses at most (1 + ε)·OPT(I) + 1 bins and runs in time O(n^(4/ε² + 1)). This conclusion also shows that if we want to get closer to the optimal number of bins, we need to pay with more running time. For example, if we choose ε = 1/100, the running time O(n^(4/ε² + 1)) = O(n^40001) would be awful, although it is still polynomial.
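As a closing illustration of this accuracy/time trade-off (not in the original notes; the sample ε values are my own), the running-time exponent 4/ε² + 1 can be tabulated for a few accuracy targets:

```python
# Running-time exponent 4/eps^2 + 1 of the PTAS for a few accuracy targets.
for eps in (1/2, 1/4, 1/10, 1/100):
    exponent = round(4 / eps**2 + 1)
    print(f"eps = {eps}: exponent 4/eps^2 + 1 = {exponent}, i.e. O(n^{exponent})")
```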