Dynamic Programming 11/8/2009. Weighted Interval Scheduling. Unweighted Interval Scheduling: Review


Data Structures and Algorithms, Andrei Bulatov

Weighted Interval Scheduling

Weighted interval scheduling problem.
Instance: A set of n jobs. Job j starts at s_j, finishes at f_j, and has weight or value v_j. Two jobs are compatible if they don't overlap.
Objective: Find a maximum-weight subset of mutually compatible jobs.

Unweighted Interval Scheduling: Review

Recall: The greedy algorithm works if all weights are 1. Consider jobs in ascending order of finish time; add a job to the subset if it is compatible with the previously chosen jobs.
Observation: The greedy algorithm can fail spectacularly if arbitrary weights are allowed (e.g., a short job of weight 1 that finishes first blocks a long overlapping job of weight 999).

Weighted Interval Scheduling

Notation: Label jobs by finishing time: f_1 <= f_2 <= ... <= f_n.
Let p(j) be the largest index i < j such that job i is compatible with j.
Example: p(8) = 5, p(7) = 3, p(2) = 0.

Dynamic Programming: Binary Choice

Let OPT(j) denote the value of an optimal solution to the problem consisting of job requests 1, 2, ..., j.
Case 1: OPT selects job j.
  - Cannot use the incompatible jobs { p(j)+1, p(j)+2, ..., j-1 }.
  - Must include an optimal solution to the problem consisting of the remaining compatible jobs 1, 2, ..., p(j). (Optimal substructure.)
Case 2: OPT does not select job j.
  - Must include an optimal solution to the problem consisting of the remaining compatible jobs 1, 2, ..., j-1.

  OPT(j) = 0                                     if j = 0
  OPT(j) = max{ v_j + OPT(p(j)), OPT(j-1) }      otherwise

Weighted Interval Scheduling: Brute Force

Input: n, s_1,...,s_n, f_1,...,f_n, v_1,...,v_n
sort jobs by finish times so that f_1 <= f_2 <= ... <= f_n
compute p(1), p(2), ..., p(n)
return Compute-Opt(n)

Compute-Opt(j)
  if (j = 0) return 0
  else return max(v_j + Compute-Opt(p(j)), Compute-Opt(j-1))
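As a concrete illustration of the recurrence, here is a minimal Python sketch of the brute-force algorithm (the slides use pseudocode; Python, the job representation as (start, finish, value) triples, and the function names are our own). Binary search over the sorted finish times computes p(j).

```python
import bisect

def compute_p(jobs):
    # jobs: list of (start, finish, value), sorted by finish time.
    # p[j] (1-indexed) = largest i < j with f_i <= s_j, or 0 if none;
    # binary search works because the finish times are sorted.
    finishes = [f for _, f, _ in jobs]
    p = [0] * (len(jobs) + 1)
    for j, (s, _, _) in enumerate(jobs, start=1):
        p[j] = bisect.bisect_right(finishes, s, 0, j - 1)
    return p

def compute_opt(j, jobs, p):
    # OPT(j) = 0 if j = 0, else max(v_j + OPT(p(j)), OPT(j-1))
    if j == 0:
        return 0
    v = jobs[j - 1][2]
    return max(v + compute_opt(p[j], jobs, p),
               compute_opt(j - 1, jobs, p))
```

Without memoization this recursion revisits the same subproblems, which is exactly the exponential blow-up discussed next.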

Weighted Interval Scheduling: Brute Force

Observation: The recursive algorithm fails spectacularly because of redundant sub-problems, giving exponential algorithms.
The number of recursive calls for the family of "layered" instances with p(1) = 0 and p(j) = j - 2 grows like the Fibonacci sequence.

Weighted Interval Scheduling: Memoization

Memoization: Store the results of each sub-problem in a cache; look them up as needed.

Input: n, s_1,...,s_n, f_1,...,f_n, v_1,...,v_n
sort jobs by finish times so that f_1 <= f_2 <= ... <= f_n
compute p(1), p(2), ..., p(n)
set OPT[0] := 0
for j = 1 to n do
  set OPT[j] := max(v_j + OPT[p(j)], OPT[j-1])
return OPT[n]

Weighted Interval Scheduling: Running Time

Theorem: The memoized version of the algorithm takes O(n log n) time.
  - Sort by finish time: O(n log n).
  - Computing p(.): O(n) after sorting by finish time.
  - Each iteration of the for loop: O(1).
  - Overall time is O(n log n).
Remark: O(n) if jobs are pre-sorted by finish times.

Automated Memoization

Automated memoization: Many functional programming languages (e.g., Lisp) have built-in support for memoization.
Q: Why not in imperative languages (e.g., Java)?

Lisp (efficient):
  (defun F (n)
    (if (<= n 1) n
        (+ (F (- n 1)) (F (- n 2)))))

Java (exponential):
  static int F(int n) {
    if (n <= 1) return n;
    else return F(n-1) + F(n-2);
  }

Finding a Solution

The dynamic programming algorithm computes the optimal value. What if we want the solution itself? Do some post-processing:

Find-Solution(j)
  if j = 0 then
    output nothing
  else if v_j + M[p(j)] > M[j-1] then
    print j
    Find-Solution(p(j))
  else
    Find-Solution(j-1)

Knapsack

The Knapsack Problem
Instance: A set of n objects, each of which has a positive integer value v_i and a positive integer weight w_i. A weight limit W.
Objective: Select objects so that their total weight does not exceed W, and they have maximal total value.
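The bottom-up memoized table together with Find-Solution can be sketched in Python as follows (the function name and the (start, finish, value) job representation are our own, not from the slides):

```python
import bisect

def schedule(jobs):
    # jobs: list of (start, finish, value) triples, in any order.
    # Returns (optimal value, chosen job indices in finish-time order).
    jobs = sorted(jobs, key=lambda job: job[1])   # f_1 <= ... <= f_n
    finishes = [f for _, f, _ in jobs]
    n = len(jobs)
    # p[j] = largest i < j with f_i <= s_j (0 if none), by binary search
    p = [0] * (n + 1)
    for j in range(1, n + 1):
        p[j] = bisect.bisect_right(finishes, jobs[j - 1][0], 0, j - 1)
    # Bottom-up table: M[j] = max(v_j + M[p[j]], M[j-1]), M[0] = 0
    M = [0] * (n + 1)
    for j in range(1, n + 1):
        M[j] = max(jobs[j - 1][2] + M[p[j]], M[j - 1])
    # Find-Solution: walk back through the table
    chosen, j = [], n
    while j > 0:
        if jobs[j - 1][2] + M[p[j]] > M[j - 1]:
            chosen.append(j)
            j = p[j]
        else:
            j -= 1
    return M[n], sorted(chosen)
```

The walk-back mirrors Find-Solution exactly: at each j it re-checks which branch of the binary choice achieved M[j].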

Idea

A simple question: should we include the last object in the selection?
Let OPT(n,W) denote the maximal value of a selection of objects out of {1,...,n} such that the total weight of the selection doesn't exceed W.
More generally, let OPT(i,U) denote the maximal value of a selection of objects out of {1,...,i} such that the total weight of the selection doesn't exceed U.
Then
  OPT(n,W) = max{ OPT(n-1, W), OPT(n-1, W - w_n) + v_n }

Algorithm (First Try)

Knapsack(n,W)
  set V1 := Knapsack(n-1, W)
  set V2 := Knapsack(n-1, W - w_n)
  output max(V1, V2 + v_n)

Is it good enough? Already on a small instance (a handful of objects and a small weight limit W) the recursion tree has exponentially many nodes, because the same sub-problems are solved over and over.

Another Idea: Memoization

Let us store the values OPT(i,U) as we find them.
We need to store (and compute) at most n * W numbers.
We'll do it in a regular way: instead of recursion, we compute those values starting from the smaller ones, and fill up a table.

Algorithm (Second Try)

Knapsack(n,W)
  array M[0..n, 0..W]
  set M[0,w] := 0 for each w = 0,1,...,W
  for i = 1 to n do
    for w = 0 to W do
      set M[i,w] := max{ M[i-1,w], M[i-1, w - w_i] + v_i }
      (the second option is available only if w_i <= w)

The slides then trace the algorithm on a small example, filling up the table M[i,w] row by row.

Shortest Path

Suppose that every arc e of a digraph G has a length (or cost, or weight, ...) len(e). But now we allow negative lengths (weights). Then we can naturally define the length of a directed path in G, and the distance between any two nodes.

The s-t-Shortest Path Problem
Instance: Digraph G with lengths of arcs, and nodes s, t.
Objective: Find a shortest path between s and t.
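A Python sketch of the second (table-filling) algorithm; `values` and `weights` are 0-indexed lists, so object i of the slides corresponds to values[i-1] and weights[i-1]:

```python
def knapsack(values, weights, W):
    # M[i][w] = maximal value of a selection out of objects 1..i
    # with total weight at most w.
    n = len(values)
    M = [[0] * (W + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for w in range(W + 1):
            M[i][w] = M[i - 1][w]                 # skip object i
            if weights[i - 1] <= w:               # or take it, if it fits
                M[i][w] = max(M[i][w],
                              M[i - 1][w - weights[i - 1]] + values[i - 1])
    return M[n][W]
```

The table has (n+1)(W+1) entries and each takes O(1) time to fill, so the running time is O(nW), pseudo-polynomial in the weight limit.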

Shortest Path: Difficulties

Negative cycles: if the graph contains a cycle of negative total length, the shortest path is not well defined.
Greediness fails.
Adding a constant weight to all arcs fails (it penalizes paths with many arcs).

Shortest Path: Observations

Assumption: There are no negative cycles.
If graph G has no negative cycles, then there is a shortest path from s to t that is simple (i.e., does not repeat nodes), and hence has at most n - 1 arcs.
Indeed, if a shortest path P from s to t repeats a node v, then it also includes a cycle C starting and ending at v. The weight of the cycle is non-negative; therefore removing the cycle makes the path no longer.

Shortest Path: Dynamic Programming

We will be looking for a shortest path with an increasing number of arcs.
Let OPT(i,v) denote the minimum weight of a path from v to t using at most i arcs.
  - The shortest path can use i - 1 arcs. Then OPT(i,v) = OPT(i-1,v).
  - Or it can use i arcs and the first arc is (v,w). Then OPT(i,v) = len(v,w) + OPT(i-1,w).

  OPT(i,v) = min{ OPT(i-1,v), min over w in V of { OPT(i-1,w) + len(v,w) } }

Shortest Path: Bellman-Ford Algorithm

Shortest-Path(G,s,t)
  set n := |V|  /* number of nodes in G */
  array M[0..n-1, V]
  set M[0,t] := 0 and M[0,v] := infinity for each v in V - {t}
  for i = 1 to n-1 do
    for v in V do
      set M[i,v] := min{ M[i-1,v], min over w in V of { M[i-1,w] + len(v,w) } }
  return M[n-1,s]

The slides then run the algorithm on a small example graph, filling up the table M[i,v].

Shortest Path: Soundness and Running Time

Theorem: The Shortest-Path algorithm correctly computes the minimum cost of an s-t path in any graph that has no negative cycles, and runs in O(n^3) time.
Soundness follows by induction from the recurrence relation for the optimal value (DIY).
Running time: we fill up a table with n^2 entries, and each of them requires O(n) time.
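The recurrence translates into Python directly; here is a minimal sketch (nodes as integers 0..n-1 and arcs as (u, v, length) triples are representation choices of ours, not from the slides). It keeps only the previous row of the table, anticipating the space improvement discussed below.

```python
import math

def shortest_path(n, arcs, s, t):
    # Nodes are 0..n-1; arcs is a list of (u, v, length) triples.
    # Lengths may be negative; no negative cycles are assumed.
    # After round i, prev[v] = OPT(i, v): min weight of a v-t path
    # using at most i arcs (math.inf if no such path exists).
    prev = [math.inf] * n
    prev[t] = 0                       # OPT(0, t) = 0, OPT(0, v) = infinity
    for i in range(1, n):             # n - 1 rounds suffice
        cur = prev[:]                 # option 1: OPT(i, v) = OPT(i-1, v)
        for (u, v, length) in arcs:
            # option 2: the first arc of the path is (u, v)
            if prev[v] + length < cur[u]:
                cur[u] = prev[v] + length
        prev = cur
    return prev[s]
```

Because the inner loop iterates over arcs rather than over all pairs of nodes, this sketch already achieves the O(mn) bound proved next.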

Shortest Path: Running Time Improvements

Theorem: The Shortest-Path algorithm can be implemented in O(mn) time.
A big improvement for sparse graphs.

Consider the computation of the array entry M[i,v]:
  M[i,v] = min{ M[i-1,v], min over w of { M[i-1,w] + len(v,w) } }
We need only compute the minimum over the nodes w for which v has an edge to w. Let n_v denote the number of such edges; n_v is the outdegree of v. Since the outdegrees sum to m (the Handshaking lemma for digraphs), each round takes O(m) time, and over n - 1 rounds the bound is O(nm).

Shortest Path: Space Improvements

The straightforward implementation requires storing a table with n^2 entries. It can be reduced to O(n).
Instead of recording M[i,v] for each i, we use and update a single value M[v] for each node v: the length of the shortest path from v to t found so far.
Thus we use the following recurrence relation:
  M[v] = min{ M[v], min over w in V of { M[w] + len(v,w) } }

Throughout the algorithm M[v] is the length of some path from v to t, and after i rounds of updates the value M[v] is no larger than the length of the shortest path from v to t using at most i edges.

Shortest Path: Finding the Shortest Path

In the standard version we only need to keep a record of how the optimum is achieved.
Consider the space-saving version. For each node v, store the first node on its path to the destination t; denote it by first(v). Update it every time M[v] is updated.
Let P be the pointer graph P = (V, {(v, first(v)) : v in V}).

Lemma: If the pointer graph P contains a cycle C, then this cycle must have negative cost.
If w = first(v) at any time, then M[v] >= M[w] + len(v,w).
Let v_1, ..., v_k be the nodes along the cycle C, and let (v_k, v_1) be the last arc to be added. Consider the values right before this arc is added. We have
  M[v_i] >= M[v_{i+1}] + len(v_i, v_{i+1})   for i = 1, ..., k-1
and
  M[v_k] > M[v_1] + len(v_k, v_1).
Adding up all the inequalities, we get
  0 > sum for i = 1 to k-1 of len(v_i, v_{i+1}) + len(v_k, v_1),
i.e., the cost of the cycle is negative.
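The space-saving version with the first(v) pointers might look like this in Python (a sketch under the slides' assumption that G has no negative cycles; node and arc representation as in the earlier sketch are our own):

```python
import math

def shortest_path_tree(n, arcs, t):
    # O(n)-space version: one value M[v] per node, plus the pointer
    # first[v] = first node on v's current path to t.
    # Assumes no negative cycles; runs n - 1 rounds of updates.
    M = [math.inf] * n
    M[t] = 0
    first = [None] * n
    for _ in range(n - 1):
        for (u, v, length) in arcs:
            if M[v] + length < M[u]:
                M[u] = M[v] + length
                first[u] = v          # update the pointer with M[u]
    return M, first

def trace(first, v, t):
    # Follow the pointer graph P = (V, {(v, first(v))}) from v to t.
    path = [v]
    while v != t:
        v = first[v]
        path.append(v)
    return path
```

By the lemma, a cycle in the pointer graph would imply a negative cycle in G, so under the no-negative-cycle assumption `trace` always reaches t.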

Shortest Path: Finding the Shortest Path (ctd)

Lemma: Suppose G has no negative cycles, and let P be the pointer graph after termination of the algorithm. For each node v, the path in P from v to t is a shortest v-t path in G.
Observe that P is a tree. Since the algorithm terminates, we have M[v] = M[w] + len(v,w), where w = first(v). As M[t] = 0, the length of the path traced out by the pointer graph is exactly M[v], which is the shortest path distance.

Shortest Path: Finding Negative Cycles

Two questions:
  - how to decide if there is a negative cycle?
  - how to find one?
It suffices to find negative cycles C such that t can be reached from C.

Let G be a graph. The augmented graph, A(G), is obtained by adding a new node t and connecting every node in G to the new node.
As is easily seen, G contains a negative cycle if and only if A(G) contains a negative cycle C such that t is reachable from C.

Shortest Path: Finding Negative Cycles (ctd)

Extend OPT(i,v) to i >= n.
If the graph G does not contain negative cycles, then OPT(i,v) = OPT(n-1,v) for all nodes v and all i >= n. Indeed, this follows from the observation that every shortest path contains at most n - 1 arcs.

There is no negative cycle with a path to t if and only if OPT(n,v) = OPT(n-1,v) for all nodes v.
If there is no negative cycle, then OPT(n,v) = OPT(n-1,v) for all nodes v by the observation above.
Conversely, suppose OPT(n,v) = OPT(n-1,v) for all nodes v. Then
  OPT(n+1,v) = min{ OPT(n,v), min over w of { OPT(n,w) + len(v,w) } }
             = min{ OPT(n-1,v), min over w of { OPT(n-1,w) + len(v,w) } }
             = OPT(n,v),
so the values never change again. However, if a negative cycle from which t is reachable exists, then OPT(i,v) tends to minus infinity as i grows.

Let v be a node such that OPT(n,v) != OPT(n-1,v).
A path P from v to t of weight OPT(n,v) must use exactly n arcs. Any simple path can have at most n - 1 arcs; therefore P contains a cycle C.

Lemma: If G has n nodes and OPT(n,v) != OPT(n-1,v), then a path P of weight OPT(n,v) contains a cycle C, and C is negative.
Every path from v to t using fewer than n arcs has greater weight. Let w be a node that occurs in P more than once, and let C be the cycle between the two occurrences of w. Deleting C, we get a shorter path of greater weight; thus C is negative.
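Putting the pieces together, negative-cycle detection via the augmented graph A(G) can be sketched as follows (again with our integer-node, (u, v, length)-arc representation):

```python
import math

def has_negative_cycle(n, arcs):
    # Build the augmented graph A(G): a new sink t = n with a 0-length
    # arc (v, t) from every node, so any negative cycle can reach t.
    t = n
    aug = arcs + [(v, t, 0) for v in range(n)]
    M = [math.inf] * (n + 1)
    M[t] = 0
    for _ in range(n):                # (n+1) - 1 rounds on A(G)
        for (u, v, length) in aug:
            if M[v] + length < M[u]:
                M[u] = M[v] + length
    # One extra round: after n rounds, a further improvement is
    # possible iff the values keep decreasing, which by the argument
    # above happens iff a negative cycle exists.
    return any(M[v] + length < M[u] for (u, v, length) in aug)
```

To actually extract the cycle, one would additionally record the pointer graph and, starting from an improvable node, follow pointers until a node repeats, as in the lemma above.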