Dynamic Programming. Reading: CLRS Chapter 15 & Section 25.2. CSE 6331: Algorithms. Steve Lai
1 Dynamic Programming
Reading: CLRS Chapter 15 & Section 25.2
CSE 6331: Algorithms
Steve Lai
2 Optimization Problems
Problems that can be solved by dynamic programming are typically optimization problems.
Optimization problems: construct a set or a sequence {y_1, ..., y_k} of elements that satisfies a given constraint and optimizes a given objective function.
The closest pair problem is an optimization problem. The convex hull problem is an optimization problem.
3 Problems and Subproblems
Consider the closest pair problem: given a set of n points A = {p_1, p_2, p_3, ..., p_n}, find a closest pair in A.
Let P(i, j) denote the problem of finding a closest pair in {p_i, p_{i+1}, ..., p_j}, where 1 ≤ i ≤ j ≤ n.
We have a class of similar problems, indexed by (i, j). The original problem is P(1, n).
4 Dynamic Programming: basic ideas
Problem: construct an optimal solution (x_1, x_2, ..., x_n).
There are several options for x_1, say op_1, op_2, ..., op_d.
Each option op_k leads to a subproblem P_k: given x_1 = op_k, find an optimal solution (x_2, ..., x_n).
The best of these optimal solutions, i.e., best of {(x_1 = op_k, x_2, ..., x_n) : 1 ≤ k ≤ d}, is an optimal solution to the original problem.
DP works only if each P_k is a problem similar to the original problem.
5 Dynamic Programming: basic ideas
Apply the same reasoning to each subproblem, sub-subproblem, sub-sub-subproblem, and so on.
We have a tree of the original problem (root) and subproblems.
Dynamic programming works when these subproblems have many duplicates, are of the same type, and we can describe them using, typically, one or two parameters.
The tree of problems/subproblems (which is of exponential size) is then condensed to a smaller, polynomial-size graph. Now solve the subproblems from the "leaves".
6 Design a Dynamic Programming Algorithm
1. View the problem as constructing an optimal sequence (x_1, x_2, ..., x_n).
2. There are several options for x_1, say op_1, op_2, ..., op_d. Each option op_k leads to a subproblem.
3. Denote each problem/subproblem by a small number of parameters, the fewer the better. E.g., P(i, j), 1 ≤ i ≤ j ≤ n.
4. Define the objective function to be optimized using these parameter(s). E.g., f(i, j) = the optimal value of P(i, j).
5. Formulate a recurrence relation.
6. Determine the boundary condition and the goal.
7. Implement the algorithm.
7 Shortest Path
Problem: Let G = (V, E) be a directed acyclic graph (DAG). Let G be represented by a matrix:
  d(i, j) = length of edge (i, j), if (i, j) ∈ E
  d(i, j) = 0, if i = j
  d(i, j) = ∞, otherwise
Find a shortest path from a given node u to a given node v.
8 Dynamic Programming Solution
1. View the problem as constructing an optimal sequence (x_1, ..., x_k). Here we want to find a sequence of nodes (x_1, ..., x_k) such that (u, x_1, ..., x_k, v) is a shortest path from u to v.
2. There are several options for x_1; each option leads to a subproblem. The options for x_1 are the nodes x that have an edge from u. The subproblem corresponding to option x is: find a shortest path from x to v.
9 3. Denote each problem/subproblem by a small number of parameters, the fewer the better.
4. Define the objective function to be optimized using these parameter(s). (These two steps are usually done simultaneously.) Let f(x) denote the shortest distance from x to v.
5. Formulate a recurrence relation:
  f(x) = min{ d(x, y) + f(y) : (x, y) ∈ E }, if x ≠ v and out-degree(x) ≠ 0.
10 6. Determine the boundary condition:
  f(x) = 0, if x = v
  f(x) = ∞, if x ≠ v and out-degree(x) = 0
7. What's the goal (objective)? Our goal is to compute f(u). Once we know how to compute f(u), it will be easy to construct a shortest path from u to v. I.e., we compute the shortest distance from u to v, and then construct a path having that distance.
8. Implement the algorithm.
11 Computing f(u) (version 1)
function shortest(x)  // computing f(x) //
  global d[1..n, 1..n]
  if x = v then return(0)
  elseif out-degree(x) = 0 then return(∞)
  else return( min{ d(x, y) + shortest(y) : (x, y) ∈ E } )
Initial call: shortest(u)
Question: What's the worst-case running time?
12 Computing f(u) (version 2)
function shortest(x)  // computing f(x) //
  global d[1..n, 1..n], F[1..n], Next[1..n]
  if F[x] = ∞ then
    if x = v then F[x] ← 0
    elseif out-degree(x) = 0 then F[x] ← ∞
    else F[x] ← min{ d(x, y) + shortest(y) : (x, y) ∈ E }
         Next[x] ← the node y that yielded the min
  return(F[x])
13 Main Program
procedure shortest-path(u, v)  // find a shortest path from u to v //
  global d[1..n, 1..n], F[1..n], Next[1..n]
  initialize Next[v] ← 0
  initialize F[1..n] ← ∞
  SD ← shortest(u)  // shortest distance from u to v //
  if SD < ∞ then  // print the shortest path //
    x ← u
    while x ≠ 0 do
      write(x); x ← Next[x]
14 Time Complexity
Number of calls to shortest: O(|E|). Is it Ω(|E|) or Θ(|E|)?
How much time is spent on shortest(x) for any x?
  The first call: O(1) + time to find x's outgoing edges.
  Subsequent calls: O(1) per call.
The overall worst-case running time of the algorithm is O(|E|) + time to find all nodes' outgoing edges:
  if the graph is represented by an adjacency matrix: O(|V|²);
  if the graph is represented by adjacency lists: O(|V| + |E|).
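The memoized scheme above can be made concrete. Below is a minimal Python sketch of version 2 together with the path reconstruction from the main program; the adjacency-dict graph encoding and the example graph are illustrative assumptions, not from the slides.

```python
INF = float('inf')

def dag_shortest(graph, u, v):
    """Memoized f(x) = shortest distance from x to v, plus
    Next-pointer path reconstruction as in the main program.
    graph: node -> {successor: edge length}, assumed acyclic."""
    memo, nxt = {}, {}

    def f(x):
        if x in memo:
            return memo[x]          # subsequent calls: O(1)
        if x == v:
            memo[x] = 0
        elif not graph[x]:          # out-degree 0 and x != v: dead end
            memo[x] = INF
        else:
            best, best_y = INF, None
            for y, w in graph[x].items():
                cand = w + f(y)
                if cand < best:
                    best, best_y = cand, y
            memo[x], nxt[x] = best, best_y
        return memo[x]

    dist = f(u)
    path = [u]
    while dist < INF and path[-1] != v:
        path.append(nxt[path[-1]])
    return dist, path

# Illustrative DAG: u -> a -> b -> v is the shortest route.
g = {'u': {'a': 1, 'b': 4}, 'a': {'b': 2, 'v': 6}, 'b': {'v': 1}, 'v': {}}
# dag_shortest(g, 'u', 'v') → (4, ['u', 'a', 'b', 'v'])
```

Each node's value is computed once and cached, so the total work matches the adjacency-list bound discussed above.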
15 Forward vs Backward approach
16 Matrix-chain Multiplication
Problem: Given n matrices M_1, M_2, ..., M_n, where M_i is of dimensions d_{i−1} × d_i, we want to compute the product M_1 M_2 ⋯ M_n in a least expensive order, assuming that the cost of multiplying an a × b matrix by a b × c matrix is abc.
Example: we want to compute A · B · C, where A is 10 × 2, B is 2 × 5, and C is 5 × 10.
  Cost of computing (A · B) · C is 10·2·5 + 10·5·10 = 600.
  Cost of computing A · (B · C) is 2·5·10 + 10·2·10 = 300.
17 Dynamic Programming Solution
We want to determine an optimal sequence (x_1, x_2, ..., x_{n−1}), where x_1 means which two matrices to multiply first, x_2 means which two matrices to multiply next, ..., and x_{n−1} means which two matrices to multiply last.
Consider x_{n−1}, the multiplication performed last. (Why not x_1?)
There are n − 1 choices for x_{n−1}: (M_1 ⋯ M_k)(M_{k+1} ⋯ M_n), where 1 ≤ k < n.
A general problem/subproblem is to multiply M_i ⋯ M_j, which can be naturally denoted by P(i, j).
18 Dynamic Programming Solution
Let Cost(i, j) denote the minimum cost for computing M_i ⋯ M_j.
Recurrence relation:
  Cost(i, j) = min{ Cost(i, k) + Cost(k+1, j) + d_{i−1} d_k d_j : i ≤ k < j }, for 1 ≤ i < j ≤ n.
Boundary condition: Cost(i, i) = 0 for 1 ≤ i ≤ n.
Goal: Cost(1, n).
19 Algorithm (recursive version)
function MinCost(i, j)
  global d[0..n], Cost[1..n, 1..n], Cut[1..n, 1..n]
  // initially, Cost[i, j] ← 0 if i = j, and Cost[i, j] ← −1 if i ≠ j //
  if Cost[i, j] < 0 then
    Cost[i, j] ← min{ MinCost(i, k) + MinCost(k+1, j) + d[i−1]·d[k]·d[j] : i ≤ k < j }
    Cut[i, j] ← the index k that gave the minimum in the last statement
  return(Cost[i, j])
20 Algorithm (non-recursive version)
procedure MinCost
  global d[0..n], Cost[1..n, 1..n], Cut[1..n, 1..n]
  initialize Cost[i, i] ← 0 for 1 ≤ i ≤ n
  for i ← n − 1 downto 1 do
    for j ← i + 1 to n do
      Cost[i, j] ← min{ Cost(i, k) + Cost(k+1, j) + d[i−1]·d[k]·d[j] : i ≤ k < j }
      Cut[i, j] ← the index k that gave the minimum in the last statement
21 Computing M_i ⋯ M_j
function MatrixProduct(i, j)  // return the product M_i ⋯ M_j //
  global Cut[1..n, 1..n], M_1, ..., M_n
  if i = j then return(M_i)
  else
    k ← Cut[i, j]
    return( MatrixProduct(i, k) · MatrixProduct(k+1, j) )
Time complexity of computing the Cost and Cut tables: Θ(n³).
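The matrix-chain recurrence and its Cut-table reconstruction can be sketched in Python (a sketch only; function and variable names are mine, and indices are kept 1-based to match the slides):

```python
INF = float('inf')

def matrix_chain(d):
    """cost[i][j] = minimum cost of computing Mi ... Mj, where Mi is
    d[i-1] x d[i]; cut[i][j] records the k achieving the minimum.
    Bottom-up, in the order of the non-recursive version."""
    n = len(d) - 1
    cost = [[0] * (n + 1) for _ in range(n + 1)]
    cut = [[0] * (n + 1) for _ in range(n + 1)]
    for i in range(n - 1, 0, -1):            # i <- n-1 downto 1
        for j in range(i + 1, n + 1):        # j <- i+1 to n
            best, best_k = INF, i
            for k in range(i, j):            # i <= k < j
                c = cost[i][k] + cost[k + 1][j] + d[i - 1] * d[k] * d[j]
                if c < best:
                    best, best_k = c, k
            cost[i][j], cut[i][j] = best, best_k
    return cost, cut

def parenthesize(cut, i, j):
    """Render the optimal order, mirroring MatrixProduct."""
    if i == j:
        return f"M{i}"
    k = cut[i][j]
    return f"({parenthesize(cut, i, k)} {parenthesize(cut, k + 1, j)})"

# The A, B, C example: A is 10x2, B is 2x5, C is 5x10 -> d = [10, 2, 5, 10]
cost, cut = matrix_chain([10, 2, 5, 10])
# cost[1][3] → 300; parenthesize(cut, 1, 3) → "(M1 (M2 M3))"
```

The three nested loops make the Θ(n³) bound explicit: Θ(n²) table entries, each taking O(n) to fill.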
22 Paragraphing
Problem: Typeset a sequence of words w_1, w_2, ..., w_n into a paragraph with minimum cost (penalty).
  w_i : length of word i.
  L : length of each line.
  b : ideal width of space between two words.
  ε : minimum required width of space between words.
  b′ : actual width of space between words if the line is right-justified.
Assume that w_i + ε + w_{i+1} ≤ L for all i.
23 If words w_i, w_{i+1}, ..., w_j are typeset as a line, where j ≠ n, the value of b′ for that line is b′ = (L − Σ_{k=i}^{j} w_k) / (j − i), and the penalty is defined as:
  Cost(i, j) = |b′ − b|, if b′ ≥ ε
  Cost(i, j) = ∞, if b′ < ε
Right justification is not needed for the last line. So the width of space used for typesetting w_i, w_{i+1}, ..., w_j when j = n is min(b′, b), and the penalty is:
  Cost(i, j) = |b′ − b|, if ε ≤ b′ < b
  Cost(i, j) = 0, if b′ ≥ b
  Cost(i, j) = ∞, if b′ < ε
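The slides define only the per-line cost; the paragraph-level recurrence follows the generic pattern of the shortest-path example (f(i) = best way to typeset the suffix w_i..w_n). A Python sketch under stated assumptions: the |b′ − b| penalty form is my reading of the slide, and one-word lines are charged 0 if they fit, which the slide leaves unspecified.

```python
INF = float('inf')

def paragraph_cost(word_lens, L, b, eps):
    """f(i) = min{ Cost(i, j) + f(j+1) : line w_i..w_j }, f(n+1) = 0.
    word_lens[i-1] is the length of w_i; L, b, eps as defined above."""
    n = len(word_lens)

    def line_cost(i, j):                    # penalty of a line w_i..w_j
        total = sum(word_lens[i - 1:j])
        if j == i:                          # one-word line: no spaces
            return 0 if total <= L else INF
        bp = (L - total) / (j - i)          # actual space width b'
        if bp < eps:
            return INF                      # words don't fit
        if j == n:                          # last line: no justification
            return 0 if bp >= b else abs(bp - b)
        return abs(bp - b)

    f = [INF] * (n + 2)
    f[n + 1] = 0.0
    for i in range(n, 0, -1):
        for j in range(i, n + 1):
            c = line_cost(i, j)
            if c < INF and c + f[j + 1] < f[i]:
                f[i] = c + f[j + 1]
    return f[1]
```

With O(n²) line candidates and O(n) work per candidate for the naive sum, this runs in O(n³); prefix sums would bring it to O(n²).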
24 Longest Common Subsequence
Problem: Given two sequences A = (a_1, a_2, ..., a_n) and B = (b_1, b_2, ..., b_n), find a longest common subsequence of A and B.
To solve it by dynamic programming, we view the problem as finding an optimal sequence (x_1, x_2, ..., x_k) and ask: what choices are there for x_1? (Or: what choices are there for x_2?)
25 Approach 1 (not efficient)
View (x_1, x_2, ...) as a subsequence of A. So the choices for x_1 are a_1, a_2, ..., a_n.
Let L(i, j) denote the length of a longest common subsequence of A_i = (a_i, a_{i+1}, ..., a_n) and B_j = (b_j, b_{j+1}, ..., b_n).
Let ϕ(k, j) be the index of the first character in (b_j, ..., b_n) that is equal to a_k, or n+1 if there is no such character.
Recurrence:
  L(i, j) = max{ 1 + L(k+1, ϕ(k, j)+1) : i ≤ k ≤ n, ϕ(k, j) ≤ n }
  L(i, j) = 0, if the set for the max is empty
Boundary condition: L(n+1, j) = L(i, n+1) = 0, for 1 ≤ i, j ≤ n+1.
Running time: Θ(n³) + O(n³) = Θ(n³).
26 Approach 2 (not efficient)
View (x_1, x_2, ...) as a sequence of 0/1 decisions, where x_i indicates whether or not to include a_i. The choices for each x_i are 0 and 1.
Let L(i, j) denote the length of a longest common subsequence of A_i = (a_i, a_{i+1}, ..., a_n) and B_j = (b_j, b_{j+1}, ..., b_n).
Recurrence:
  L(i, j) = max{ 1 + L(i+1, ϕ(i, j)+1), L(i+1, j) }, if ϕ(i, j) ≤ n
  L(i, j) = L(i+1, j), otherwise
Running time: Θ(n²) + O(n³).
27 Algorithm (non-recursive)
procedure Compute-Array-L
  global L[1..n+1, 1..n+1], ϕ[1..n, 1..n]
  initialize L[i, n+1] ← 0 and L[n+1, j] ← 0 for 1 ≤ i, j ≤ n+1
  compute ϕ[1..n, 1..n]
  for i ← n downto 1 do
    for j ← n downto 1 do
      if ϕ(i, j) ≤ n then
        L[i, j] ← max{ 1 + L[i+1, ϕ(i, j)+1], L[i+1, j] }
      else L[i, j] ← L[i+1, j]
28 Algorithm (recursive)
procedure Longest(i, j)  // print the longest common subsequence //
  // assume L[1..n+1, 1..n+1] has been computed //
  global L[1..n+1, 1..n+1]
  if L[i, j] = 0 then return  // nothing left to print //
  if L[i, j] = L[i+1, j] then
    Longest(i+1, j)
  else
    Print(a_i)
    Longest(i+1, ϕ(i, j)+1)
Initial call: Longest(1, 1)
29 Approach 3
View (x_1, x_2, ...) as a sequence of decisions, where x_k indicates whether to include a_i = b_j (if a_i = b_j), or to exclude a_i or exclude b_j (if a_i ≠ b_j).
Let L(i, j) denote the length of a longest common subsequence of A_i = (a_i, a_{i+1}, ..., a_n) and B_j = (b_j, b_{j+1}, ..., b_n).
Recurrence:
  L(i, j) = L(i+1, j+1) + 1, if a_i = b_j
  L(i, j) = max{ L(i+1, j), L(i, j+1) }, if a_i ≠ b_j
Boundary: L(i, j) = 0, if i = n+1 or j = n+1.
Running time: Θ(n²).
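Approach 3 is the standard quadratic LCS algorithm. A Python sketch of the table computation plus a printing pass that replays the decisions (the example strings are illustrative, taken from CLRS):

```python
def lcs(A, B):
    """L[i][j] = LCS length of (a_i..a_n) and (b_j..b_m), filled by the
    Approach 3 recurrence over suffixes; then one LCS is recovered by
    replaying the decisions, as the recursive printing procedure does."""
    n, m = len(A), len(B)
    L = [[0] * (m + 2) for _ in range(n + 2)]
    for i in range(n, 0, -1):
        for j in range(m, 0, -1):
            if A[i - 1] == B[j - 1]:
                L[i][j] = L[i + 1][j + 1] + 1
            else:
                L[i][j] = max(L[i + 1][j], L[i][j + 1])
    out, i, j = [], 1, 1
    while i <= n and j <= m and L[i][j] > 0:
        if A[i - 1] == B[j - 1]:
            out.append(A[i - 1]); i += 1; j += 1
        elif L[i + 1][j] >= L[i][j + 1]:
            i += 1                       # excluding a_i loses nothing
        else:
            j += 1                       # excluding b_j loses nothing
    return L[1][1], ''.join(out)

# lcs("ABCBDAB", "BDCABA")[0] → 4 (the classic CLRS Chapter 15 example)
```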
30 All-Pair Shortest Paths
Problem: Let G = (V, E) be a weighted directed graph. For every pair of nodes u, v, find a shortest path from u to v.
DP approach: for all u, v ∈ V, we are looking for an optimal sequence (x_1, x_2, ..., x_k). What choices are there for x_1? To answer this, we need to know the meaning of x_1.
31 Approach 1
x_1: the next node. What choices are there for x_1? How to describe a subproblem?
What about L(i, j) = min{ d(i, z) + L(z, j) : (i, z) ∈ E }? (On a graph with cycles this recurrence is circular, so we add a parameter.)
Let L^k(i, j) denote the length of a shortest path from i to j with at most k intermediate nodes. Then
  L^k(i, j) = min{ d(i, z) + L^{k−1}(z, j) : (i, z) ∈ E }.
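A direct Python rendering of this recurrence. The boundary (L^0 = d and L^k(i, i) = 0) is implicit on the slide, so it is made an explicit assumption here, and the matrix encoding is illustrative:

```python
INF = float('inf')

def shortest_at_most_k(d, K):
    """Iterate L^k(i,j) = min{ d(i,z) + L^(k-1)(z,j) : (i,z) in E }
    K times, starting from L^0 = d. d[i][j] is the edge length, 0 on
    the diagonal, INF for missing edges. With K = n-2 this yields all
    shortest path lengths (assuming no negative cycles)."""
    n = len(d)
    L = [row[:] for row in d]                      # L^0 = d
    for _ in range(K):
        # Keep L^(k-1)(i,j) as a candidate so nodes with no outgoing
        # edges are handled safely (an assumption, not on the slide).
        L = [[0 if i == j else
              min([L[i][j]] + [d[i][z] + L[z][j]
                               for z in range(n)
                               if z != i and d[i][z] < INF])
              for j in range(n)] for i in range(n)]
    return L

# 0 -> 1 -> 2 (cost 7) beats the direct edge 0 -> 2 (cost 10):
dist0 = [[0, 3, 10], [INF, 0, 4], [INF, INF, 0]]
# shortest_at_most_k(dist0, 0)[0][2] → 10
# shortest_at_most_k(dist0, 1)[0][2] → 7
```

Each of the K rounds costs O(n³), which motivates the cheaper parameterization of Approach 2.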
32 Approach 2
x_1: going through a node or not? What choices are there for x_1? Taking the backward approach, we ask whether to go through node n or not.
Let D^k(i, j) be the length of a shortest path from i to j with intermediate nodes in {1, 2, ..., k}.
Then D^k(i, j) = min{ D^{k−1}(i, j), D^{k−1}(i, k) + D^{k−1}(k, j) }.
  D^0(i, j) = weight of edge (i, j), if (i, j) ∈ E
  D^0(i, j) = 0, if i = j
  D^0(i, j) = ∞, otherwise      (1)
33 Straightforward implementation
initialize D^0[1..n, 1..n] by Eq. (1)
for k ← 1 to n do
  for i ← 1 to n do
    for j ← 1 to n do
      if D^{k−1}[i, k] + D^{k−1}[k, j] < D^{k−1}[i, j] then
        D^k[i, j] ← D^{k−1}[i, k] + D^{k−1}[k, j]
        P^k[i, j] ← 1
      else
        D^k[i, j] ← D^{k−1}[i, j]
        P^k[i, j] ← 0
34 Print paths
procedure Path(k, i, j)  // shortest path from i to j without going through k+1, ..., n //
  global D^k[1..n, 1..n], P^k[1..n, 1..n], 0 ≤ k ≤ n
  if k = 0 then
    if i = j then print(i)
    elseif D^0(i, j) < ∞ then print(i, j)
    else print("no path")
  elseif P^k[i, j] = 1 then Path(k−1, i, k); Path(k−1, k, j)
  else Path(k−1, i, j)
35 Print paths
procedure ShortestPath(i, j)  // shortest path from i to j //
  global D^k[1..n, 1..n], P^k[1..n, 1..n], 0 ≤ k ≤ n
  let k ← the largest k such that P^k[i, j] = 1; k ← 0 if there is no such k
  if k = 0 then
    if i = j then print(i)
    elseif D^0(i, j) < ∞ then print(i, j)
    else print("no path")
  else ShortestPath(i, k); ShortestPath(k, j)
36 Eliminating the superscript k in D^k[1..n, 1..n] and P^k[1..n, 1..n]
If i ≠ k and j ≠ k: we need D^{k−1}[i, j] only for computing D^k[i, j]. Once D^k[i, j] is computed, we don't need to keep D^{k−1}[i, j].
If i = k or j = k: D^k[i, j] = D^{k−1}[i, j].
What does P^k[i, j] indicate? We only need to know the largest k such that P^k[i, j] = 1.
37 Floyd's Algorithm
initialize D[1..n, 1..n] by Eq. (1)
initialize P[1..n, 1..n] ← 0
for k ← 1 to n do
  for i ← 1 to n do
    for j ← 1 to n do
      if D[i, k] + D[k, j] < D[i, j] then
        D[i, j] ← D[i, k] + D[k, j]
        P[i, j] ← k
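A runnable sketch of Floyd's algorithm with path recovery. Indices are 0-based here, so P[i][j] stores the last k that improved D[i][j] with −1 playing the role of P[i, j] = 0 ("direct edge"); the example graph is illustrative:

```python
INF = float('inf')

def floyd(d):
    """In-place Floyd-Warshall over a single D table."""
    n = len(d)
    D = [row[:] for row in d]
    P = [[-1] * n for _ in range(n)]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if D[i][k] + D[k][j] < D[i][j]:
                    D[i][j] = D[i][k] + D[k][j]
                    P[i][j] = k          # path goes through node k

    return D, P

def path(D, P, i, j):
    """Recover a shortest path by splitting at the recorded node k."""
    if P[i][j] == -1:
        if i == j:
            return [i]
        return [i, j] if D[i][j] < INF else None   # direct edge / no path
    k = P[i][j]
    return path(D, P, i, k) + path(D, P, k, j)[1:]  # drop duplicated k

# Illustrative graph: the shortest 0 -> 3 route is 0 -> 1 -> 2 -> 3.
w = [[0, 5, INF, 10],
     [INF, 0, 3, INF],
     [INF, INF, 0, 1],
     [INF, INF, INF, 0]]
D, P = floyd(w)
# D[0][3] → 9, path(D, P, 0, 3) → [0, 1, 2, 3]
```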
38 Longest Nondecreasing Subsequence
Problem: Given a sequence of integers A = (a_1, a_2, ..., a_n), find a longest nondecreasing subsequence of A.
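The slide states the problem only; below is a standard O(n²) DP sketch in the spirit of the earlier sections (not necessarily the course's intended solution), with f(i) = length of a longest nondecreasing subsequence ending at position i:

```python
def longest_nondecreasing(a):
    """f[i] = 1 + max{ f[j] : j < i, a[j] <= a[i] }, or 1 if no such j;
    prev[] pointers recover one optimal subsequence."""
    n = len(a)
    if n == 0:
        return 0, []
    f, prev = [1] * n, [-1] * n
    for i in range(n):
        for j in range(i):
            if a[j] <= a[i] and f[j] + 1 > f[i]:
                f[i], prev[i] = f[j] + 1, j
    i = max(range(n), key=f.__getitem__)   # end of an optimal subsequence
    seq = []
    while i != -1:
        seq.append(a[i]); i = prev[i]
    return max(f), seq[::-1]

# longest_nondecreasing([3, 1, 2, 2, 5, 4]) → (4, [1, 2, 2, 5])
```

An O(n log n) variant exists (patience sorting with binary search), but the quadratic recurrence is the one that matches the design recipe of slide 6.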
39 Sum of Subset
Given a positive integer M and a multiset of positive integers A = {a_1, a_2, ..., a_n}, determine if there is a subset B ⊆ A such that Sum(B) = M, where Sum(B) denotes the sum of integers in B. This problem is NP-hard.
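NP-hardness rules out a polynomial-time algorithm unless P = NP, yet the problem has a classic pseudo-polynomial DP: its O(nM) running time is exponential in the number of bits of M, so there is no contradiction. A sketch:

```python
def sum_of_subset(a, M):
    """reach[m] is True iff some subset of the items processed so far
    sums to exactly m. O(n*M) time, O(M) space."""
    reach = [True] + [False] * M
    for x in a:
        # Descend so each item is used at most once per subset.
        for m in range(M, x - 1, -1):
            if reach[m - x]:
                reach[m] = True
    return reach[M]

# sum_of_subset([3, 34, 4, 12, 5, 2], 9) → True (4 + 5)
# sum_of_subset([3, 34, 4, 12, 5, 2], 30) → False
```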
40 Job Scheduling on Two Machines
There are n jobs to be processed, and two machines A and B are available. If job i is processed on machine A, then a_i units of time are needed; if it is processed on machine B, then b_i units of processing time are needed. Because of the peculiarities of the jobs and the machines, it is possible that a_i > b_i for some i while a_j < b_j for some other j. Schedule the jobs to minimize the completion time. (If the jobs in J are processed by machine A and the rest by machine B, the completion time is defined to be max( Σ_{i∈J} a_i, Σ_{i∉J} b_i ).) Assume a_i, b_i ≥ 3 for all i.
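The slide poses the problem without a solution. One pseudo-polynomial DP, in the prefix-of-jobs style used throughout these slides, tracks the set of reachable (A-load, B-load) pairs after assigning each job (the function name and encoding are mine, not the course's official solution):

```python
def two_machine_completion(a, b):
    """Minimize max(sum of a_i for i in J, sum of b_i for i not in J)
    over all subsets J of jobs assigned to machine A. states holds the
    reachable (A-load, B-load) pairs for the jobs seen so far; its size
    is bounded by (sum of a_i) * (sum of b_i), hence pseudo-polynomial."""
    states = {(0, 0)}
    for ai, bi in zip(a, b):
        states = {(ta + ai, tb) for ta, tb in states} | \
                 {(ta, tb + bi) for ta, tb in states}
    return min(max(ta, tb) for ta, tb in states)

# two_machine_completion([2, 3], [3, 2]) → 2 (job 1 on A, job 2 on B)
```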
More informationFundamental Algorithms
Fundamental Algorithms Chapter 5: Searching Michael Bader Winter 2014/15 Chapter 5: Searching, Winter 2014/15 1 Searching Definition (Search Problem) Input: a sequence or set A of n elements (objects)
More informationExam in Discrete Mathematics
Exam in Discrete Mathematics First Year at the Faculty of Engineering and Science and the Technical Faculty of IT and Design June 4th, 018, 9.00-1.00 This exam consists of 11 numbered pages with 14 problems.
More informationDynamic Programming (CLRS )
Dynamic Programming (CLRS.-.) Today we discuss a technique called dynamic programming. It is neither especially dynamic nor especially programming related. We will discuss dynamic programming by looking
More informationAlgorithms and Complexity Theory. Chapter 8: Introduction to Complexity. Computer Science - Durban - September 2005
Algorithms and Complexity Theory Chapter 8: Introduction to Complexity Jules-R Tapamo Computer Science - Durban - September 2005 Contents 1 Introduction 2 1.1 Dynamic programming...................................
More informationIntroduction to Divide and Conquer
Introduction to Divide and Conquer Sorting with O(n log n) comparisons and integer multiplication faster than O(n 2 ) Periklis A. Papakonstantinou York University Consider a problem that admits a straightforward
More informationSolutions to the Mathematics Masters Examination
Solutions to the Mathematics Masters Examination OPTION 4 Spring 2007 COMPUTER SCIENCE 2 5 PM NOTE: Any student whose answers require clarification may be required to submit to an oral examination. Each
More informationCSE101: Design and Analysis of Algorithms. Ragesh Jaiswal, CSE, UCSD
Greedy s Greedy s Shortest path Claim 2: Let S be a subset of vertices containing s such that we know the shortest path length l(s, u) from s to any vertex in u S. Let e = (u, v) be an edge such that 1
More informationV. Adamchik 1. Recurrences. Victor Adamchik Fall of 2005
V. Adamchi Recurrences Victor Adamchi Fall of 00 Plan Multiple roots. More on multiple roots. Inhomogeneous equations 3. Divide-and-conquer recurrences In the previous lecture we have showed that if the
More informationOutline. Relaxation. Outline DMP204 SCHEDULING, TIMETABLING AND ROUTING. 1. Lagrangian Relaxation. Lecture 12 Single Machine Models, Column Generation
Outline DMP204 SCHEDULING, TIMETABLING AND ROUTING 1. Lagrangian Relaxation Lecture 12 Single Machine Models, Column Generation 2. Dantzig-Wolfe Decomposition Dantzig-Wolfe Decomposition Delayed Column
More informationCS481: Bioinformatics Algorithms
CS481: Bioinformatics Algorithms Can Alkan EA224 calkan@cs.bilkent.edu.tr http://www.cs.bilkent.edu.tr/~calkan/teaching/cs481/ Reminder The TA will hold a few recitation sessions for the students from
More informationCS 170 Algorithms Fall 2014 David Wagner MT2
CS 170 Algorithms Fall 2014 David Wagner MT2 PRINT your name:, (last) SIGN your name: (first) Your Student ID number: Your Unix account login: cs170- The room you are sitting in right now: Name of the
More information3. Branching Algorithms COMP6741: Parameterized and Exact Computation
3. Branching Algorithms COMP6741: Parameterized and Exact Computation Serge Gaspers 12 1 School of Computer Science and Engineering, UNSW Australia 2 Optimisation Resarch Group, NICTA Semester 2, 2015
More informationThis means that we can assume each list ) is
This means that we can assume each list ) is of the form ),, ( )with < and Since the sizes of the items are integers, there are at most +1pairs in each list Furthermore, if we let = be the maximum possible
More informationChapter 8 Dynamic Programming
Chapter 8 Dynamic Programming Copyright 007 Pearson Addison-Wesley. All rights reserved. Dynamic Programming Dynamic Programming is a general algorithm design technique for solving problems defined by
More informationAlgorithms Test 1. Question 1. (10 points) for (i = 1; i <= n; i++) { j = 1; while (j < n) {
Question 1. (10 points) for (i = 1; i
More informationAlgorithms PART II: Partitioning and Divide & Conquer. HPC Fall 2007 Prof. Robert van Engelen
Algorithms PART II: Partitioning and Divide & Conquer HPC Fall 2007 Prof. Robert van Engelen Overview Partitioning strategies Divide and conquer strategies Further reading HPC Fall 2007 2 Partitioning
More informationExact Algorithms for Dominating Induced Matching Based on Graph Partition
Exact Algorithms for Dominating Induced Matching Based on Graph Partition Mingyu Xiao School of Computer Science and Engineering University of Electronic Science and Technology of China Chengdu 611731,
More information1 More finite deterministic automata
CS 125 Section #6 Finite automata October 18, 2016 1 More finite deterministic automata Exercise. Consider the following game with two players: Repeatedly flip a coin. On heads, player 1 gets a point.
More informationCS Fall 2011 P and NP Carola Wenk
CS3343 -- Fall 2011 P and NP Carola Wenk Slides courtesy of Piotr Indyk with small changes by Carola Wenk 11/29/11 CS 3343 Analysis of Algorithms 1 We have seen so far Algorithms for various problems Running
More informationLecture 2: Scheduling on Parallel Machines
Lecture 2: Scheduling on Parallel Machines Loris Marchal October 17, 2012 Parallel environment alpha in Graham s notation): P parallel identical Q uniform machines: each machine has a given speed speed
More informationCSE 591 Foundations of Algorithms Homework 4 Sample Solution Outlines. Problem 1
CSE 591 Foundations of Algorithms Homework 4 Sample Solution Outlines Problem 1 (a) Consider the situation in the figure, every edge has the same weight and V = n = 2k + 2. Easy to check, every simple
More informationDesign and Analysis of Algorithms
Design and Analysis of Algorithms CSE 5311 Lecture 22 All-Pairs Shortest Paths Junzhou Huang, Ph.D. Department of Computer Science and Engineering CSE5311 Design and Analysis of Algorithms 1 All Pairs
More informationCopyright 2000, Kevin Wayne 1
//8 Fast Integer Division Too (!) Schönhage Strassen algorithm CS 8: Algorithm Design and Analysis Integer division. Given two n-bit (or less) integers s and t, compute quotient q = s / t and remainder
More informationOutline. Part 5. Computa0onal Complexity (3) Compound Interest 2/15/12. Recurrence Rela0ons: An Overview. Recurrence Rela0ons: Formal Defini0on
Outline Part 5. Computa0onal Complexity (3) Recurrence Relations Divide and Conquer Understanding the Complexity of Algorithms CS 200 Algorithms and Data Structures 1 2 Recurrence Rela0ons: An Overview
More informationFriday Four Square! Today at 4:15PM, Outside Gates
P and NP Friday Four Square! Today at 4:15PM, Outside Gates Recap from Last Time Regular Languages DCFLs CFLs Efficiently Decidable Languages R Undecidable Languages Time Complexity A step of a Turing
More informationCS 5114: Theory of Algorithms. Tractable Problems. Tractable Problems (cont) Decision Problems. Clifford A. Shaffer. Spring 2014
Department of Computer Science Virginia Tech Blacksburg, Virginia Copyright c 2014 by Clifford A. Shaffer : Theory of Algorithms Title page : Theory of Algorithms Clifford A. Shaffer Spring 2014 Clifford
More information
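The seven-step design recipe above can be illustrated with a small, self-contained sketch (a hypothetical example, not taken from the slides): computing the length of a longest increasing subsequence. Here the subproblems need only one parameter, P(i) = "longest increasing subsequence starting at index i", and memoization condenses the exponential recursion tree into n distinct subproblems.

```python
from functools import lru_cache

def longest_increasing_subsequence(a):
    """Length of a longest strictly increasing subsequence of a.

    Following the recipe: the subproblem P(i) is the longest increasing
    subsequence that begins at index i, and f(i) is its optimal length.
    Recurrence: f(i) = 1 + max{ f(j) : j > i and a[j] > a[i] },
    or f(i) = 1 if no such j exists.
    Goal: max over all i of f(i).
    """
    n = len(a)

    @lru_cache(maxsize=None)
    def f(i):
        # Each option "extend with a[j]" leads to the subproblem P(j);
        # take the best of the optimal solutions to those subproblems.
        best = 1
        for j in range(i + 1, n):
            if a[j] > a[i]:
                best = max(best, 1 + f(j))
        return best

    return max((f(i) for i in range(n)), default=0)

print(longest_increasing_subsequence([3, 1, 4, 1, 5, 9, 2, 6]))  # -> 4
```

Note how the recurrence is exactly the "best of the options" step from slide 4: each candidate next element is an option, each option names a smaller subproblem of the same type, and the `lru_cache` memo table is what collapses duplicate subproblems.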