Dynamic Programming: Weighted Interval Scheduling
Algorithmic Paradigms

Greedy. Build up a solution incrementally, myopically optimizing some local criterion.

Divide-and-conquer. Break up a problem into two sub-problems, solve each sub-problem independently, and combine solutions to sub-problems to form a solution to the original problem.

Dynamic programming. Break up a problem into a series of overlapping sub-problems, and build up solutions to larger and larger sub-problems.

[Based on Algorithm Design by Éva Tardos and Jon Kleinberg. Portions copyright 2005 Addison Wesley. Slides by Kevin Wayne and Neil Rhodes.]

Dynamic Programming Steps

- Characterize the structure of an optimal solution.
- Define a recurrence for the value of an optimal solution.
- Compute the value of an optimal solution, either bottom-up or top-down with memoization.
- Construct the optimal solution from the optimal value and intermediate information.
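As a warm-up for these steps, here is a minimal Python sketch (mine, not from the slides; the slides use the same Fibonacci recurrence later to illustrate memoization) showing the "compute the value" step done both top-down and bottom-up:

    from functools import lru_cache

    @lru_cache(maxsize=None)
    def fib_top_down(n):
        # top-down with memoization: each sub-problem solved once, then cached
        return n if n <= 1 else fib_top_down(n - 1) + fib_top_down(n - 2)

    def fib_bottom_up(n):
        # bottom-up: build solutions to larger and larger sub-problems
        a, b = 0, 1
        for _ in range(n):
            a, b = b, a + b
        return a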
Weighted Interval Scheduling

Weighted interval scheduling problem.
- Job j starts at s_j, finishes at f_j, and has weight or value v_j.
- Two jobs are compatible if they don't overlap.
- Goal: find a maximum-weight subset of mutually compatible jobs.

Unweighted Interval Scheduling Review

Greedy algorithm works if all weights are 1.
- Consider jobs in ascending order of finish time.
- Add a job to the subset if it is compatible with previously chosen jobs.

Observation. The greedy algorithm can fail spectacularly if arbitrary weights are allowed. (The slide's example: a long job of weight 999 overlapping a short job of weight 1 that finishes earlier; greedy picks the short job.)

Weighted Interval Scheduling: Notation

Notation. Label jobs by finishing time: f_1 <= f_2 <= ... <= f_n.
Def. p(j) = largest index i < j such that job i is compatible with j.
Ex. p(8) = 5, p(7) = 3, p(2) = 0.

Dynamic Programming: Binary Choice

Notation. OPT(j) = value of the optimal solution to the problem consisting of job requests 1, 2, ..., j.

Case 1: OPT selects job j.
- Can't use incompatible jobs { p(j) + 1, p(j) + 2, ..., j - 1 }.
- Must include the optimal solution to the problem consisting of the remaining compatible jobs 1, 2, ..., p(j).  [optimal substructure]

Case 2: OPT does not select job j.
- Must include the optimal solution to the problem consisting of the remaining compatible jobs 1, 2, ..., j - 1.

    OPT(j) = 0                                      if j = 0
    OPT(j) = max { v_j + OPT(p(j)), OPT(j - 1) }    otherwise
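Computing p(·) is the only non-obvious preprocessing step. Here is a hedged Python sketch (the function name compute_p and the (start, finish, value) job tuples are my conventions, not the slides'), using binary search over the sorted finish times:

    import bisect

    def compute_p(jobs):
        """jobs: list of (start, finish, value) tuples, pre-sorted by finish time.
        Returns p with p[j] = largest index i < j (1-based) such that job i
        is compatible with job j, or 0 if no such job exists."""
        finishes = [f for (_, f, _) in jobs]
        p = [0] * (len(jobs) + 1)            # p[0] unused; jobs are 1-indexed
        for j, (s, _, _) in enumerate(jobs, start=1):
            # number of jobs whose finish time is <= s_j; all of them precede
            # job j in finish-time order and are compatible with it
            p[j] = bisect.bisect_right(finishes, s)
        return p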
Weighted Interval Scheduling: Brute Force

Brute force algorithm.

    Input: n, s_1,...,s_n, f_1,...,f_n, v_1,...,v_n

    Sort jobs by finish times so that f_1 <= f_2 <= ... <= f_n.
    Compute p(1), p(2), ..., p(n).

    Compute-Opt(j) {
        if (j = 0)
            return 0
        else
            return max(v_j + Compute-Opt(p(j)), Compute-Opt(j-1))
    }

Observation. The recursive algorithm fails spectacularly because of redundant sub-problems ⇒ exponential algorithms.

Ex. The number of recursive calls for the family of "layered" instances with p(1) = 0 and p(j) = j - 2 grows like the Fibonacci sequence.

Weighted Interval Scheduling: Memoization

Memoization. Store the result of each sub-problem in a cache; look it up as needed.

    Input: n, s_1,...,s_n, f_1,...,f_n, v_1,...,v_n

    Sort jobs by finish times so that f_1 <= f_2 <= ... <= f_n.
    Compute p(1), p(2), ..., p(n).

    for j = 1 to n
        M[j] = empty          // global array
    M[0] = 0

    M-Compute-Opt(j) {
        if (M[j] is empty)
            M[j] = max(v_j + M-Compute-Opt(p(j)), M-Compute-Opt(j-1))
        return M[j]
    }

Weighted Interval Scheduling: Running Time

Claim. The memoized version of the algorithm takes O(n log n) time.
- Sort by finish time: O(n log n).
- Computing p(·): O(n) after sorting by start time.
- M-Compute-Opt(j): each invocation takes O(1) time and either (i) returns an existing value M[j], or (ii) fills in one new entry M[j] and makes two recursive calls.
- Progress measure Φ = number of nonempty entries of M[]. Initially Φ = 0; throughout, Φ <= n. Case (ii) increases Φ by 1 ⇒ at most 2n recursive calls.
- Overall running time of M-Compute-Opt(n) is O(n).

Remark. O(n) overall if jobs are pre-sorted by start and finish times.
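A direct Python rendering of M-Compute-Opt (a sketch under my conventions: jobs are (start, finish, value) tuples sorted by finish time, and p comes from the compute_p sketch above):

    from functools import lru_cache

    def weighted_interval_value(jobs, p):
        """Value of an optimal schedule; jobs[j-1] is job j."""
        @lru_cache(maxsize=None)          # plays the role of the array M[]
        def m_compute_opt(j):
            if j == 0:
                return 0
            v = jobs[j - 1][2]
            return max(v + m_compute_opt(p[j]), m_compute_opt(j - 1))
        return m_compute_opt(len(jobs))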
Automated Memoization

Automated memoization. Many functional programming languages (e.g., Lisp) have library or built-in support for memoization.

    (def-memo-fun F (n)                       ; Lisp (efficient)
      (if (<= n 1)
          n
          (+ (F (- n 1)) (F (- n 2)))))

    static int F(int n) {                     // Java (exponential)
        if (n <= 1) return n;
        else return F(n-1) + F(n-2);
    }

(The slide shows the recursion tree of F(40), which recomputes F(38), F(37), F(36), ... exponentially many times.)

Weighted Interval Scheduling: Bottom-Up

Bottom-up dynamic programming. Unwind the recursion.

    Input: n, s_1,...,s_n, f_1,...,f_n, v_1,...,v_n

    Sort jobs by finish times so that f_1 <= f_2 <= ... <= f_n.
    Compute p(1), p(2), ..., p(n).

    Iterative-Compute-Opt {
        M[0] = 0
        for j = 1 to n
            M[j] = max(v_j + M[p(j)], M[j-1])
    }

Weighted Interval Scheduling: Finding a Solution

Q. The dynamic programming algorithm computes the optimal value. What if we want the solution itself?
A. Do some post-processing.

    Run M-Compute-Opt(n)
    Run Find-Solution(n)

    Find-Solution(j) {
        if (j = 0)
            output nothing
        else if (v_j + M[p(j)] > M[j-1])
            print j
            Find-Solution(p(j))
        else
            Find-Solution(j-1)
    }

Number of recursive calls <= n ⇒ O(n).
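The same two routines in Python (a sketch; solve_weighted_intervals is my name, and the traceback mirrors Find-Solution):

    def solve_weighted_intervals(jobs, p):
        """Returns (optimal value, sorted list of chosen 1-based job indices)."""
        n = len(jobs)
        M = [0] * (n + 1)
        for j in range(1, n + 1):
            v = jobs[j - 1][2]
            M[j] = max(v + M[p[j]], M[j - 1])
        chosen, j = [], n                 # trace back through the table
        while j > 0:
            v = jobs[j - 1][2]
            if v + M[p[j]] > M[j - 1]:    # selecting job j is strictly better
                chosen.append(j)
                j = p[j]
            else:
                j -= 1
        return M[n], sorted(chosen)

For example, jobs = [(0, 3, 2), (2, 5, 4), (4, 7, 4)] with p = compute_p(jobs) yields value 6 by choosing jobs 1 and 3.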
Knapsack Problem

Knapsack problem.
- Given n objects and a "knapsack."
- Item i weighs w_i > 0 kilograms and has value v_i > 0.
- Knapsack has a capacity of W kilograms.
- Goal: choose a subset of items to fill the knapsack so as to maximize total value.

Ex. With W = 11 and the items below, { 3, 4 } has value 40.

    Item   Value   Weight
     1       1       1
     2       6       2
     3      18       5
     4      22       6
     5      28       7

Greedy: repeatedly add the item with maximum ratio v_i / w_i.
Ex. { 5, 2, 1 } achieves only value = 35 ⇒ greedy is not optimal.

Dynamic Programming: False Start

Def. OPT(i) = max-profit subset of items 1, ..., i.
- Case 1: OPT does not select item i. OPT selects the best of { 1, 2, ..., i-1 }.
- Case 2: OPT selects item i.
  - Accepting item i does not immediately imply that we will have to reject other items.
  - Without knowing what other items were selected before i, we don't even know if we have enough room for i.

Conclusion. Need more sub-problems!

Dynamic Programming: Adding a New Variable

Def. OPT(i, w) = max-profit subset of items 1, ..., i with weight limit w.
- Case 1: OPT does not select item i. OPT selects the best of { 1, 2, ..., i-1 } using weight limit w.
- Case 2: OPT selects item i. New weight limit = w - w_i; OPT selects the best of { 1, 2, ..., i-1 } using this new weight limit.

    OPT(i, w) = 0                                                if i = 0
    OPT(i, w) = OPT(i-1, w)                                      if w_i > w
    OPT(i, w) = max { OPT(i-1, w), v_i + OPT(i-1, w - w_i) }     otherwise

Knapsack Problem: Bottom-Up

Knapsack. Fill up an n-by-W array.

    Input: n, W, w_1,...,w_n, v_1,...,v_n

    for w = 0 to W
        M[0, w] = 0
    for i = 1 to n
        M[i, 0] = 0

    for i = 1 to n
        for w = 1 to W
            if (w_i > w)
                M[i, w] = M[i-1, w]
            else
                M[i, w] = max { M[i-1, w], v_i + M[i-1, w - w_i] }

    return M[n, W]
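The same fill in Python (a sketch; the list-of-lists M mirrors the slide's array, and the whole table is returned so the solution can be recovered afterwards):

    def knapsack(values, weights, W):
        """0/1 knapsack by the bottom-up recurrence above; returns the table M."""
        n = len(values)
        M = [[0] * (W + 1) for _ in range(n + 1)]
        for i in range(1, n + 1):
            for w in range(1, W + 1):
                if weights[i - 1] > w:                 # item i doesn't fit
                    M[i][w] = M[i - 1][w]
                else:                                  # skip it or take it
                    M[i][w] = max(M[i - 1][w],
                                  values[i - 1] + M[i - 1][w - weights[i - 1]])
        return M

    # With the slide's instance:
    # knapsack([1, 6, 18, 22, 28], [1, 2, 5, 6, 7], 11)[-1][-1] == 40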
Knapsack Algorithm

(The slide fills the table M[i, w] for the instance above, with rows labeled φ, {1}, {1,2}, {1,2,3}, {1,2,3,4}, {1,2,3,4,5} and columns w = 0, 1, ..., 11.)

OPT: { 4, 3 }, value = 22 + 18 = 40, with W = 11.

Knapsack Problem: Running Time

Running time. Θ(n·W).
- Not polynomial in input size!
- "Pseudo-polynomial."
- The decision version of Knapsack is NP-complete.

Finding Optimal Substructure

A solution to the problem makes a choice that leaves one or more subproblems to be solved; for example, choosing whether to use an item.
- Figure out which subproblems you would use if you knew the choice that led to an optimal solution.
- Try all possible choices and choose the one that provides the optimal solution.
- Show that each subproblem in an optimal solution is itself optimal (optimal substructure).
- How many subproblems are needed in an optimal solution to the original problem? For the examples so far, this has been one.
- How many choices must be considered? (Recovering those choices yields the solution itself, as in the sketch below.)
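A hedged sketch of that recovery for knapsack (my helper, using the table M returned by the knapsack sketch above): walk backwards through the rows, and whenever the value changed between rows i-1 and i, item i must have been taken.

    def knapsack_items(values, weights, W, M):
        """Returns a sorted list of chosen 1-based item indices."""
        chosen, w = [], W
        for i in range(len(values), 0, -1):
            if M[i][w] != M[i - 1][w]:       # row changed: item i was taken
                chosen.append(i)
                w -= weights[i - 1]
        return sorted(chosen)

    # On the slide's instance this returns [3, 4], matching OPT above.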
RNA Secondary Structure

RNA. String B = b_1 b_2 ... b_n over the alphabet { A, C, G, U }.

Secondary structure. RNA is single-stranded, so it tends to loop back and form base pairs with itself. This structure is essential for understanding the behavior of the molecule. (Complementary base pairs: A-U, C-G.)

Secondary structure. A set of pairs S = { (b_i, b_j) } that satisfies:
- [Watson-Crick] S is a matching, and each pair in S is a Watson-Crick complement: A-U, U-A, C-G, or G-C.
- [No sharp turns] The ends of each pair are separated by at least 4 intervening bases: if (b_i, b_j) ∈ S, then i < j - 4.
- [Non-crossing] If (b_i, b_j) and (b_k, b_l) are two pairs in S, then we cannot have i < k < j < l.

Free energy. The usual hypothesis is that an RNA molecule will form the secondary structure with the optimum total free energy, approximated here by the number of base pairs.

Goal. Given an RNA molecule B = b_1 b_2 ... b_n, find a secondary structure S that maximizes the number of base pairs.

RNA Secondary Structure: Examples

(The slide shows three candidate structures: one valid, one violating the no-sharp-turns condition, and one with crossing pairs.)

RNA Secondary Structure: Subproblems

First attempt. OPT(j) = maximum number of base pairs in a secondary structure of the substring b_1 b_2 ... b_j.

Difficulty. Matching b_t and b_n results in two sub-problems:
- finding a secondary structure in b_1 b_2 ... b_{t-1}: this is OPT(t-1); ok
- finding a secondary structure in b_{t+1} b_{t+2} ... b_{n-1}: not of the form OPT(·) ⇒ need more sub-problems.
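The three conditions are easy to check directly; a minimal Python validator (mine, not from the slides) makes them concrete:

    def is_valid_structure(b, S):
        """b: RNA string; S: set of 1-based (i, j) pairs with i < j.
        Checks the three conditions defining a secondary structure."""
        wc = {('A', 'U'), ('U', 'A'), ('C', 'G'), ('G', 'C')}
        ends = [k for (i, j) in S for k in (i, j)]
        if len(ends) != len(set(ends)):
            return False                           # not a matching
        for (i, j) in S:
            if (b[i - 1], b[j - 1]) not in wc:
                return False                       # not Watson-Crick
            if i >= j - 4:
                return False                       # sharp turn
        for (i, j) in S:
            for (k, l) in S:
                if i < k < j < l:
                    return False                   # crossing
        return True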
Dynamic Programming Over Intervals

Notation. OPT(i, j) = maximum number of base pairs in a secondary structure of the substring b_i b_{i+1} ... b_j.

- Case 1. If i >= j - 4: OPT(i, j) = 0 by the no-sharp-turns condition.
- Case 2. Base b_j is not involved in a pair: OPT(i, j) = OPT(i, j-1).
- Case 3. Base b_j pairs with b_t for some i <= t < j - 4. The non-crossing constraint decouples the resulting sub-problems:

      OPT(i, j) = 1 + max_t { OPT(i, t-1) + OPT(t+1, j-1) },

  where the max is taken over t such that i <= t < j - 4 and b_t and b_j are Watson-Crick complements.

Bottom-Up Dynamic Programming Over Intervals

Q. In what order should we solve the sub-problems?
A. Do the shortest intervals first.

    RNA(b_1, ..., b_n) {
        for k = 5, 6, ..., n-1
            for i = 1, 2, ..., n-k
                j = i + k
                Compute M[i, j] using the recurrence
        return M[1, n]
    }

Running time. O(n^3).

Remark. The same core idea appears in the CKY algorithm for parsing context-free grammars.

Dynamic Programming Summary

Recipe.
- Characterize the structure of the problem.
- Recursively define the value of an optimal solution.
- Compute the value of an optimal solution.
- Construct an optimal solution from the computed information.

Dynamic programming techniques.
- Binary choice: weighted interval scheduling.
- Adding a new variable: knapsack.
- Dynamic programming over intervals: RNA secondary structure.

Top-down vs. bottom-up: different people have different intuitions.
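A hedged Python sketch of the interval DP (my rendering with 0-based indices; the inner max over t follows Case 3):

    def rna_max_pairs(b):
        """Maximum number of base pairs in a secondary structure of b."""
        n = len(b)
        wc = {('A', 'U'), ('U', 'A'), ('C', 'G'), ('G', 'C')}
        M = [[0] * n for _ in range(n)]
        for k in range(5, n):                 # interval gap j - i = k
            for i in range(n - k):
                j = i + k
                best = M[i][j - 1]            # Case 2: b_j unpaired
                for t in range(i, j - 4):     # Case 3: b_t pairs with b_j
                    if (b[t], b[j]) in wc:
                        left = M[i][t - 1] if t > i else 0
                        best = max(best, 1 + left + M[t + 1][j - 1])
                M[i][j] = best
        return M[0][n - 1] if n else 0

Intervals with gap smaller than 5 are left at 0, which is exactly Case 1.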
String Similarity

How similar are two strings? Ex: ocurrance vs. occurrence.

    o c u r r a n c e -
    o c c u r r e n c e        (6 mismatches, 1 gap)

    o c - u r r a n c e
    o c c u r r e n c e        (1 mismatch, 1 gap)

    o c - u r r - a n c e
    o c c u r r e - n c e      (0 mismatches, 3 gaps)

Applications.
- Basis for Unix diff.
- Speech recognition.
- Computational biology.

Edit Distance

Edit distance. [Levenshtein 1966, Needleman-Wunsch 1970, Smith-Waterman 1981]
- Gap penalty δ; mismatch penalty α_pq.
- Cost = sum of gap and mismatch penalties.

(The slide scores two alignments of a pair of DNA strings as a sum of α and δ terms.)

Sequence Alignment

Goal. Given two strings X = x_1 x_2 ... x_m and Y = y_1 y_2 ... y_n, find an alignment of minimum cost.

Def. An alignment M is a set of ordered pairs x_i - y_j such that each item occurs in at most one pair and there are no crossings.

Def. The pairs x_i - y_j and x_i' - y_j' cross if i < i' but j > j'.

    cost(M) =   Σ_{(x_i, y_j) ∈ M} α_{x_i y_j}                      (mismatch)
              + Σ_{i : x_i unmatched} δ + Σ_{j : y_j unmatched} δ   (gap)

Ex. CTACCG vs. TACATG. Sol: M = x_2-y_1, x_3-y_2, x_4-y_3, x_5-y_4, x_6-y_6.

Sequence Alignment: Problem Structure

Def. OPT(i, j) = min cost of aligning the strings x_1 x_2 ... x_i and y_1 y_2 ... y_j.

- Case 1: OPT matches x_i - y_j. Pay the mismatch cost for x_i - y_j plus the min cost of aligning x_1 ... x_{i-1} and y_1 ... y_{j-1}.
- Case 2a: OPT leaves x_i unmatched. Pay the gap cost for x_i plus the min cost of aligning x_1 ... x_{i-1} and y_1 ... y_j.
- Case 2b: OPT leaves y_j unmatched. Pay the gap cost for y_j plus the min cost of aligning x_1 ... x_i and y_1 ... y_{j-1}.

    OPT(i, j) = j·δ                                      if i = 0
    OPT(i, j) = i·δ                                      if j = 0
    OPT(i, j) = min { α_{x_i y_j} + OPT(i-1, j-1),
                      δ + OPT(i-1, j),
                      δ + OPT(i, j-1) }                  otherwise
Sequence Alignment: Algorithm

    Sequence-Alignment(m, n, x_1...x_m, y_1...y_n, δ, α) {
        for i = 0 to m
            M[i, 0] = i·δ
        for j = 0 to n
            M[0, j] = j·δ

        for i = 1 to m
            for j = 1 to n
                M[i, j] = min(α[x_i, y_j] + M[i-1, j-1],
                              δ + M[i-1, j],
                              δ + M[i, j-1])
        return M[m, n]
    }

Analysis. Θ(mn) time and space.
- English words or sentences: m, n <= 10.
- Computational biology: m = n = 100,000. 10 billion ops OK, but a 10GB array?

Sequence Alignment: Linear Space

Q. Can we avoid using quadratic space?

Easy. Optimal value in O(m + n) space and O(mn) time.
- Compute OPT(i, ·) from OPT(i-1, ·).
- No longer a simple way to recover the alignment itself.

Theorem. [Hirschberg, 1975] Optimal alignment in O(m + n) space and O(mn) time.
- Clever combination of divide-and-conquer and dynamic programming.
- Inspired by an idea of Savitch from complexity theory.

Edit distance graph. Let f(i, j) be the shortest path from (0, 0) to (i, j). Observation: f(i, j) = OPT(i, j).
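The quadratic-space algorithm in Python (a sketch; passing alpha as a function rather than a table is my convention):

    def sequence_alignment(x, y, delta, alpha):
        """Min alignment cost of x and y; delta = gap penalty,
        alpha(p, q) = mismatch penalty for characters p, q."""
        m, n = len(x), len(y)
        M = [[0] * (n + 1) for _ in range(m + 1)]
        for i in range(m + 1):
            M[i][0] = i * delta
        for j in range(n + 1):
            M[0][j] = j * delta
        for i in range(1, m + 1):
            for j in range(1, n + 1):
                M[i][j] = min(alpha(x[i-1], y[j-1]) + M[i-1][j-1],  # match x_i, y_j
                              delta + M[i-1][j],                    # x_i unmatched
                              delta + M[i][j-1])                    # y_j unmatched
        return M[m][n]

    # With unit costs this is classic edit distance:
    # sequence_alignment("ocurrance", "occurrence", 1,
    #                    lambda p, q: 0 if p == q else 1) == 2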
Sequence Alignment: Linear Space

Edit distance graph.
- Let f(i, j) be the shortest path from (0, 0) to (i, j).
- f(·, j) can be computed for any j in O(mn) time and O(m + n) space.

Edit distance graph.
- Let g(i, j) be the shortest path from (i, j) to (m, n).
- Can be computed by reversing the edge orientations and inverting the roles of (0, 0) and (m, n).
- g(·, j) can be computed for any j in O(mn) time and O(m + n) space.

Observation 1. The cost of the shortest path that uses (i, j) is f(i, j) + g(i, j).

(The slides picture the edit distance graph: horizontal and vertical edges of cost δ, diagonal edges of cost α_{x_i y_j}.)
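A sketch of the linear-space forward pass (the helper name last_column is mine): it sweeps the DP column by column, keeping only two columns, and returns the costs OPT(i, |y|) for every i. Applied to x and y_1 ... y_{n/2} it yields f(·, n/2); applied to the reversed strings it yields g(·, n/2).

    def last_column(x, y, delta, alpha):
        """Returns [OPT(i, len(y)) for i = 0..len(x)] in O(len(x)) space."""
        m = len(x)
        col = [i * delta for i in range(m + 1)]        # column j = 0
        for j in range(1, len(y) + 1):
            new = [j * delta] * (m + 1)
            for i in range(1, m + 1):
                new[i] = min(alpha(x[i-1], y[j-1]) + col[i-1],  # diagonal
                             delta + new[i-1],                  # x_i unmatched
                             delta + col[i])                    # y_j unmatched
            col = new
        return col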
Sequence Alignment: Linear Space

Observation 2. Let q be an index that minimizes f(q, n/2) + g(q, n/2). Then the shortest path from (0, 0) to (m, n) uses (q, n/2).

Divide: find the index q that minimizes f(q, n/2) + g(q, n/2) using DP; align x_q and y_{n/2}.
Conquer: recursively compute the optimal alignment in each piece.

Sequence Alignment: Running Time Analysis

Theorem. Let T(m, n) = max running time of the algorithm on strings of length at most m and n. Then T(m, n) = O(mn).

    T(m, n) <= T(q, n/2) + T(m-q, n/2) + cmn

Note that q·(n/2) + (m-q)·(n/2) = (q + m - q)·n/2 = mn/2, so the costs halve at each level of the recursion:

    Level    Cost
      0      cmn
      1      cmn/2
      ...
      i      cmn/2^i

Total time is Θ(mn) (a geometric series).

Another way to look at the solution.
- Quadratic: OPT(i, j) = min cost of aligning x_1 ... x_i and y_1 ... y_j. The first part of x matches with the first part of y.
- Linear: OPT(i) = min cost of aligning x_1 ... x_i with y_1 ... y_{n/2}, plus aligning x_{i+1} x_{i+2} ... x_m with y_{n/2+1} y_{n/2+2} ... y_n. The first half of y matches with some initial part of x, and the second half of y matches with the rest of x.

Linear-space algorithm due to Hirschberg, 1975.
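Putting the divide step and the forward/backward passes together, a hedged sketch of the full recursion (structure and names are mine; it reuses last_column from the sketch above and returns the matched pairs of an optimal alignment):

    def hirschberg(x, y, delta, alpha):
        """Matched (i, j) pairs (1-based) of an optimal alignment of x, y."""
        m, n = len(x), len(y)
        if m == 0 or n == 0:
            return []                                 # all gaps, no pairs
        if n == 1:
            # base case: y is one character; match it to the cheapest x_i
            # unless leaving everything unmatched is cheaper
            i0 = min(range(m), key=lambda i: alpha(x[i], y[0]))
            if alpha(x[i0], y[0]) + (m - 1) * delta <= (m + 1) * delta:
                return [(i0 + 1, 1)]
            return []
        j0 = n // 2                                   # split y at n/2
        f = last_column(x, y[:j0], delta, alpha)      # f(i, n/2) for all i
        g = last_column(x[::-1], y[j0:][::-1], delta, alpha)
        # g[k] = cost of aligning the last k characters of x with y[n/2:]
        q = min(range(m + 1), key=lambda i: f[i] + g[m - i])
        left = hirschberg(x[:q], y[:j0], delta, alpha)
        right = hirschberg(x[q:], y[j0:], delta, alpha)
        return left + [(i + q, j + j0) for (i, j) in right]

The string slices copy their arguments, so this sketch uses a little more than O(m + n) working memory; an index-based version avoids the copies.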