Dynamic Programming 1


Algorithmic Paradigms. Divide-and-conquer: break up a problem into two sub-problems, solve each sub-problem independently, and combine the solutions to the sub-problems into a solution to the original problem. Dynamic programming: break up a problem into a series of overlapping sub-problems, and build up solutions to larger and larger sub-problems.

Dynamic Programming History. Bellman. Pioneered the systematic study of dynamic programming in the 1950s. Etymology. Dynamic programming = planning over time. The Secretary of Defense was hostile to mathematical research, so Bellman sought an impressive name to avoid confrontation: "it's impossible to use dynamic in a pejorative sense"; "something not even a Congressman could object to". Reference: Bellman, R. E., Eye of the Hurricane, An Autobiography.

Dynamic Programming Applications. Areas: bioinformatics, control theory, information theory, operations research, and computer science (theory, graphics, AI, systems, ...). Some famous dynamic programming algorithms: Viterbi for hidden Markov models; Unix diff for comparing two files; Smith-Waterman for sequence alignment; Bellman-Ford for shortest-path routing in networks; Cocke-Kasami-Younger for parsing context-free grammars.

6.1 Weighted Interval Scheduling

Weighted Interval Scheduling. Weighted interval scheduling problem. Job j starts at s_j, finishes at f_j, and has weight or value v_j. Two jobs are compatible if they don't overlap. Goal: find a maximum-weight subset of mutually compatible jobs. (Figure: jobs a through h drawn as intervals on a timeline from 0 to 11.)

Weighted Interval Scheduling. Notation. Label jobs by finishing time: f_1 ≤ f_2 ≤ ... ≤ f_n. Def. p(j) = largest index i < j such that job i is compatible with j. Ex: p(8) = 5, p(7) = 3, p(2) = 0. (Figure: jobs 1 through 8 drawn as intervals on a timeline from 0 to 11.)
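For concreteness, p(j) can be computed with a binary search over the sorted finish times. The sketch below is an illustrative Python version (the name compute_p and the job representation are assumptions, not from the slides); it takes each job as a (start, finish, value) triple and assumes the list is already sorted by finish time.

from bisect import bisect_right

def compute_p(jobs):
    # jobs: list of (start, finish, value) triples, sorted by finish time
    finishes = [f for (s, f, v) in jobs]
    p = [0] * (len(jobs) + 1)   # 1-indexed; p[0] is unused
    for j, (s, f, v) in enumerate(jobs, start=1):
        # largest index i with f_i <= s_j; 0 if no such job exists
        p[j] = bisect_right(finishes, s)
    return p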

Dynamic Programming: Binary Choice. Notation. OPT(j) = value of an optimal solution to the problem consisting of job requests 1, 2, ..., j. Case 1: OPT selects job j. It can't use the incompatible jobs { p(j)+1, p(j)+2, ..., j-1 }, and it must include an optimal solution to the problem consisting of the remaining compatible jobs 1, 2, ..., p(j) (optimal substructure). Case 2: OPT does not select job j. It must include an optimal solution to the problem consisting of the remaining compatible jobs 1, 2, ..., j-1. Recurrence: OPT(j) = 0 if j = 0; OPT(j) = max( v_j + OPT(p(j)), OPT(j-1) ) otherwise.

Weighted Interval Scheduling: Brute Force. Brute force algorithm.

Input: n, s_1,...,s_n, f_1,...,f_n, v_1,...,v_n
Sort jobs by finish times so that f_1 ≤ f_2 ≤ ... ≤ f_n.
Compute p(1), p(2), ..., p(n).

Compute-Opt(j) {
   if (j = 0)
      return 0
   else
      return max(v_j + Compute-Opt(p(j)), Compute-Opt(j-1))
}

Weighted Interval Scheduling: Brute Force. Observation. The recursive algorithm fails spectacularly because of redundant sub-problems, giving an exponential algorithm. Ex. The number of recursive calls for the family of "layered" instances with p(1) = 0 and p(j) = j-2 grows like the Fibonacci sequence. (Figure: recursion tree of Compute-Opt(5) for such an instance.)
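To see the blow-up concretely, the following small sketch (not from the slides) counts the recursive calls of Compute-Opt on the layered instances with p(1) = 0 and p(j) = j - 2; the counts grow like the Fibonacci numbers.

def count_calls(j, p):
    # number of Compute-Opt invocations made for sub-problem j
    if j == 0:
        return 1
    return 1 + count_calls(p[j], p) + count_calls(j - 1, p)

n = 20
p = [0, 0] + [j - 2 for j in range(2, n + 1)]   # p[1] = 0, p[j] = j - 2
print(count_calls(n, p))                        # 35421 calls already for n = 20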

Weighted Interval Scheduling: Memoization. Memoization. Store the result of each sub-problem in a cache; look it up as needed.

Input: n, s_1,...,s_n, f_1,...,f_n, v_1,...,v_n
Sort jobs by finish times so that f_1 ≤ f_2 ≤ ... ≤ f_n.
Compute p(1), p(2), ..., p(n).

for j = 1 to n
   M[j] = empty
M[0] = 0                         (global array M)

M-Compute-Opt(j) {
   if (M[j] is empty)
      M[j] = max(v_j + M-Compute-Opt(p(j)), M-Compute-Opt(j-1))
   return M[j]
}
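As a runnable counterpart (an illustrative Python sketch, not the slides' code), the memoized recursion can be written as follows; jobs are (start, finish, value) triples and p is computed as in the earlier sketch.

from bisect import bisect_right

def weighted_interval_scheduling(jobs):
    # returns the maximum total value of mutually compatible jobs
    jobs = sorted(jobs, key=lambda job: job[1])        # sort by finish time
    finishes = [f for (s, f, v) in jobs]
    p = [0] * (len(jobs) + 1)
    for j, (s, f, v) in enumerate(jobs, start=1):
        p[j] = bisect_right(finishes, s)               # last job compatible with j

    M = [None] * (len(jobs) + 1)                       # memo table
    M[0] = 0

    def m_compute_opt(j):
        if M[j] is None:
            v_j = jobs[j - 1][2]
            M[j] = max(v_j + m_compute_opt(p[j]), m_compute_opt(j - 1))
        return M[j]

    return m_compute_opt(len(jobs))

print(weighted_interval_scheduling([(0, 3, 2), (1, 5, 4), (4, 7, 4), (6, 9, 7)]))   # 11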

Weighted Interval Scheduling: Running Time. Claim. The memoized version of the algorithm takes O(n log n) time. Sort by finish time: O(n log n). Computing p(·): O(n) after sorting by start time. M-Compute-Opt(j): each invocation takes O(1) time and either (i) returns an existing value M[j], or (ii) fills in one new entry M[j] and makes two recursive calls. Progress measure Φ = # nonempty entries of M[]: initially Φ = 0 and Φ ≤ n throughout; each case (ii) increases Φ by 1, so there are at most 2n recursive calls. Overall running time of M-Compute-Opt(n) is O(n). Remark. O(n) overall if the jobs are pre-sorted by start and finish times.

Weighted Interval Scheduling: Finding a Solution. Q. The dynamic programming algorithm computes the optimal value. What if we want the solution itself? A. Do some post-processing.

Run M-Compute-Opt(n)
Run Find-Solution(n)

Find-Solution(j) {
   if (j = 0)
      output nothing
   else if (v_j + M[p(j)] > M[j-1])
      print j
      Find-Solution(p(j))
   else
      Find-Solution(j-1)
}

# of recursive calls ≤ n, so this takes O(n).

Weighted Interval Scheduling: Bottom-Up. Bottom-up dynamic programming. Unwind the recursion.

Input: n, s_1,...,s_n, f_1,...,f_n, v_1,...,v_n
Sort jobs by finish times so that f_1 ≤ f_2 ≤ ... ≤ f_n.
Compute p(1), p(2), ..., p(n).

Iterative-Compute-Opt {
   M[0] = 0
   for j = 1 to n
      M[j] = max(v_j + M[p(j)], M[j-1])
}
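A bottom-up Python version, together with the traceback from the Find-Solution slide, might look like the following sketch (illustrative names; values and p are 1-indexed arrays as on the slides, with index 0 unused).

def iterative_compute_opt(values, p):
    # values[1..n]: job values; p[1..n]: last compatible job, as defined above
    n = len(values) - 1
    M = [0] * (n + 1)
    for j in range(1, n + 1):
        M[j] = max(values[j] + M[p[j]], M[j - 1])
    return M

def find_solution(j, values, p, M):
    # iterative version of Find-Solution: walk back through the table
    chosen = []
    while j > 0:
        if values[j] + M[p[j]] > M[j - 1]:
            chosen.append(j)
            j = p[j]
        else:
            j -= 1
    return list(reversed(chosen))

values = [0, 2, 4, 4, 7]           # the four jobs from the earlier example
p = [0, 0, 0, 1, 2]
M = iterative_compute_opt(values, p)
print(M[4], find_solution(4, values, p, M))   # 11 [2, 4]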

6.3 Segmented Least Squares

Segmented Least Squares. Least squares. Foundational problem in statistics and numerical analysis. Given n points in the plane (x_1, y_1), (x_2, y_2), ..., (x_n, y_n), find a line y = ax + b that minimizes the sum of the squared errors: SSE = Σ_{i=1..n} (y_i - a x_i - b)². Solution. Calculus: the minimum error is achieved when a = ( n Σ_i x_i y_i - (Σ_i x_i)(Σ_i y_i) ) / ( n Σ_i x_i² - (Σ_i x_i)² ) and b = ( Σ_i y_i - a Σ_i x_i ) / n. (Figure: points and the best-fit line in the xy-plane.)
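As a quick check of the closed form (an illustrative sketch, not from the slides), the coefficients a and b can be computed directly from the sums:

def fit_line(points):
    # points: list of (x, y); returns (a, b) minimizing the sum of squared errors
    n = len(points)
    sx  = sum(x for x, y in points)
    sy  = sum(y for x, y in points)
    sxy = sum(x * y for x, y in points)
    sxx = sum(x * x for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

print(fit_line([(0, 1), (1, 3), (2, 5)]))   # (2.0, 1.0), i.e. the line y = 2x + 1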

Segmented Least Squares. Segmented least squares. Points lie roughly on a sequence of several line segments. Given n points in the plane (x_1, y_1), (x_2, y_2), ..., (x_n, y_n) with x_1 < x_2 < ... < x_n, find a sequence of lines that minimizes f(x). Q. What's a reasonable choice for f(x) to balance accuracy (goodness of fit) and parsimony (number of lines)? (Figure: points approximated by a few line segments in the xy-plane.)

Segmented Least Squares. Segmented least squares. Points lie roughly on a sequence of several line segments. Given n points in the plane (x_1, y_1), (x_2, y_2), ..., (x_n, y_n) with x_1 < x_2 < ... < x_n, find a sequence of lines that minimizes: the sum E of the sums of the squared errors in each segment, and the number of lines L. Tradeoff function: E + cL, for some constant c > 0.

Dynamic Programming: Multiway Choice. Notation. OPT(j) = minimum cost for points p_1, p_2, ..., p_j. e(i, j) = minimum sum of squares for points p_i, p_{i+1}, ..., p_j. To compute OPT(j): the last segment uses points p_i, p_{i+1}, ..., p_j for some i, at cost e(i, j) + c + OPT(i-1). Recurrence: OPT(j) = 0 if j = 0; OPT(j) = min over 1 ≤ i ≤ j of ( e(i, j) + c + OPT(i-1) ) otherwise.

Segmented Least Squares: Algorithm.

INPUT: n, p_1, ..., p_n, c

Segmented-Least-Squares() {
   M[0] = 0
   for j = 1 to n
      for i = 1 to j
         compute the least squares error e_ij for the segment p_i, ..., p_j
   for j = 1 to n
      M[j] = min_{1 ≤ i ≤ j} (e_ij + c + M[i-1])
   return M[n]
}

Running time. O(n³): the bottleneck is computing e(i, j) for O(n²) pairs, at O(n) per pair using the previous formula. This can be improved to O(n²) by pre-computing various statistics.
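Here is an illustrative Python sketch of the O(n³) algorithm above (the function names are assumptions, not from the slides); e(i, j) is recomputed from scratch for each pair, which is exactly the bottleneck the slide mentions.

def segmented_least_squares(points, c):
    # points: list of (x, y) sorted by x-coordinate; c: penalty per segment
    n = len(points)
    xs = [x for x, y in points]
    ys = [y for x, y in points]

    def seg_error(i, j):
        # least squares error for points i..j (0-indexed, inclusive)
        m = j - i + 1
        sx = sum(xs[i:j + 1]); sy = sum(ys[i:j + 1])
        sxy = sum(xs[k] * ys[k] for k in range(i, j + 1))
        sxx = sum(xs[k] * xs[k] for k in range(i, j + 1))
        denom = m * sxx - sx * sx
        if denom == 0:                     # single point: error is zero (x's are distinct)
            return 0.0
        a = (m * sxy - sx * sy) / denom
        b = (sy - a * sx) / m
        return sum((ys[k] - a * xs[k] - b) ** 2 for k in range(i, j + 1))

    M = [0.0] * (n + 1)                    # M[j] = min cost of the first j points
    for j in range(1, n + 1):
        M[j] = min(seg_error(i - 1, j - 1) + c + M[i - 1] for i in range(1, j + 1))
    return M[n]

Pre-computing prefix sums of x, y, xy, x², and y² makes each e(i, j) an O(1) computation, which gives the O(n²) improvement mentioned on the slide.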

6.5 RNA Secondary Structure

RNA Secondary Structure. RNA. String B = b_1 b_2 ... b_n over the alphabet { A, C, G, U }. Secondary structure. RNA is single-stranded, so it tends to loop back and form base pairs with itself. This structure is essential for understanding the behavior of the molecule. (Figure: an example RNA string folded into a secondary structure; the complementary base pairs are A-U and C-G.)

RNA Secondary Structure. Secondary structure. A set of pairs S = { (b_i, b_j) } that satisfies: [Watson-Crick.] S is a matching and each pair in S is a Watson-Crick complement: A-U, U-A, C-G, or G-C. [No sharp turns.] The ends of each pair are separated by at least 4 intervening bases: if (b_i, b_j) ∈ S, then i < j - 4. [Non-crossing.] If (b_i, b_j) and (b_k, b_l) are two pairs in S, then we cannot have i < k < j < l. Free energy. The usual hypothesis is that an RNA molecule will form the secondary structure with the optimum total free energy (approximated here by the number of base pairs). Goal. Given an RNA molecule B = b_1 b_2 ... b_n, find a secondary structure S that maximizes the number of base pairs.

RNA Secondary Structure: Examples. Examples. (Figure: three candidate structures; the first is OK, the second violates the no-sharp-turns condition, and the third violates the non-crossing condition.)

RNA Secondary Structure: Subproblems. First attempt. OPT(j) = maximum number of base pairs in a secondary structure of the substring b_1 b_2 ... b_j. Difficulty. Suppose b_t is matched with b_n for some 1 ≤ t < n. This results in two sub-problems: finding a secondary structure in b_1 b_2 ... b_{t-1}, which is OPT(t-1), and finding a secondary structure in b_{t+1} b_{t+2} ... b_{n-1}, which needs more sub-problems.

Dynamic Programming Over Intervals. Notation. OPT(i, j) = maximum number of base pairs in a secondary structure of the substring b_i b_{i+1} ... b_j. Case 1. If i ≥ j - 4: OPT(i, j) = 0 by the no-sharp-turns condition. Case 2. Base b_j is not involved in a pair: OPT(i, j) = OPT(i, j-1). Case 3. Base b_j pairs with b_t for some i ≤ t < j - 4: the non-crossing constraint decouples the resulting sub-problems, so OPT(i, j) = 1 + max_t { OPT(i, t-1) + OPT(t+1, j-1) }, where the max is over t such that i ≤ t < j - 4 and b_t and b_j are Watson-Crick complements. Remark. The same core idea appears in the CKY algorithm for parsing context-free grammars.

Bottom-Up Dynamic Programming Over Intervals. Q. In what order should we solve the sub-problems? A. Do the shortest intervals first.

RNA(b_1, ..., b_n) {
   for k = 5, 6, ..., n-1
      for i = 1, 2, ..., n-k
         j = i + k
         Compute M[i, j] using the recurrence
   return M[1, n]
}

(Figure: the table M[i, j] is filled in order of increasing interval length k = j - i.)

Running time. O(n³).
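A direct Python translation of this bottom-up scheme, using the Case 1-3 recurrence from the previous slide, might look like the following sketch (the function name and the 0-indexed table are illustrative choices, not from the slides).

def rna_secondary_structure(b):
    # b: RNA string over {'A', 'C', 'G', 'U'}; returns max number of base pairs
    n = len(b)
    pairs = {('A', 'U'), ('U', 'A'), ('C', 'G'), ('G', 'C')}
    M = [[0] * n for _ in range(n)]        # M[i][j] = OPT for substring b[i..j]
    for k in range(5, n):                  # interval length j - i, shortest first
        for i in range(0, n - k):
            j = i + k
            best = M[i][j - 1]             # Case 2: b[j] is not paired
            for t in range(i, j - 4):      # Case 3: b[j] pairs with b[t]
                if (b[t], b[j]) in pairs:
                    left = M[i][t - 1] if t > i else 0
                    best = max(best, 1 + left + M[t + 1][j - 1])
            M[i][j] = best
    return M[0][n - 1] if n > 0 else 0

print(rna_secondary_structure("ACCGGUAGU"))   # 2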

Dynamic Programming Summary. Recipe. Characterize the structure of the problem. Recursively define the value of an optimal solution. Compute the value of an optimal solution. Construct an optimal solution from the computed information. Dynamic programming techniques. Binary choice: weighted interval scheduling. Multi-way choice: segmented least squares (the Viterbi algorithm for HMMs also uses DP to optimize a maximum-likelihood tradeoff between parsimony and accuracy). Dynamic programming over intervals: RNA secondary structure (the CKY parsing algorithm for context-free grammars has a similar structure). Top-down vs. bottom-up: different people have different intuitions.