CS483 Design and Analysis of Algorithms

CS483 Design and Analysis of Algorithms
Lectures 15-16: Dynamic Programming

Instructor: Fei Li, lifei@cs.gmu.edu (email subject: CS483)
Office hours: STII, Room 443, Friday 4:00pm - 6:00pm, or by appointment
Course web-site: http://www.cs.gmu.edu/~lifei/teaching/cs483_fall08/
Unattributed figures are from the books Algorithms and Introduction to Algorithms

Dynamic Programming
1 Shortest Path
2 Longest Increasing Subsequences
3 Edit Distance
4 Knapsack
5 Chain Matrix Multiplication
6 Independent Sets in Trees

Algorithmic Paradigms
1 Greedy: Build up a solution incrementally, optimizing some local criterion at each step.
2 Divide-and-conquer: Break a problem into sub-problems, solve each sub-problem independently, and combine the solutions to the sub-problems to form a solution to the original problem.
3 Dynamic programming: Identify a collection of subproblems and tackle them one by one, smallest first, using the answers to smaller subproblems to help figure out larger ones, until all of them are solved.

Shortest Paths in Directed Acyclic Graphs (DAGs)
Remark The special distinguishing feature of a DAG is that its nodes can be linearized (topologically ordered). In the example, dist(d) = min{dist(b) + 1, dist(c) + 3}. The algorithm solves a collection of subproblems, {dist(u) : u ∈ V}, starting with the smallest of them, dist(s) = 0.
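The scheme above can be sketched in Python. This is a minimal illustration, not code from the lecture: nodes are assumed to be numbered 0..n-1 already in topological order, so each dist(u) is final before any edge out of u is relaxed.

```python
from collections import defaultdict

def dag_shortest_paths(n, edges, s):
    """Shortest path lengths from s in a DAG whose nodes 0..n-1
    are assumed to be given in topologically sorted order."""
    INF = float("inf")
    adj = defaultdict(list)      # u -> list of (v, weight)
    for u, v, w in edges:
        adj[u].append((v, w))
    dist = [INF] * n
    dist[s] = 0                  # smallest subproblem: dist(s) = 0
    for u in range(n):           # linearized order: predecessors already done
        if dist[u] == INF:       # u unreachable from s
            continue
        for v, w in adj[u]:
            dist[v] = min(dist[v], dist[u] + w)
    return dist
```

Each edge is relaxed exactly once, so the whole collection of subproblems is solved in O(|V| + |E|) time.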

Some Thoughts on Dynamic Programming
Remark
1 In dynamic programming, we are not given a DAG; the DAG is implicit.
2 Its nodes are the subproblems we define, and its edges are the dependencies between the subproblems: if solving subproblem B requires the answer to subproblem A, then there is a (conceptual) edge from A to B. In this case, A is thought of as a smaller subproblem than B, and it will always be smaller in an obvious sense.
Problem How do we compute the values of these subproblems in dynamic programming?
Solution?

Longest Increasing Subsequences
Definition The input is a sequence of numbers a_1, ..., a_n. A subsequence is any subset of these numbers taken in order, of the form a_{i_1}, a_{i_2}, ..., a_{i_k} where 1 ≤ i_1 < i_2 < ... < i_k ≤ n, and an increasing subsequence is one in which the numbers are getting strictly larger. The task is to find the increasing subsequence of greatest length.

Longest Increasing Subsequences
Form a DAG with a node for each a_j and an edge (i, j) whenever i < j and a_i < a_j, and let L(j) be the length of the longest increasing subsequence ending at a_j:

function inc-subsequence(G = (V, E))
  for j = 1, 2, ..., n:
    L(j) = 1 + max{L(i) : (i, j) ∈ E};
  return max_j L(j);

Longest Increasing Subsequences
Remark There is an ordering on the subproblems, and a relation that shows how to solve a subproblem given the answers to smaller subproblems, that is, subproblems that appear earlier in the ordering.
Theorem The algorithm runs in polynomial time, O(n^2).
Proof. Each L(j) = 1 + max{L(i) : (i, j) ∈ E} takes O(n) time to evaluate, and there are n of them.
Problem Why not use recursion? For example, compute L(j) = 1 + max{L(1), L(2), ..., L(j - 1) over the valid predecessors} top-down.
Solution Bottom-up versus (top-down + divide-and-conquer): naive recursion re-solves the same subproblems over and over, while the bottom-up table solves each subproblem exactly once.
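The bottom-up table-filling can be sketched in Python; this is an illustrative version (function name is mine), with the edge test (i, j) ∈ E written inline as i < j and a[i] < a[j]:

```python
def longest_increasing_subsequence(a):
    """L[j] = length of the longest increasing subsequence ending at a[j].
    L[j] = 1 + max{L[i] : i < j and a[i] < a[j]}, filled bottom-up in O(n^2)."""
    n = len(a)
    L = [1] * n                      # every element alone is a subsequence
    for j in range(n):
        for i in range(j):           # all smaller subproblems already solved
            if a[i] < a[j]:
                L[j] = max(L[j], L[i] + 1)
    return max(L, default=0)
```

On the input 5, 2, 8, 6, 3, 6, 9, 7 the answer is 4 (e.g. the subsequence 2, 3, 6, 9).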

Announcements
1 Assignment 7 is released today. The due date is Nov. 19th.

Edit Distance (from Wayne's slides on Algorithm Design)
Problem If we use the brute-force method, how many alignments do we have?
Solution?

Edit Distance
1 Goal. Given two strings X = x_1 x_2 ... x_m and Y = y_1 y_2 ... y_n, find an alignment of minimum cost. Call this problem E(m, n).
2 Subproblem E(i, j). Define diff(i, j) = 0 if x_i = y_j and diff(i, j) = 1 otherwise. Then
E(i, j) = min{1 + E(i - 1, j), 1 + E(i, j - 1), diff(i, j) + E(i - 1, j - 1)}

function edit-distance(X, Y)
  for i = 0, 1, 2, ..., m: E(i, 0) = i;
  for j = 1, 2, ..., n: E(0, j) = j;
  for i = 1, 2, ..., m:
    for j = 1, 2, ..., n:
      E(i, j) = min{E(i - 1, j) + 1, E(i, j - 1) + 1, E(i - 1, j - 1) + diff(i, j)};
  return E(m, n);

Edit Distance
Theorem edit-distance runs in time O(mn).
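The E(i, j) recurrence and table translate directly into Python; a minimal sketch (0-based indexing, so x[i - 1] plays the role of x_i):

```python
def edit_distance(x, y):
    """E[i][j] = minimum cost of aligning x[:i] with y[:j]."""
    m, n = len(x), len(y)
    E = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        E[i][0] = i                               # delete all of x[:i]
    for j in range(n + 1):
        E[0][j] = j                               # insert all of y[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            diff = 0 if x[i - 1] == y[j - 1] else 1
            E[i][j] = min(E[i - 1][j] + 1,        # delete x_i
                          E[i][j - 1] + 1,        # insert y_j
                          E[i - 1][j - 1] + diff) # match or substitute
    return E[m][n]
```

Filling the (m + 1) x (n + 1) table takes constant work per entry, which is the O(mn) bound of the theorem.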

The Underlying DAG
Remark Every dynamic program has an underlying DAG structure: think of each node as representing a subproblem and each edge as a precedence constraint on the order in which the subproblems can be tackled. Having nodes u_1, ..., u_k point to v means subproblem v can only be solved once the answers to u_1, u_2, ..., u_k are known.
Remark Finding the right subproblems takes creativity and experimentation.

Solving Problems Using the Dynamic Programming Approach
1 What is a subproblem? Can you define it clearly?
2 What is the relation between a smaller subproblem and a larger subproblem? Can we get the solution of the larger one from the smaller ones? What are the dependencies between them, i.e., what is the DAG? Is there a relationship between the optimality of a smaller subproblem and that of a larger subproblem?
3 How do we solve this problem? What is the running-time complexity?

Knapsack
Knapsack Problem Given n objects, where each object i has weight w_i and value v_i, and a knapsack of capacity W, find the most valuable set of items that fits into the knapsack.
http://en.wikipedia.org/wiki/Knapsack_problem

Knapsack Problem
1 What will you do if all items are splittable?
2 What will you do if some items are splittable?
3 What will you do if all items are non-splittable?
4 What is the subproblem, i.e., the node in the DAG?
5 Do we always get a polynomial-time algorithm when we use the dynamic programming approach?

Knapsack Problem
Subproblem: K(w, j) = maximum value achievable using a knapsack of capacity w and items 1, 2, ..., j.
Goal: K(W, n).

function knapsack(W, S)
  Initialize all K(0, j) = 0 and all K(w, 0) = 0;
  for j = 1 to n:
    for w = 1 to W:
      if (w_j > w) K(w, j) = K(w, j - 1);
      else K(w, j) = max{K(w, j - 1), K(w - w_j, j - 1) + v_j};
  return K(W, n);
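The K(w, j) table from the slide, sketched in Python for the non-splittable (0/1) case; the items-as-pairs representation is my own convention:

```python
def knapsack(W, items):
    """0/1 knapsack. items is a list of (weight, value) pairs.
    K[w][j] = max value using capacity w and items 1..j."""
    n = len(items)
    K = [[0] * (n + 1) for _ in range(W + 1)]   # K(0, j) = K(w, 0) = 0
    for j in range(1, n + 1):
        wj, vj = items[j - 1]
        for w in range(1, W + 1):
            if wj > w:
                K[w][j] = K[w][j - 1]           # item j does not fit
            else:
                K[w][j] = max(K[w][j - 1],                # leave item j out
                              K[w - wj][j - 1] + vj)      # put item j in
    return K[W][n]
```

Note the running time O(nW) is polynomial in the numeric value W, not in its bit length, which is why this does not settle question 5 above in the affirmative.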

(Figure from Wayne's slides on Algorithm Design.)

Chain Matrix Multiplication
Let A_1, A_2, ..., A_n be matrices, where A_i has dimensions m_{i-1} × m_i, and define
C(i, j) = minimum cost of multiplying A_i × A_{i+1} × ... × A_j.
Then
C(i, j) = min_{i ≤ k < j} { C(i, k) + C(k + 1, j) + m_{i-1} · m_k · m_j }.
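The C(i, j) recurrence fills in by increasing subchain length; a minimal Python sketch (the dimension-list encoding m[0..n] is an assumption of this example):

```python
def matrix_chain_cost(m):
    """m[0..n] are dimensions: matrix A_i is m[i-1] x m[i].
    C[i][j] = min scalar multiplications to compute A_i ... A_j."""
    n = len(m) - 1
    C = [[0] * (n + 1) for _ in range(n + 1)]   # C(i, i) = 0
    for s in range(1, n):                       # s = j - i, the subchain span
        for i in range(1, n - s + 1):
            j = i + s
            # split at every k with i <= k < j; smaller spans already solved
            C[i][j] = min(C[i][k] + C[k + 1][j] + m[i - 1] * m[k] * m[j]
                          for k in range(i, j))
    return C[1][n]
```

For dimensions 50x20, 20x1, 1x10, 10x100 the optimal cost is 7000, achieved by (A_1 A_2)(A_3 A_4).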

Traveling Salesman Problem
Definition (TSP). Starting from his hometown, suitcase in hand, a salesman will conduct a journey in which each of his target cities is visited exactly once before he returns home. Given the pairwise distances between cities, what is the best order in which to visit them, so as to minimize the overall distance traveled?
Subproblem. Let C(S, j) be the length of the shortest path visiting each node in S exactly once, starting at 1 and ending at j.
Relation. C(S, j) = min_{i ∈ S, i ≠ j} { C(S − {j}, i) + d_ij }.
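This C(S, j) recurrence (the Held-Karp algorithm) can be sketched in Python; here the home city is numbered 0 rather than 1, and the frozenset keys are an implementation choice of this sketch:

```python
from itertools import combinations

def tsp(d):
    """Held-Karp DP. d[i][j] = distance between cities i and j; city 0 is home.
    C[(S, j)] = length of shortest path starting at 0, visiting every
    city in S exactly once, and ending at j (with 0 in S)."""
    n = len(d)
    C = {(frozenset([0]), 0): 0}                    # base case: path of one city
    for size in range(2, n + 1):                    # grow S one city at a time
        for subset in combinations(range(1, n), size - 1):
            S = frozenset(subset) | {0}
            for j in subset:
                # predecessor i is the second-to-last city on the path
                C[(S, j)] = min(C[(S - {j}, i)] + d[i][j]
                                for i in S - {j}
                                if i != 0 or size == 2)
    full = frozenset(range(n))
    return min(C[(full, j)] + d[j][0] for j in range(1, n))
```

There are O(2^n · n) subproblems and each takes O(n) time, so the total is O(2^n · n^2): exponential, but far better than the n! of brute force.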

Independent Sets
Definition A subset of nodes S ⊆ V is an independent set of graph G = (V, E) if there are no edges between nodes in S.
Goal: Find the largest independent set.
Known: This problem is believed to be intractable.

Independent Sets in Trees
Let I(u) = size of the largest independent set of the subtree hanging from u. Then
I(u) = max{ 1 + Σ_{grandchildren w of u} I(w), Σ_{children w of u} I(w) }.
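The two cases of I(u), whether u itself is in the set or not, can be sketched in Python with memoization; the children-list representation of the rooted tree is an assumption of this example:

```python
from functools import lru_cache

def max_independent_set_tree(children, root=0):
    """children[u] = list of u's children in a rooted tree.
    I(u) = max(1 + sum of I over grandchildren of u,   # u is in the set
               sum of I over children of u)            # u is not in the set
    Memoization makes each I(u) computed once, giving O(|V| + |E|) total."""
    @lru_cache(maxsize=None)
    def I(u):
        with_u = 1 + sum(I(w) for c in children[u] for w in children[c])
        without_u = sum(I(c) for c in children[u])
        return max(with_u, without_u)
    return I(root)
```

On a path of five nodes the answer is 3 (take the endpoints and the middle node).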

Independent Sets in Trees
Theorem The running time of using dynamic programming to find the largest independent set in a tree is O(|V| + |E|).
Proof. The number of subproblems is exactly the number of vertices, and each edge of the tree is examined a constant number of times across all subproblems.