Algorithms and Theory of Computation. Lecture 9: Dynamic Programming


Algorithms and Theory of Computation. Lecture 9: Dynamic Programming. Xiaohui Bei. MAS 714, Nanyang Technological University, September 10, 2018.

Recursion in Algorithm Design. Divide and Conquer: the problem is reduced to multiple independent subproblems that are solved separately; a conquer step combines their solutions into a solution for the bigger problem. Examples: merge sort, quick sort, closest pair, etc. Dynamic Programming: the problem is reduced to multiple (typically) dependent or overlapping subproblems. Memoization avoids recomputing solutions to common subproblems, leading to an iterative bottom-up algorithm.

Fibonacci Numbers. Fibonacci numbers are defined by the recurrence F(n) = F(n-1) + F(n-2) with F(0) = 0, F(1) = 1. These numbers have many interesting and amazing properties; there is even a journal devoted to them, The Fibonacci Quarterly. Closed form: F(n) = (φ^n - (1-φ)^n) / √5, where φ = (1 + √5)/2 ≈ 1.618 is the golden ratio. Moreover, lim_{n→∞} F(n+1)/F(n) = φ.
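As a quick numerical illustration (not part of the slides), one can check that the ratio of consecutive Fibonacci numbers converges to the golden ratio:

```python
import math

def fib(n):
    """Compute F(n) iteratively."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

phi = (1 + math.sqrt(5)) / 2
ratio = fib(31) / fib(30)
# Convergence is very fast (the error shrinks like phi^(-2n)),
# so by n = 30 the ratio already agrees with phi to many digits.
print(abs(ratio - phi) < 1e-10)  # True
```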

Fibonacci Numbers. Given n, compute F(n). First try:

Algorithm Fib(n):
  if n = 0 then return 0
  else if n = 1 then return 1
  else return Fib(n-1) + Fib(n-2)

Fibonacci Numbers. Running time analysis: T(0) = T(1) = 1 and T(n) = T(n-1) + T(n-2) + 1, so T(n) = Θ(φ^n), roughly the same as F(n) itself. The number of additions is exponential in n. Can we do better?

Recursion with Memoization. An iterative algorithm:

Algorithm FibIter(n):
  if n = 0 then return 0
  else if n = 1 then return 1
  F[0] = 0; F[1] = 1
  for i = 2 to n do F[i] = F[i-1] + F[i-2]
  return F[n]

Running time: O(n) additions.
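The FibIter pseudocode translates almost line for line into Python; a minimal sketch:

```python
def fib_iter(n):
    """Bottom-up Fibonacci, following the FibIter pseudocode: O(n) additions."""
    if n == 0:
        return 0
    if n == 1:
        return 1
    f = [0] * (n + 1)          # table of subproblem values F[0..n]
    f[1] = 1
    for i in range(2, n + 1):
        f[i] = f[i - 1] + f[i - 2]
    return f[n]

print(fib_iter(10))  # 55
```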

Intuition. The recursive algorithm performs the same computation again and again, while the iterative algorithm stores computed values and builds the final value bottom-up: memoization. Dynamic programming: find a recursion for the problem that can be effectively/efficiently memoized.

Automatic Memoization. Can we convert the recursive algorithm into an efficient algorithm without explicitly writing an iterative one?

Algorithm Fib(n):
  if n = 0 then return 0
  else if n = 1 then return 1
  if Fib(n) was previously computed then return the stored value of Fib(n)
  else compute Fib(n-1) + Fib(n-2), store the result, and return it

How do we keep track of previously computed values? Explicitly, or implicitly (via a data structure).
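In Python the "implicit via a data structure" option comes for free: the standard-library decorator functools.lru_cache stores previously computed return values in a hidden table, turning the naive recursion into an O(n)-addition algorithm without any restructuring:

```python
from functools import lru_cache

@lru_cache(maxsize=None)   # cache = the data structure tracking computed values
def fib_memo(n):
    """The naive recursion, made efficient by automatic memoization."""
    if n <= 1:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)

print(fib_memo(50))  # 12586269025
```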

Polynomial Time Issues. Is the iterative algorithm a polynomial-time algorithm? No: the input is n, so the input size is Θ(log n), while the output is F(n), whose size is Θ(n) bits. The output size is exponential in the input size, so no polynomial-time algorithm is possible!
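The Θ(n)-bit output size can be observed directly: since F(n) ≈ φ^n / √5, its bit length grows like n · log2(φ) ≈ 0.694 n. A small check (my own illustration, not from the slides):

```python
def fib(n):
    """Iterative Fibonacci."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

# Bit length of F(n) grows linearly in n, at rate ~0.694 bits per step.
for n in (100, 200, 400):
    print(n, fib(n).bit_length())
```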

Longest Common Subsequence (LCS). A sequence is an ordered list a_1, a_2, ..., a_n; its length is the number of elements in the list. A sequence a_{i_1}, ..., a_{i_k} is a subsequence of a_1, ..., a_n if 1 ≤ i_1 < i_2 < ... < i_k ≤ n.

Longest Common Subsequence (LCS): given two sequences X = x_1, x_2, ..., x_m and Y = y_1, y_2, ..., y_n, find the length of a longest sequence Z that is a subsequence of both X and Y.

Example. Sequence: X = 6, 3, 5, 2, 7, 8, 1, 9. Subsequences: 5, 2, 1 and 3, 7, 1, 9, etc.

Applications: Unix diff, speech recognition, computational biology, ...

Naive Enumeration. Assume X and Y are stored in two arrays X[1..m], Y[1..n].

Algorithm LCSNaive(X[1..m], Y[1..n]):
  ans = 0
  foreach subsequence Z of X do
    if Z is also a subsequence of Y and |Z| > ans then ans = |Z|
  return ans

Running time: O((n + m) · 2^m), since X has 2^m subsequences and checking whether Z is a subsequence of Y takes O(m + n) time.
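A sketch of this brute-force enumeration in Python, enumerating the 2^m subsequences of X via bitmasks (only viable for tiny inputs, which is exactly the point of the slide):

```python
def is_subsequence(z, y):
    """O(len(y)) subsequence check: scan y once, matching z in order."""
    it = iter(y)
    return all(c in it for c in z)   # 'c in it' consumes the iterator

def lcs_naive(x, y):
    """Try all 2^m subsequences of x -- exponential running time."""
    m = len(x)
    best = 0
    for mask in range(2 ** m):
        z = [x[i] for i in range(m) if mask >> i & 1]
        if len(z) > best and is_subsequence(z, y):
            best = len(z)
    return best

print(lcs_naive("BABAC", "CABC"))  # 3
```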

Problem Structure. Let Z[1..k] be an LCS of sequences X[1..m] and Y[1..n].
1. If x_m = y_n, then z_k = x_m = y_n, and Z[1..k-1] is an LCS of X[1..m-1] and Y[1..n-1] (why? a greedy exchange argument).
2. If x_m ≠ y_n and z_k ≠ x_m, then Z is an LCS of X[1..m-1] and Y[1..n].
3. If x_m ≠ y_n and z_k ≠ y_n, then Z is an LCS of X[1..m] and Y[1..n-1].

Let C[i, j] denote the length of an LCS of X[1..i] and Y[1..j]. Then:

C[i, j] = 0                               if i = 0 or j = 0
C[i, j] = C[i-1, j-1] + 1                 if x_i = y_j
C[i, j] = max{ C[i, j-1], C[i-1, j] }     otherwise

Dynamic Programming for LCS.

Algorithm LCS(X[1..m], Y[1..n]):
  for i = 0 to m do C[i, 0] = 0
  for j = 0 to n do C[0, j] = 0
  for i = 1 to m do
    for j = 1 to n do
      if X[i] = Y[j] then C[i, j] = C[i-1, j-1] + 1
      else C[i, j] = max{ C[i, j-1], C[i-1, j] }
  return C[m, n]

Running time: O(mn). Space: O(mn).
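A direct Python transcription of the LCS pseudocode (0-based indexing, so X[i] in the recurrence becomes x[i-1]):

```python
def lcs_length(x, y):
    """Length of an LCS of x and y via the C[i, j] recurrence: O(mn) time/space."""
    m, n = len(x), len(y)
    c = [[0] * (n + 1) for _ in range(m + 1)]   # row 0 and column 0 stay 0
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if x[i - 1] == y[j - 1]:
                c[i][j] = c[i - 1][j - 1] + 1
            else:
                c[i][j] = max(c[i][j - 1], c[i - 1][j])
    return c[m][n]

print(lcs_length("BABAC", "CABC"))  # 3
```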

LCS: the Sequence Itself. How do we recover an actual LCS, not just its length? Create another array A of arrows during the computation:
- x_i = y_j: the arrow goes diagonally, up and to the left;
- x_i ≠ y_j and C[i, j] = C[i, j-1]: the arrow goes left;
- x_i ≠ y_j and C[i, j] = C[i-1, j]: the arrow goes up.
To construct the LCS, follow the arrows starting at C[m, n]; the diagonal arrows mark the elements of the LCS.

LCS Example.

        B  A  B  A  C
     0  0  0  0  0  0
  C  0  0  0  0  0  1
  A  0  0  1  1  1  1
  B  0  1  1  2  2  2
  C  0  1  1  2  2  3

LCS(BABAC, CABC) = ABC.
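The table-plus-traceback procedure can be sketched in Python; rather than storing a separate arrow array, this version recomputes the arrow direction from the C table during the traceback, which gives the same walk:

```python
def lcs(x, y):
    """Fill the DP table, then trace back from C[m][n] to recover one LCS."""
    m, n = len(x), len(y)
    c = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if x[i - 1] == y[j - 1]:
                c[i][j] = c[i - 1][j - 1] + 1
            else:
                c[i][j] = max(c[i][j - 1], c[i - 1][j])
    # Traceback: a diagonal step marks an element of the LCS.
    out, i, j = [], m, n
    while i > 0 and j > 0:
        if x[i - 1] == y[j - 1]:
            out.append(x[i - 1])        # diagonal arrow
            i, j = i - 1, j - 1
        elif c[i - 1][j] >= c[i][j - 1]:
            i -= 1                      # up arrow
        else:
            j -= 1                      # left arrow
    return "".join(reversed(out))

print(lcs("BABAC", "CABC"))  # ABC
```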

Dynamic Programming: Summary.
1. Find a smart recursion for the problem in which the number of distinct subproblems is small, i.e., polynomial in the original problem size.
2. Order the subproblems naturally from smallest to largest and compute them bottom-up, storing the intermediate values in an appropriate data structure.
3. The total running time is bounded by the number of subproblems times the time to evaluate each subproblem; the space is what is needed to store their values.

Shortest Paths.

Shortest Paths Problem: given a directed weighted graph G = (V, E) with arbitrary edge lengths, where l(e) = l(u, v) denotes the length of edge e = (u, v), and given vertices s, t, find a shortest path from s to t.

[Figure: an example directed weighted graph on vertices a-h.]

Negative Lengths. Dijkstra's algorithm can fail if there are negative edge lengths.

[Figure: a small graph with a negative-length edge on which Dijkstra's algorithm fails.]

False assumption: Dijkstra's algorithm relies on the assumption that if s = v_0 → v_1 → ... → v_k = v is a shortest path from s to v, then dist(s, v_i) ≤ dist(s, v) for all 1 ≤ i < k. This holds only for non-negative edge lengths.
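The failure is easy to reproduce. Below is a textbook heap-based Dijkstra run on a small hypothetical graph of my own (not the one in the slide's figure): s→a of length 1, s→b of length 5, b→a of length -10. The true distance to a is 5 - 10 = -5, but Dijkstra finalizes a at distance 1 before the negative edge is ever considered:

```python
import heapq

def dijkstra(graph, s):
    """Textbook Dijkstra with a 'finalized' set; correct only for non-negative lengths."""
    dist = {s: 0}
    done = set()
    pq = [(0, s)]
    while pq:
        d, u = heapq.heappop(pq)
        if u in done:
            continue
        done.add(u)                      # u's distance is (wrongly) considered final
        for v, w in graph.get(u, []):
            if v not in done and d + w < dist.get(v, float("inf")):
                dist[v] = d + w
                heapq.heappush(pq, (dist[v], v))
    return dist

g = {"s": [("a", 1), ("b", 5)], "b": [("a", -10)]}
print(dijkstra(g, "s")["a"])  # 1, although the true distance is -5
```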

Negative Cycles. A negative cycle is a directed cycle whose edge lengths sum to a negative value.

[Figure: an example graph containing a negative cycle.]

Shortest Paths and Negative Cycles.

Lemma 1. If some path from s to t contains a negative cycle, then there does not exist a shortest path from s to t.
Proof. If such a cycle exists, one can build an s-t path of arbitrarily negative length by going around the cycle as many times as desired.

Lemma 2. If G has no negative cycles, then there exists a shortest path from s to t that is simple.
Proof. Consider a shortest s-t path P that uses the fewest edges. If P contained a cycle C, the portion of P corresponding to C could be removed without increasing the total length, contradicting the choice of P.

Shortest Path and Negative Cycle Problems.

Negative Cycle Problem: given a directed weighted graph G = (V, E) with arbitrary edge lengths, find a negative cycle (if one exists).

Shortest Path Problem: given a directed weighted graph G = (V, E) with arbitrary edge lengths and no negative cycles, find a shortest path from s to t.