Lecture #8: Dynamic Programming (Chapter 15)

We now take up the concept of dynamic programming again using examples.

Example 1 (Chapter 15.2) Matrix Chain Multiplication

INPUT: An ordered set of matrices [A_1, A_2, ..., A_n], with A_i of size p_{i-1} x p_i (given as the vector p = [p_0, p_1, ..., p_n]).
OUTPUT: An optimal order (parenthesization) in which to multiply these matrices, i.e., one that minimizes the total number of scalar multiplications.

MATRIX-CHAIN-ORDER(p)
    n ← length(p) - 1
    for i ← 1 to n do
        m[i, i] ← 0
    for l ← 2 to n do
        for i ← 1 to n - l + 1 do
            j ← i + l - 1
            m[i, j] ← ∞
            for k ← i to j - 1 do
                q ← m[i, k] + m[k+1, j] + p_{i-1} p_k p_j
                if q < m[i, j] then
                    m[i, j] ← q;  s[i, j] ← k
    return m and s

Given a parenthesization for a matrix chain, we have a corresponding binary tree whose leaves correspond to the matrices, and conversely. For example, the parenthesization ((A_1)((A_2 A_3)((A_4 A_5)(A_6)))) [also written as (A_1)((A_2 A_3)((A_4 A_5)(A_6)))] corresponds to the following tree:

[Figure: the binary tree for this parenthesization; its leaves, from left to right, are A_1, A_2, A_3, A_4, A_5, A_6.]
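To make the recurrence concrete, here is a short Python sketch of the same O(n^3) procedure (the function name matrix_chain_order and the example dimension vector are chosen here for illustration and are not from the notes):

```python
import math

def matrix_chain_order(p):
    """m[i][j] = minimum number of scalar multiplications for A_i ... A_j,
    where A_i has dimensions p[i-1] x p[i]; s[i][j] records the best split."""
    n = len(p) - 1
    m = [[0] * (n + 1) for _ in range(n + 1)]   # 1-based tables, as above
    s = [[0] * (n + 1) for _ in range(n + 1)]
    for l in range(2, n + 1):                   # chain length
        for i in range(1, n - l + 2):
            j = i + l - 1
            m[i][j] = math.inf
            for k in range(i, j):               # split (A_i..A_k)(A_{k+1}..A_j)
                q = m[i][k] + m[k + 1][j] + p[i - 1] * p[k] * p[j]
                if q < m[i][j]:
                    m[i][j], s[i][j] = q, k
    return m, s

# Example: four matrices of sizes 10x20, 20x5, 5x30, 30x8 (arbitrary dimensions).
m, s = matrix_chain_order([10, 20, 5, 30, 8])
print(m[1][4], s[1][4])   # 2600 2, i.e., the optimal order is (A1 A2)(A3 A4)
```

The split indices stored in s can be followed recursively to print an optimal parenthesization itself.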

Example 2 (Chapter 15.4) Longest Common Subsequence

INPUT: Two sequences X = ⟨x_1, x_2, ..., x_m⟩ and Y = ⟨y_1, y_2, ..., y_n⟩.
OUTPUT: The length matrix c and the (pointer) matrix b.

LCS-LENGTH(X, Y)
    m ← length[X]
    n ← length[Y]
    for i ← 1 to m do
        c[i, 0] ← 0
    for j ← 0 to n do
        c[0, j] ← 0
    for i ← 1 to m do
        for j ← 1 to n do
            if x_i = y_j then
                c[i, j] ← c[i-1, j-1] + 1;  b[i, j] ← "↖"
            else if c[i-1, j] ≥ c[i, j-1] then
                c[i, j] ← c[i-1, j];  b[i, j] ← "↑"
            else
                c[i, j] ← c[i, j-1];  b[i, j] ← "←"
    return c and b
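A Python sketch of LCS-LENGTH together with the usual pointer-following reconstruction of one longest common subsequence (the function names and the sample strings are illustrative choices, not from the notes):

```python
def lcs_length(X, Y):
    """Fill the length table c and the direction table b (1-indexed, as above)."""
    m, n = len(X), len(Y)
    c = [[0] * (n + 1) for _ in range(m + 1)]
    b = [[None] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if X[i - 1] == Y[j - 1]:                 # x_i = y_j
                c[i][j] = c[i - 1][j - 1] + 1
                b[i][j] = "diag"
            elif c[i - 1][j] >= c[i][j - 1]:
                c[i][j] = c[i - 1][j]
                b[i][j] = "up"
            else:
                c[i][j] = c[i][j - 1]
                b[i][j] = "left"
    return c, b

def read_lcs(b, X, i, j):
    """Follow the pointers in b back from (i, j) to recover one LCS."""
    if i == 0 or j == 0:
        return ""
    if b[i][j] == "diag":
        return read_lcs(b, X, i - 1, j - 1) + X[i - 1]
    if b[i][j] == "up":
        return read_lcs(b, X, i - 1, j)
    return read_lcs(b, X, i, j - 1)

c, b = lcs_length("ABCBDAB", "BDCABA")
print(c[7][6], read_lcs(b, "ABCBDAB", 7, 6))   # 4 BCBA
```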

Example 3 (First Edition) Optimal Polygonal Triangulation

INPUT: The ordered vertices of a convex polygon, v_i for 0 ≤ i ≤ n (given as the vector v = [v_0, v_1, ..., v_n]), and a weight function w(△ v_i v_k v_j) on triangles.
OUTPUT: A triangulation of minimum total cost.

POLYGONAL-TRIANGULATION(v)
    n ← length(v) - 1
    for i ← 1 to n do
        m[i, i] ← 0
    for l ← 2 to n do
        for i ← 1 to n - l + 1 do
            j ← i + l - 1
            m[i, j] ← ∞
            for k ← i to j - 1 do
                q ← m[i, k] + m[k+1, j] + w(△ v_{i-1} v_k v_j)
                if q < m[i, j] then
                    m[i, j] ← q;  s[i, j] ← k
    return m and s
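The triangulation DP has exactly the same table structure as MATRIX-CHAIN-ORDER, so a Python sketch is nearly identical; here the weight function is passed in as a parameter (triangle perimeter is used only as an example weight, it is not prescribed by the notes):

```python
import math

def polygonal_triangulation(v, w):
    """Minimum-cost triangulation of the convex polygon v_0, ..., v_n.
    v is a list of (x, y) vertices; w(a, b, c) is the weight of triangle abc."""
    n = len(v) - 1
    m = [[0.0] * (n + 1) for _ in range(n + 1)]
    s = [[0] * (n + 1) for _ in range(n + 1)]
    for l in range(2, n + 1):
        for i in range(1, n - l + 2):
            j = i + l - 1
            m[i][j] = math.inf
            for k in range(i, j):
                # triangle (v_{i-1}, v_k, v_j) is used, splitting the problem at k
                q = m[i][k] + m[k + 1][j] + w(v[i - 1], v[k], v[j])
                if q < m[i][j]:
                    m[i][j], s[i][j] = q, k
    return m, s

def perimeter(a, b, c):
    return math.dist(a, b) + math.dist(b, c) + math.dist(c, a)

# A small convex hexagon with arbitrary coordinates, for illustration only.
hexagon = [(0, 0), (2, -1), (4, 0), (5, 2), (3, 4), (0, 3)]
m, s = polygonal_triangulation(hexagon, perimeter)
print(m[1][5])   # cost of the cheapest triangulation of this hexagon
```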

Consider the following convex polygon and a possible triangulation:

[Figure: a convex polygon on vertices v_0, ..., v_6, whose sides v_0 v_1, ..., v_5 v_6 are labeled A_1, ..., A_6, together with one possible triangulation.]

This triangulation corresponds to the same tree as in the matrix chain multiplication example. Indeed, there is a one-to-one correspondence between triangulations and parenthesizations, and it is this correspondence that is used in the above algorithm.

Example 4 (Chapter 15.5) Binary Search Trees

We are given an ordered sequence K = {k_1, k_2, ..., k_n} of n distinct keys in sorted order, so that k_i < k_{i+1}. We wish to build a binary search tree from these keys. For each key k_i we have a probability p_i that a search will be for k_i. Some searches may be for values not in K, so we also have "dummy" keys d_0, d_1, ..., d_n representing values not in K. In sorted order these interleave as

    d_0 < k_1 < d_1 < k_2 < ... < k_n < d_n,

meaning that d_i represents the values between k_i and k_{i+1}. The probability q_i associated with d_i is the probability that a search will ask for a value between k_i and k_{i+1}; q_0 is the probability that a value less than k_1 is requested, and q_n is the probability that a value greater than k_n is requested.

Recall that in a binary search tree, if y is a node in the left subtree of x and z is a node in the right subtree of x, then key[y] ≤ key[x] ≤ key[z]. If the frequencies with which these keys are requested are uniform, a balanced tree is best for minimizing the average search time. If, however, they are not uniform, it may be advantageous to use a tree that is not balanced in order to minimize the expected number of comparisons. For example, consider:

    i   :   0      1      2      3      4      5
    p_i :         0.15   0.10   0.05   0.10   0.20
    q_i :  0.05   0.10   0.05   0.05   0.05   0.10

Two binary search trees for this example are shown below.

[Figure: two binary search trees. In the first, k_2 is the root with children k_1 and k_4; k_1 has dummy children d_0, d_1; k_4 has children k_3 (with dummies d_2, d_3) and k_5 (with dummies d_4, d_5). In the second, k_2 is the root with children k_1 and k_5; k_1 has dummy children d_0, d_1; k_5 has children k_4 and d_5; k_4 has children k_3 and d_4; and k_3 has dummy children d_2, d_3.]

Their expected search times are 2.40 and 2.35, respectively, where

    E[T] = \sum_{i=1}^{n} p_i [d_T(k_i) + 1] + \sum_{i=0}^{n} q_i d_T(d_i)
         = \sum_{i=1}^{n} p_i + \sum_{i=1}^{n} p_i d_T(k_i) + \sum_{i=0}^{n} q_i d_T(d_i),

and d_T(k_i) and d_T(d_i) denote the depths of these nodes in the tree T, measured from the root. Time here is measured by the number of queries.
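These two costs can be checked mechanically. The snippet below encodes the node depths of both trees by hand (the encoding is mine, not part of the notes) and evaluates E[T]:

```python
# Probabilities from the table above (p[0] is unused; q holds q_0, ..., q_5).
p = [None, 0.15, 0.10, 0.05, 0.10, 0.20]
q = [0.05, 0.10, 0.05, 0.05, 0.05, 0.10]

def expected_cost(key_depth, dummy_depth):
    """E[T] = sum_i p_i (d_T(k_i) + 1) + sum_i q_i d_T(d_i)."""
    return (sum(p[i] * (key_depth[i] + 1) for i in range(1, 6))
            + sum(q[i] * dummy_depth[i] for i in range(6)))

# Depths read off the first tree (root k_2, children k_1 and k_4, ...).
tree1_keys    = {1: 1, 2: 0, 3: 2, 4: 1, 5: 2}
tree1_dummies = {0: 2, 1: 2, 2: 3, 3: 3, 4: 3, 5: 3}

# Depths read off the second tree (root k_2, children k_1 and k_5, ...).
tree2_keys    = {1: 1, 2: 0, 3: 3, 4: 2, 5: 1}
tree2_dummies = {0: 2, 1: 2, 2: 4, 3: 4, 4: 3, 5: 2}

print(round(expected_cost(tree1_keys, tree1_dummies), 2))   # 2.4
print(round(expected_cost(tree2_keys, tree2_dummies), 2))   # 2.35
```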

We now take up the problem of designing a binary search tree by using dynamic programming.

Problem 5 Given an ordered set k_1 < k_2 < ... < k_n of n distinct keys, where the frequency of requests to key k_i is p_i and the frequency for dummy key d_i is q_i, we want a binary search tree T that minimizes

    B_T = \sum_{i=1}^{n} p_i + \sum_{i=1}^{n} p_i d_T(k_i) + \sum_{i=0}^{n} q_i d_T(d_i),

where d_T(k_i) is the depth of key k_i in T and d_T(d_i) is that of d_i.

What is the difference between this and the Huffman trees?

If we select k_j for the root, then [d_0, k_1, ..., k_{j-1}, d_{j-1}] forms the left subtree and [d_j, k_{j+1}, ..., k_n, d_n] forms the right subtree. This gives the recursion for dynamic programming. Let c[i, j] denote the minimum cost of a search tree containing [d_{i-1}, k_i, ..., k_j, d_j]. Then, for i ≤ j,

    c[i, j] = min_{i ≤ r ≤ j} { p_r + c[i, r-1] + \sum_{k=i}^{r-1} p_k + \sum_{k=i-1}^{r-1} q_k + c[r+1, j] + \sum_{k=r+1}^{j} p_k + \sum_{k=r}^{j} q_k }
            = \sum_{k=i}^{j} p_k + \sum_{k=i-1}^{j} q_k + min_{i ≤ r ≤ j} { c[i, r-1] + c[r+1, j] }
            = w(i, j) + min_{i ≤ r ≤ j} { c[i, r-1] + c[r+1, j] },

where w(i, j) = \sum_{k=i}^{j} p_k + \sum_{k=i-1}^{j} q_k. For j < i, c[i, j] = 0.

Please note that this is slightly different from the formulation in the book, because our objective function B_T is different. The table can be computed in O(n^3) time as shown below.

OPTIMAL-BST(p, q, n)
    for i ← 1 to n do
        c[i, i] ← q_{i-1} + p_i + q_i
        w_{i,i} ← q_{i-1} + p_i + q_i
    for l ← 1 to n - 1 do
        for i ← 1 to n - l do
            j ← i + l
            c[i, j] ← ∞
            w_{i,j} ← w_{i,j-1} + p_j + q_j
            for r ← i to j do
                t ← c[i, r-1] + c[r+1, j] + w_{i,j}
                if t ≤ c[i, j] then
                    c[i, j] ← t
                    m[i, j] ← r
    return c and m
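Here is a Python sketch of OPTIMAL-BST, run on the probability table from the example above (the variable names and table layout are my own):

```python
import math

def optimal_bst(p, q, n):
    """O(n^3) DP for the objective B_T above.
    p[1..n] are key probabilities, q[0..n] are dummy-key probabilities."""
    # c[i][j] covers the subproblem [d_{i-1}, k_i, ..., k_j, d_j]; c[i][i-1] = 0.
    c = [[0.0] * (n + 2) for _ in range(n + 2)]
    w = [[0.0] * (n + 2) for _ in range(n + 2)]
    root = [[0] * (n + 2) for _ in range(n + 2)]
    for i in range(1, n + 1):
        c[i][i] = w[i][i] = q[i - 1] + p[i] + q[i]
        root[i][i] = i
    for l in range(1, n):
        for i in range(1, n - l + 1):
            j = i + l
            w[i][j] = w[i][j - 1] + p[j] + q[j]
            c[i][j] = math.inf
            for r in range(i, j + 1):
                t = c[i][r - 1] + c[r + 1][j] + w[i][j]
                if t <= c[i][j]:
                    c[i][j] = t
                    root[i][j] = r
    return c, root

p = [None, 0.15, 0.10, 0.05, 0.10, 0.20]
q = [0.05, 0.10, 0.05, 0.05, 0.05, 0.10]
c, root = optimal_bst(p, q, 5)
print(round(c[1][5], 2))   # 2.35 -- the second tree shown earlier attains the optimum
```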

Actually, c can be computed faster, as shown below. Let m[i, j] be the value of r that determines the minimum in the above expression for c[i, j], and let M := {m[i, j] : i < j}.

The weights {w_{i,j}} are said to satisfy the quadrangle inequality (QI) if

    i ≤ k ≤ j ≤ l  ⇒  w_{i,j} + w_{k,l} ≤ w_{k,j} + w_{i,l},

and they are said to be monotone if

    i ≤ k ≤ l ≤ j  ⇒  w_{k,l} ≤ w_{i,j}.

If w_{i,j} = \sum_{k=i}^{j} p_k + \sum_{k=i-1}^{j} q_k, then the QI holds with equality, and monotonicity is obvious since the probabilities are nonnegative.

Suppose {w_{i,j}} satisfy the quadrangle inequality and the monotonicity condition above, and consider recursion equations of the form

    t[i, i] = 0,
    t[i, j] = f_{i,j} + min_{i < k ≤ j} ( t[i, k-1] + t[k, j] )   for i < j.

Then t can be computed in O(n^2) time. Substituting t[i, j+1] = c[i, j] and f_{i,j+1} = w_{i,j} transforms the BST recurrence into this form.

Lemma 6 If f satisfies QI and monotonicity, then t satisfies QI.

Proof. We want to show that

    i ≤ k ≤ j ≤ l  ⇒  t[i, j] + t[k, l] ≤ t[k, j] + t[i, l].

We proceed by induction on l - i. If l - i = 0, then i = j = k = l and there is nothing to prove. If l - i = 1, then i = k or j = l, and in this case also the inequality holds as an equality. There are several cases to consider.

Case 1 (i < k = j < l). We need to show that t[i, j] + t[j, l] ≤ t[i, l]. Let t_m[i, j] = f_{i,j} + t[i, m-1] + t[m, j] and let m[i, j] = max{ m : t_m[i, j] = t[i, j] }. If m[i, l] ≤ j, then we have
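The practical form of this speed-up is usually stated as a monotonicity property of the optimal roots, m[i, j-1] ≤ m[i, j] ≤ m[i+1, j], which allows the inner loop of OPTIMAL-BST to be restricted to that interval; the restricted ranges telescope to O(n^2) total work. The sketch below applies this restriction (this is the standard Knuth-Yao technique, stated here as an assumption rather than derived from the notes' proof):

```python
import math

def optimal_bst_fast(p, q, n):
    """Knuth-Yao speed-up: when filling c[i][j], search for the root only in
    the interval [m[i][j-1], m[i+1][j]]. Total table work is O(n^2)."""
    c = [[0.0] * (n + 2) for _ in range(n + 2)]
    w = [[0.0] * (n + 2) for _ in range(n + 2)]
    m = [[0] * (n + 2) for _ in range(n + 2)]
    for i in range(1, n + 1):
        c[i][i] = w[i][i] = q[i - 1] + p[i] + q[i]
        m[i][i] = i
    for l in range(1, n):
        for i in range(1, n - l + 1):
            j = i + l
            w[i][j] = w[i][j - 1] + p[j] + q[j]
            c[i][j] = math.inf
            # Monotonicity of the optimal roots bounds the search range.
            for r in range(m[i][j - 1], m[i + 1][j] + 1):
                t = c[i][r - 1] + c[r + 1][j] + w[i][j]
                if t < c[i][j]:
                    c[i][j] = t
                    m[i][j] = r
    return c, m

p = [None, 0.15, 0.10, 0.05, 0.10, 0.20]
q = [0.05, 0.10, 0.05, 0.05, 0.05, 0.10]
c, m = optimal_bst_fast(p, q, 5)
print(round(c[1][5], 2))   # 2.35, as with the O(n^3) version
```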

Bitonic Tours (Problem 15-1, page 364)

The Euclidean Traveling Salesperson Problem (ETSP): determine the shortest closed tour that connects a given set of n points in the plane. A bitonic tour is a tour that starts at the leftmost point, goes left to right to the rightmost point, and then returns right to left back to the starting point. See the figures below for a tour and a bitonic tour.

[Figure: two tours on the same point set, captioned "Tour" and "Bitonic Tour".]

The problem is to find the shortest bitonic tour by using dynamic programming. Assume that no two points have the same x-coordinate.

Planning a Company Party (Problem 15-4)

Edit Distance (Problem 15-3, simplified)

Partitioning Problem:

INPUT: An ordered set S[1, 2, ..., n] and an integer p.
OUTPUT: A partition of S into p (or fewer) subsets of consecutive elements of S that minimizes the maximum sum over these subsets. That is, choosing split points k_1 < k_2 < ... < k_p (with k_p = n), we minimize

    max{ \sum_{i=1}^{k_1} S[i], \sum_{i=k_1+1}^{k_2} S[i], ..., \sum_{i=k_{p-1}+1}^{k_p} S[i] }.

Example: S = [1, 2, 3, 4, 5, 6, 7, 8, 9] and p = 3. The partition [1, 2, 3, 4, 5], [6, 7], [8, 9] has cost max{15, 13, 17} = 17, and we want this maximum to be as small as possible. (A DP sketch for this problem is given below, after the binomial-coefficient example.)

Calculating Binomial Coefficients:

Recall that

    \binom{n}{k} = \frac{n(n-1)\cdots(n-k+1)}{k!} = \frac{n!}{k!(n-k)!}.

We use the relation

    \binom{n}{k} = \binom{n-1}{k-1} + \binom{n-1}{k}

to calculate these numbers.
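Here is a hedged Python sketch of one standard DP for the partitioning problem (the recurrence over the last block's starting point is my formulation; the notes do not spell it out):

```python
import math

def linear_partition(S, p):
    """M[j][k] = smallest possible largest-block sum when S[1..j] is split into
    k consecutive blocks; the answer for the whole problem is M[n][p]."""
    n = len(S)
    prefix = [0] * (n + 1)                       # prefix[j] = S[1] + ... + S[j]
    for j in range(1, n + 1):
        prefix[j] = prefix[j - 1] + S[j - 1]
    M = [[math.inf] * (p + 1) for _ in range(n + 1)]
    M[0][0] = 0
    for j in range(1, n + 1):
        for k in range(1, p + 1):
            for x in range(k - 1, j):            # last block is S[x+1 .. j]
                M[j][k] = min(M[j][k], max(M[x][k - 1], prefix[j] - prefix[x]))
    return M[n][p]

print(linear_partition([1, 2, 3, 4, 5, 6, 7, 8, 9], 3))   # 17, matching the example
```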

Shortest Path Problems (Chapter 25 & 26)

The last set will be covered in Graph Algorithms. Since much of this material is directly from the book, there are no notes at this time.