CSE 101. Algorithm Design and Analysis. Miles Jones, mej016@eng.ucsd.edu, Office 4208 CSE Building. Lecture 20: Dynamic Programming


DYNAMIC PROGRAMMING Dynamic programming is an algorithmic paradigm in which a problem is solved by identifying a collection of subproblems and tackling them one by one, smallest first, using the answers to small problems to help figure out larger ones, until they are all solved. Examples: findmax, findmin, fib2,

DYNAMIC PROGRAMMING (SHORTEST PATH IN DAGS) The shortest distance from D to another node x will be denoted dist(x). Notice that the shortest distance from D to C is dist(C) = min(dist(E) + 5, dist(B) + 2). This kind of relation can be written for every node. Since it's a DAG, the edges only go to the right, so by the time we get to node x, we have all the information needed!

DYNAMIC PROGRAMMING (SHORTEST PATH IN DAGS)
Step 1: Define the subproblems: dist(v), the shortest distance to vertex v.
Step 2: Base case: the distance from the start vertex to itself is 0.
Step 3: Express recursively: dist(v) = min over edges (u,v) in E of dist(u) + l(u,v).
Step 4: Order the subproblems: linearized (topological) order.

DYNAMIC PROGRAMMING (SHORTEST PATH IN DAGS)
initialize all dist(.) values to infinity
dist(s) := 0
for each v in V \ {s} in linearized order:
    dist(v) = min over (u,v) in E of dist(u) + l(u,v)
Like D/C, this algorithm solves a family of subproblems. We start with dist(s) = 0 and we get to the larger subproblems in linearized order by using the smaller subproblems.
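The pseudocode above can be sketched as a runnable Python function. This is a minimal sketch, not from the slides: the function name, the representation of the graph as a dict of incoming edges, and the tiny example graph are all my own choices for illustration.

```python
import math

def dag_shortest_paths(order, incoming, s):
    """Shortest distances from s in a DAG.

    order    -- the vertices in linearized (topological) order
    incoming -- dict mapping a vertex v to a list of (u, length)
                pairs, one per edge (u, v)
    s        -- the source vertex
    """
    dist = {v: math.inf for v in order}   # initialize all dist(.) to infinity
    dist[s] = 0
    for v in order:                        # linearized order
        if v == s:
            continue
        for u, length in incoming.get(v, []):
            dist[v] = min(dist[v], dist[u] + length)
    return dist
```

By the time the loop reaches v, every predecessor u already has its final dist(u), which is exactly why the linearized order makes the single pass correct.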


DP (LONGEST INCREASING SUBSEQUENCE) Given a sequence of distinct positive integers a[1], ..., a[n]. An increasing subsequence is a sequence a[i_1], ..., a[i_k] such that i_1 < ... < i_k and a[i_1] < ... < a[i_k]. For example, in 15, 18, 8, 11, 5, 12, 16, 2, 20, 9, 10, 4, the sequence 5, 16, 20 is an increasing subsequence. How long is the longest increasing subsequence?

DP (LONGEST INCREASING SUBSEQUENCE) Let's make a DAG out of our example: 15 18 8 11 5 12 16 2 20 9 10 4. Now, instead of finding the longest increasing subsequence of a list of integers, we are finding the longest path in a DAG!
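The DAG can be built mechanically: draw an edge from position i to position j whenever i < j and a[i] < a[j]. A minimal sketch (the function name is mine, not from the slides):

```python
def lis_dag_edges(a):
    """Edges of the LIS DAG: (i, j) whenever i < j and a[i] < a[j]."""
    return [(i, j) for j in range(len(a)) for i in range(j) if a[i] < a[j]]
```

On the slide's sequence, (0, 1) is an edge because 15 < 18, while (0, 2) is not because 15 > 8.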

DYNAMIC PROGRAMMING (LONGEST INCREASING SUBSEQUENCE)
Step 1: Define the subproblems: L(k) will be the length of the longest increasing subsequence ending exactly at position k.
Step 2: Base case: L(k) = 1 when position k has no incoming edges (a lone element is an increasing subsequence of length 1).
Step 3: Express recursively: L(k) = 1 + max({L(i) : (i,k) is an edge}).
Step 4: Order the subproblems: from left to right.

DP (LONGEST INCREASING SUBSEQUENCE) Finding the longest path in the DAG:
for j = 1 ... n:
    L[j] = 1 + max({L[i] : (i,j) is an edge}), taking the max of an empty set to be 0
    prev(j) = the i that achieves the max
return max({L[j]})


DP (LONGEST INCREASING SUBSEQUENCE) How long does this take? To solve L[j] = 1 + max({L[i] : (i,j) is an edge}), we need to know L[i] for each edge (i,j) in E. The work at vertex j is therefore proportional to its indegree d_in(j). Summing over all vertices, the sum of d_in(j) over j in V equals |E|, so the runtime is O(|E|).
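The pseudocode can be written out as a runnable sketch. One convention note: here L[j] counts elements, so a lone element gets L = 1, matching the recurrence L(k) = 1 + max(...); the edges of the DAG are checked implicitly with the comparison a[i] < a[j] rather than built explicitly.

```python
def longest_increasing_subsequence(a):
    """Length of the longest increasing subsequence of a, in O(n^2) time."""
    n = len(a)
    if n == 0:
        return 0
    L = [1] * n                      # L[j]: LIS length ending exactly at j
    for j in range(n):
        for i in range(j):
            if a[i] < a[j]:          # edge (i, j) of the implicit DAG
                L[j] = max(L[j], L[i] + 1)
    return max(L)
```

On the slide's example the answer is 5, achieved by 8, 11, 12, 16, 20.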

DP (LONGEST INCREASING SUBSEQUENCE) The runtime depends on the number of edges in the DAG. What are the maximum and minimum numbers of edges? If the sequence is increasing, e.g. 1 2 3 4 5, then every pair (i,j) with i < j is an edge, so |E| = n(n-1)/2. If the sequence is decreasing, e.g. 10 9 8 7 6, then |E| = 0.

DP (LONGEST INCREASING SUBSEQUENCE) What is the expected number of edges?
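The slides leave this as a question. One way to reason about it: if the input is a uniformly random permutation, each of the n(n-1)/2 pairs (i, j) with i < j satisfies a[i] < a[j] with probability 1/2, so by linearity of expectation the expected number of edges is n(n-1)/4. A small simulation sketch to check this (function names and the trial count are my own choices):

```python
import itertools
import random

def edge_count(a):
    """Number of edges in the LIS DAG of sequence a."""
    return sum(1 for i, j in itertools.combinations(range(len(a)), 2)
               if a[i] < a[j])

def average_edge_count(n, trials=2000, seed=0):
    """Average edge count over random permutations of 0..n-1."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        a = list(range(n))
        rng.shuffle(a)
        total += edge_count(a)
    return total / trials
```

For n = 8 the prediction is 8 * 7 / 4 = 14, and the simulated average lands close to it.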

DP (STRING RECONSTRUCTION) Given a string of letters with no spaces or punctuation, how would you figure out how to separate the words? Example: THESEARETHERULES. Greedy approach: find the first real word, remove it from the string, and repeat on the remaining string. If that doesn't work, backtrack and try a different split.

DP (STRING RECONSTRUCTION) THESEARETHERULES

DYNAMIC PROGRAMMING (STRING RECONSTRUCTION) Step 1: Define the subproblems: Step 2: Base case: Step 3: Express recursively: Step 4: Order the subproblems
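The slide leaves the four steps blank. A standard way to fill them in, sketched here as an assumption rather than the lecture's own answer: the subproblem ok(k) asks whether the first k characters can be split into words; the base case is ok(0) = true (the empty prefix); the recurrence is ok(k) = OR over j < k of (ok(j) and s[j..k] is a word); and the subproblems are ordered left to right. The predicate `is_word` stands in for a dictionary lookup and is not part of the slides.

```python
def reconstruct(s, is_word):
    """Split s into dictionary words, or return None if impossible.

    is_word -- predicate deciding whether a substring is a valid word
               (assumed; supplied by the caller).
    """
    n = len(s)
    ok = [False] * (n + 1)
    split_at = [None] * (n + 1)      # split_at[k]: start of the last word in s[:k]
    ok[0] = True                     # base case: the empty prefix
    for k in range(1, n + 1):        # subproblems in left-to-right order
        for j in range(k):
            if ok[j] and is_word(s[j:k]):
                ok[k], split_at[k] = True, j
                break
    if not ok[n]:
        return None
    words, k = [], n                 # walk split_at backwards to recover the words
    while k > 0:
        words.append(s[split_at[k]:k])
        k = split_at[k]
    return list(reversed(words))
```

With the dictionary {THESE, ARE, THE, RULES}, the slide's example THESEARETHERULES splits back into its four words.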