CS 470/570 Divide-and-Conquer

Format of divide-and-conquer algorithms:
  Divide:  split the array or list into smaller pieces
  Conquer: solve the same problem recursively on the smaller pieces
  Combine: build the full solution from the recursive results

Master Recurrence Theorem (simpler version)

If T(n) = a T(n/b) + Θ(n^k), then the solution is:
  T(n) = Θ(n^k)          if log_b a < k  (i.e., a < b^k)
  T(n) = Θ(n^k lg n)     if log_b a = k  (i.e., a = b^k)
  T(n) = Θ(n^(log_b a))  if log_b a > k  (i.e., a > b^k)

Proof of Master Recurrence Theorem (simpler version):

T(n) = a T(n/b) + c n^k
     = a [a T(n/b^2) + c (n/b)^k] + c n^k
     = a^2 T(n/b^2) + c n^k (1 + a/b^k)

     = a^2 [a T(n/b^3) + c (n/b^2)^k] + c n^k (1 + a/b^k)
     = a^3 T(n/b^3) + c n^k (1 + a/b^k + (a/b^k)^2)
     = ...
     = a^t T(n/b^t) + c n^k Σ_{0 ≤ j ≤ t-1} (a/b^k)^j

[Let n/b^t = 1, so t = log_b n, and a^t = a^(log_b n) = n^(log_b a).]

     = n^(log_b a) T(1) + c n^k Σ_{0 ≤ j ≤ t-1} (a/b^k)^j

Consider S = Σ_{0 ≤ j ≤ t-1} (a/b^k)^j, a geometric series with ratio r = a/b^k.
Three cases:

If r < 1 (decreasing series), then a < b^k and log_b a < k;
  S = Θ(first term) = Θ((a/b^k)^0) = Θ(1), and
  T(n) = Θ(n^(log_b a) + n^k · 1) = Θ(n^k).

If r = 1, then a = b^k and log_b a = k;
  S = Θ(number of terms) = Θ(t) = Θ(log_b n) = Θ(lg n), and
  T(n) = Θ(n^(log_b a) + n^k lg n) = Θ(n^k lg n).

If r > 1 (increasing series), then a > b^k and log_b a > k;
  S = Θ(last term) = Θ((a/b^k)^(t-1)) = Θ((a/b^k)^t) = Θ((a/b^k)^(log_b n))
    = Θ(n^(log_b (a/b^k))) = Θ(n^(log_b a - k)), and
  T(n) = Θ(n^(log_b a) + n^k · n^(log_b a - k)) = Θ(n^(log_b a)).

Examples:

Merge sort: T(n) = 2 T(n/2) + Θ(n)
  a=2, b=2, k=1; log_2 2 = 1 = k
  So T(n) = Θ(n^k lg n) = Θ(n lg n)

Binary search: T(n) = T(n/2) + Θ(1)
  a=1, b=2, k=0; log_2 1 = 0 = k
  So T(n) = Θ(n^k lg n) = Θ(lg n)
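The three cases are easy to mechanize. Here is a small Python helper (our own illustration; the function name master_case is not from the notes) that classifies a recurrence T(n) = a T(n/b) + Θ(n^k) and reports its Θ solution as a string:

```python
import math

def master_case(a, b, k):
    """Classify T(n) = a*T(n/b) + Theta(n^k) by the simpler master theorem.

    Compares a against b^k (equivalently log_b a against k) and returns
    a string describing the Theta solution.
    """
    if a < b ** k:            # log_b a < k: the root term dominates
        return f"Theta(n^{k})"
    elif a == b ** k:         # log_b a = k: all levels contribute equally
        return f"Theta(n^{k} lg n)"
    else:                     # log_b a > k: the leaves dominate
        return f"Theta(n^{math.log(a, b):.2f})"
```

For example, master_case(2, 2, 1) reports the merge-sort bound Θ(n lg n), and master_case(7, 2, 2) reports Strassen's Θ(n^2.81).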

But what if the recurrence is not in the form of the master recurrence?

Example: T(n) = a T(n^(1/b)) + lg^k n

Change of variable: let n = 2^m and m = lg n.
  T(2^m) = a T(2^(m/b)) + m^k
Now let S(m) = T(2^m) = T(n):
  S(m) = a S(m/b) + m^k
By the master recurrence theorem:
  S(m) = Θ(m^k)          if a < b^k
  S(m) = Θ(m^k lg m)     if a = b^k
  S(m) = Θ(m^(log_b a))  if a > b^k
Translating back with m = lg n:
  T(n) = Θ((lg n)^k)           if a < b^k
  T(n) = Θ((lg n)^k lg lg n)   if a = b^k
  T(n) = Θ((lg n)^(log_b a))   if a > b^k
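As a worked instance of this substitution (our own example, not from the notes), take a = 2, b = 2, k = 1:

```latex
T(n) = 2\,T(\sqrt{n}) + \lg n \qquad (a = 2,\ b = 2,\ k = 1)
\text{Let } n = 2^m:\quad S(m) = T(2^m) = 2\,S(m/2) + m
a = b^k \;\Rightarrow\; S(m) = \Theta(m \lg m)
\;\Rightarrow\; T(n) = \Theta(\lg n \cdot \lg\lg n)
```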

Example: T(n) = a T(n - d) + c^n

Change of variable: let n = lg m and m = 2^n.
  T(lg m) = a T(lg m - lg 2^d) + c^(lg m) = a T(lg (m/2^d)) + m^(lg c)
Let S(m) = T(lg m) = T(n):
  S(m) = a S(m/2^d) + m^(lg c)
Let b = 2^d (so d = lg b) and k = lg c (so c = 2^k):
  S(m) = a S(m/b) + m^k
By the master recurrence theorem:
  S(m) = Θ(m^k)          if a < b^k
  S(m) = Θ(m^k lg m)     if a = b^k
  S(m) = Θ(m^(log_b a))  if a > b^k
Note b^k = 2^(dk) = c^d. Also m^k = 2^(nk) = c^n, m^k lg m = c^n · n, and
m^(log_b a) = a^(log_b m) = a^(lg m / lg b) = a^(n/d).
Therefore:
  T(n) = Θ(c^n)       if a < c^d
  T(n) = Θ(n c^n)     if a = c^d
  T(n) = Θ(a^(n/d))   if a > c^d
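A worked instance of this second substitution (again our own example): a = 2, d = 1, c = 3.

```latex
T(n) = 2\,T(n-1) + 3^n \qquad (a = 2,\ d = 1,\ c = 3)
\text{Let } n = \lg m:\quad S(m) = T(\lg m) = 2\,S(m/2) + m^{\lg 3}
\text{Here } b = 2^d = 2,\quad b^k = c^d = 3 > a = 2
\;\Rightarrow\; S(m) = \Theta(m^{\lg 3}) \;\Rightarrow\; T(n) = \Theta(c^n) = \Theta(3^n)
```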

Divide-and-conquer algorithms

Polynomial multiplication problem

Given two polynomials:
  P = 5x^3 + 7x^2 + 6x + 2     [5, 7, 6, 2] = [p_3, p_2, p_1, p_0]
  Q = x^3 - 8x^2 + 9x - 1      [1, -8, 9, -1] = [q_3, q_2, q_1, q_0]
  P*Q = (5x^3 + 7x^2 + 6x + 2)(x^3 - 8x^2 + 9x - 1)

Divide-and-conquer Algorithm 1 for polynomial multiplication:
  P = 5x^3 + 7x^2 + 6x + 2 = (5x + 7)x^2 + (6x + 2) = A x^2 + B
  Q = x^3 - 8x^2 + 9x - 1 = (x - 8)x^2 + (9x - 1) = C x^2 + D
  A = 5x + 7    [5, 7]
  B = 6x + 2    [6, 2]
  C = x - 8     [1, -8]
  D = 9x - 1    [9, -1]
Let n = the number of terms in P and Q.
  P = A x^(n/2) + B
  Q = C x^(n/2) + D
where A, B, C, D are polynomials with n/2 terms each.
  P*Q = (A x^(n/2) + B)(C x^(n/2) + D) = (AC) x^n + (BC + AD) x^(n/2) + (BD)

Four recursive subproblems: A*C, B*C, A*D, B*D
Stop the recursion when n = 1.

Analysis of Algorithm 1 for polynomial multiplication:
  T(n) = 4 T(n/2) + Θ(n)
  a=4, b=2, k=1; log_2 4 = 2 > 1
  So T(n) = Θ(n^(log_b a)) = Θ(n^2)
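A runnable sketch of Algorithm 1 (our own Python rendering; the notes give only the recurrence). Coefficient lists are stored highest degree first, and n is assumed to be a power of two:

```python
def poly_mult4(p, q):
    """Multiply two polynomials by four recursive half-size products.

    p and q are equal-length coefficient lists, highest degree first;
    len(p) == len(q) == n, with n a power of two. Returns the 2n-1
    coefficients of p*q, highest degree first.
    """
    n = len(p)
    if n == 1:
        return [p[0] * q[0]]
    h = n // 2
    A, B = p[:h], p[h:]          # P = A x^(n/2) + B
    C, D = q[:h], q[h:]          # Q = C x^(n/2) + D
    AC = poly_mult4(A, C)
    BC = poly_mult4(B, C)
    AD = poly_mult4(A, D)
    BD = poly_mult4(B, D)
    # P*Q = AC x^n + (BC + AD) x^(n/2) + BD
    result = [0] * (2 * n - 1)
    for i, c in enumerate(AC):
        result[i] += c           # AC contributes starting at x^(2n-2)
    for i, c in enumerate(BC):
        result[i + h] += c       # middle terms shifted by x^(n/2)
    for i, c in enumerate(AD):
        result[i + h] += c
    for i, c in enumerate(BD):
        result[i + n] += c       # BD fills the low-order coefficients
    return result
```

On the example above, poly_mult4([5, 7, 6, 2], [1, -8, 9, -1]) yields [5, -33, -5, 12, 31, 12, -2], the coefficients of P*Q.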

Divide-and-conquer Algorithm 2 (Karatsuba's algorithm) for polynomial multiplication:

As before, P*Q = (A x^(n/2) + B)(C x^(n/2) + D) = (AC) x^n + (BC + AD) x^(n/2) + (BD).
This time we write (BC + AD) = (A+B)(C+D) - (AC) - (BD),
so only three recursive calls are needed: A*C, B*D, (A+B)*(C+D).
Again stop the recursion when n = 1.

Analysis of Algorithm 2 for polynomial multiplication:
  T(n) = 3 T(n/2) + Θ(n)
  a=3, b=2, k=1; log_2 3 ≈ 1.58 > 1
  So T(n) = Θ(n^(log_b a)) = Θ(n^(lg 3)) ≈ Θ(n^1.58)
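The same sketch with Karatsuba's three-product trick (our own Python rendering, same conventions as before: coefficients highest degree first, n a power of two):

```python
def karatsuba(p, q):
    """Karatsuba polynomial multiplication: three half-size products.

    p and q are equal-length coefficient lists, highest degree first;
    len(p) == len(q) == n, with n a power of two.
    """
    n = len(p)
    if n == 1:
        return [p[0] * q[0]]
    h = n // 2
    A, B = p[:h], p[h:]
    C, D = q[:h], q[h:]
    AC = karatsuba(A, C)
    BD = karatsuba(B, D)
    ABCD = karatsuba([x + y for x, y in zip(A, B)],
                     [x + y for x, y in zip(C, D)])
    # BC + AD = (A+B)(C+D) - AC - BD
    mid = [m - ac - bd for m, ac, bd in zip(ABCD, AC, BD)]
    result = [0] * (2 * n - 1)
    for i, c in enumerate(AC):
        result[i] += c
    for i, c in enumerate(mid):
        result[i + h] += c
    for i, c in enumerate(BD):
        result[i + n] += c
    return result
```

It returns the same coefficients as the four-product version, but with recurrence T(n) = 3 T(n/2) + Θ(n) instead of 4 T(n/2) + Θ(n).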

Matrix multiplication problem

If A and B are square n-by-n matrices, then the product C = A*B is also a square n-by-n matrix.

Example with n=3:

  [1 2 3]   [10 11 12]   [1·10 + 2·13 + 3·16   ...          ...       ]
  [4 5 6] * [13 14 15] = [...                   ...          ...       ]
  [7 8 9]   [16 17 18]   [...                   ...   7·12 + 8·15 + 9·18]

Each element C[i][j] = Σ_{1 ≤ k ≤ n} A[i][k]*B[k][j].

The above formula leads to the following naive algorithm for computing the C matrix in Θ(n^3) total time:

  for i = 1 to n
    for j = 1 to n {
      C[i][j] = 0;
      for k = 1 to n
        C[i][j] += A[i][k]*B[k][j];
    }
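The triple loop translates directly into Python (a sketch of our own, 0-indexed rather than 1-indexed, with matrices as lists of lists):

```python
def mat_mult(A, B):
    """Naive Theta(n^3) matrix multiplication: C[i][j] = sum of
    A[i][k] * B[k][j] over k, as in the pseudocode above."""
    n = len(A)
    C = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            for k in range(n):
                C[i][j] += A[i][k] * B[k][j]
    return C
```

On the n=3 example above, the corner entries come out as 1·10 + 2·13 + 3·16 = 84 and 7·12 + 8·15 + 9·18 = 366.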

Divide-and-conquer Algorithm 1 for matrix multiplication:

Assume n is even (otherwise, add an extra row and an extra column of all 0s).
Stop the recursion when n = 1.

If A = [a_{1,1} ... a_{1,n}; ...; a_{n,1} ... a_{n,n}] and
   B = [b_{1,1} ... b_{1,n}; ...; b_{n,1} ... b_{n,n}],
then write

  A = [S T]   and   B = [W X]
      [U V]             [Y Z]

where each of S, T, U, V, W, X, Y, Z is a square (n/2)-by-(n/2) matrix.
For example, S = [a_{1,1} ... a_{1,n/2}; ...; a_{n/2,1} ... a_{n/2,n/2}].

  C = A*B = [S T] * [W X] = [SW + TY   SX + TZ]
            [U V]   [Y Z]   [UW + VY   UX + VZ]

Eight recursive calls: S*W, T*Y, S*X, T*Z, U*W, V*Y, U*X, V*Z

Analysis of Algorithm 1 for matrix multiplication:
  T(n) = 8 T(n/2) + Θ(n^2)
  a=8, b=2, k=2; log_2 8 = 3 > 2
  So T(n) = Θ(n^3), the same running time as the previous naive algorithm.

Divide-and-conquer Algorithm 2 (Strassen's algorithm) for matrix multiplication:

Again we write A = [S T]   and   B = [W X]
                   [U V]             [Y Z]

Seven recursive calls:
  P1 = S * (X - Z)
  P2 = (S + T) * Z
  P3 = (U + V) * W
  P4 = V * (Y - W)
  P5 = (S + V) * (W + Z)
  P6 = (T - V) * (Y + Z)
  P7 = (U - S) * (W + X)

  C = [P5 + P4 - P2 + P6    P1 + P2          ]
      [P3 + P4              P5 + P1 - P3 + P7]

Difficult to derive, but easy to verify the algebra:
  P1 + P2 = S*X + T*Z
  P3 + P4 = U*W + V*Y
  etc.

Analysis of Algorithm 2 for matrix multiplication:
  T(n) = 7 T(n/2) + Θ(n^2)
  a=7, b=2, k=2; log_2 7 ≈ 2.81 > 2
  So T(n) = Θ(n^(lg 7)) ≈ Θ(n^2.81)
The current best known algorithm for matrix multiplication takes about Θ(n^2.37) time, obtained by further extending this divide-and-conquer approach, but the formulas are too complicated to present here.
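Strassen's seven products can be sketched in Python with plain lists of lists (our own rendering of the S,T,U,V / W,X,Y,Z split above; n must be a power of two, and a practical implementation would cut over to the naive method for small n instead of recursing down to 1-by-1):

```python
def strassen(A, B):
    """Strassen's algorithm on n-by-n matrices (lists of lists),
    n a power of two, using the seven products P1..P7 from the notes."""
    n = len(A)
    if n == 1:
        return [[A[0][0] * B[0][0]]]
    h = n // 2

    def quad(M):  # split M into four h-by-h quadrants
        return ([row[:h] for row in M[:h]], [row[h:] for row in M[:h]],
                [row[:h] for row in M[h:]], [row[h:] for row in M[h:]])

    def add(M, N):
        return [[m + x for m, x in zip(r, s)] for r, s in zip(M, N)]

    def sub(M, N):
        return [[m - x for m, x in zip(r, s)] for r, s in zip(M, N)]

    S, T, U, V = quad(A)
    W, X, Y, Z = quad(B)
    P1 = strassen(S, sub(X, Z))
    P2 = strassen(add(S, T), Z)
    P3 = strassen(add(U, V), W)
    P4 = strassen(V, sub(Y, W))
    P5 = strassen(add(S, V), add(W, Z))
    P6 = strassen(sub(T, V), add(Y, Z))
    P7 = strassen(sub(U, S), add(W, X))
    C11 = add(sub(add(P5, P4), P2), P6)   # P5 + P4 - P2 + P6
    C12 = add(P1, P2)                     # P1 + P2
    C21 = add(P3, P4)                     # P3 + P4
    C22 = add(sub(add(P5, P1), P3), P7)   # P5 + P1 - P3 + P7
    top = [r1 + r2 for r1, r2 in zip(C11, C12)]
    bottom = [r1 + r2 for r1, r2 in zip(C21, C22)]
    return top + bottom
```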

Majority element problem: given an array A of size n, does there exist any value M that appears more than n/2 times in array A?

Majority element algorithm:
  Phase 1: use divide-and-conquer to find a candidate value M
  Phase 2: check whether M really is a majority element (Θ(n) time, a simple loop)

Phase 1 details:
  Divide:
    o Group the elements of A into n/2 pairs
    o If n is odd, there is one unpaired element, x:
        Check whether x is a majority element of A.
        If so, return x; otherwise discard x.
    o Compare each pair (y, z):
        If y == z, keep y and discard z.
        If y != z, discard both y and z.
    o So we keep at most n/2 elements
  Conquer: one recursive call on the kept subarray of size at most n/2
  Combine: nothing remains to be done, so omit this step

Example: A = [7, 7, 5, 2, 5, 5, 4, 5, 5, 5, 7]
  Pairs: (7, 7) (5, 2) (5, 5) (4, 5) (5, 5), unpaired: (7)
  Kept: A' = [7, 5, 5]
  Pairs: (7, 5), unpaired: (5)
  Return 5 (candidate, and also a majority element)

Example: A = [1, 2, 3, 1, 2, 3, 1, 2, 9, 9]
  Pairs: (1, 2) (3, 1) (2, 3) (1, 2) (9, 9)
  Kept: A' = [9]
  Return 9 (candidate, but not a majority element)

Analysis: let T(n) = the running time of Phase 1 on an array of size n.
  T(n) = T(n/2) + Θ(n)
  Number of recursive subproblems = 1
  Size of each subproblem = n/2 [worst case]
  Time for all the non-recursive steps = Θ(n)
Solution: a=1, b=2, k=1; log_2 1 = 0 < 1
  So T(n) = Θ(n^k) = Θ(n)
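Both phases can be sketched together in Python (our own rendering of the pairing scheme; the notes give no code):

```python
def majority(A):
    """Return the majority element of A (a value appearing more than
    len(A)//2 times), or None if there is none. Phase 1 is the pairing
    divide-and-conquer; Phase 2 verifies the candidate with one scan."""
    def candidate(B):
        if not B:
            return None
        if len(B) == 1:
            return B[0]
        if len(B) % 2 == 1:            # odd: peel off the unpaired element x
            x = B[-1]
            if B.count(x) > len(B) // 2:
                return x               # x is already a majority here
            B = B[:-1]                 # otherwise discard x
        # keep one element of each equal pair, discard unequal pairs
        kept = [B[i] for i in range(0, len(B), 2) if B[i] == B[i + 1]]
        return candidate(kept)

    M = candidate(A)                   # Phase 1: find a candidate
    if M is not None and A.count(M) > len(A) // 2:
        return M                       # Phase 2: verify in Theta(n) time
    return None
```

On the two examples above, it returns 5 for the first array and None for the second (9 is only a candidate, not a majority).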

Selection problem: given an array A[1..n] and a value k, where 1 ≤ k ≤ n, find and return the k-th smallest element in A.
  If k = 1: the minimum element
  If k = n: the maximum element
  If k = (1+n)/2: the median element

Algorithm 1 for the selection problem:
  sort A into ascending order;
  return A[k];
Algorithm 1 takes Θ(n lg n) time using merge sort or heap sort. Can we develop a faster algorithm?
Note: without sorting, we can determine the minimum or maximum element in Θ(n) time. [How?]
Goal: solve the selection problem for arbitrary k in Θ(n) time.
Idea: divide-and-conquer; use a pivot as in quick sort, but make a recursive call on only one of the two subarrays, because we don't need to completely sort the entire array.

Algorithm 2 for the selection problem:

Choose a pivot element that is hopefully near the median of A. Recall these strategies for choosing the pivot in quick sort:
  pivot = A[low]
  pivot = A[high]
  pivot = A[(low+high)/2]
  pivot = A[random(low, high)]
  pivot = median(A[random(low, high)], A[random(low, high)], A[random(low, high)])
  pivot = median(A[low], A[(low+high)/2], A[high])

Select (A, k) {
  pivot = ...;  // choose any of the above strategies
  create three empty lists: L, E, G;
  for each x in A
    if (x < pivot) add x to L;
    else if (x == pivot) add x to E;
    else /* (x > pivot) */ add x to G;
  if (k <= L.size) return Select (L, k);
  else if (k <= L.size + E.size) return pivot;
  else return Select (G, k - L.size - E.size);
}
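The Select pseudocode above maps almost line for line onto Python (our own sketch, using the random-pivot strategy):

```python
import random

def select(A, k):
    """Return the k-th smallest element of A (1-indexed), using a
    randomly chosen pivot and a three-way partition into L, E, G."""
    pivot = random.choice(A)               # one of the strategies above
    L = [x for x in A if x < pivot]        # elements less than the pivot
    E = [x for x in A if x == pivot]       # elements equal to the pivot
    G = [x for x in A if x > pivot]        # elements greater than the pivot
    if k <= len(L):
        return select(L, k)                # answer lies in L
    elif k <= len(L) + len(E):
        return pivot                       # answer is the pivot itself
    else:
        return select(G, k - len(L) - len(E))
```

Only one of the two possible recursive calls is made, which is what makes the average case Θ(n) rather than Θ(n lg n).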

Analysis of Algorithm 2 for the selection problem:
  Best case: Θ(n), if the pivot happens to be the k-th smallest element
  Worst case: Θ(n^2), if the pivot is always near the minimum or maximum value
  Average case: Θ(n); sometimes the pivot yields a good split and sometimes a bad split, weighted by the probabilities

Recurrence for the worst case:
  T(n) = T(n-1) + Θ(n)  or  T(n) = T(n-2) + Θ(n)

Recurrence for the average case (assuming no duplicates):
  T(n) = (1/n) Σ_{1 ≤ k ≤ n} [ ((k-1)/n) T(k-1) + ((n-k)/n) T(n-k) ] + Θ(n)
This does not conform to the master recurrence theorem, so it is difficult to solve.

How can we achieve worst-case Θ(n) time for selection?
Note: if we could be lucky enough to always guess the median element as the pivot, then
  T(n) = T(n/2) + Θ(n), so T(n) = Θ(n).
So we want a new strategy for choosing a pivot that is always close to the median.

Algorithm 3 for the selection problem:

Same as Algorithm 2, except for a new pivot strategy:
  Choose an odd number g (later we'll see that g=5 is best)
  Partition the n elements into groups of size g each
    (so the number of groups is n/g)
  Find the median of each group
    (note: we can sort each group in Θ(g^2) = Θ(1) time)
  Let M = the list of all these group medians, so the size of M is n/g
  Find the median of M by calling Algorithm 3 recursively
    (note: because we can't afford to sort M)
  Let pivot = the median of M = Select (M, (1 + n/g)/2)
    (so the pivot is the median-of-medians)
Then continue the same as in Algorithm 2:
  create three empty lists: L, E, G;
  for each x in A
    if (x < pivot) add x to L;
    else if (x == pivot) add x to E;
    else /* (x > pivot) */ add x to G;
  if (k <= L.size) return Select (L, k);
  else if (k <= L.size + E.size) return pivot;
  else return Select (G, k - L.size - E.size);

Stop the recursion when n is below some threshold (such as n < 3g or n < g^2), and solve using Algorithm 1 or Algorithm 2.

Example: n=25, with g=5:
  A = [1, 14, 11, 15, 13, 23, 17, 4, 19, 6, 0, 10, 8, 3, 2, 9, 21, 12, 22, 16, 24, 18, 5, 20, 7]
To find the median of A, call Select (A, (1+25)/2) = Select (A, 13).
  Groups: [1, 14, 11, 15, 13], [23, 17, 4, 19, 6], [0, 10, 8, 3, 2], [9, 21, 12, 22, 16], [24, 18, 5, 20, 7]
  M = [13, 17, 3, 16, 18]
  pivot = Select (M, (1 + 25/5)/2) = Select ([13, 17, 3, 16, 18], 3) = 16
  L = [1, 14, 11, 15, 13, 4, 6, 0, 10, 8, 3, 2, 9, 12, 5, 7], L.size = 16
  E = [16], E.size = 1
  G = [23, 17, 19, 21, 22, 24, 18, 20], G.size = 8
  k=13 and k <= L.size, so return Select (L, 13) = 12.

Next suppose we call Select (A, 21) using the same array A. Almost everything proceeds exactly as above:
  k=21 and k > L.size + E.size, so return Select (G, 21 - 16 - 1) = Select (G, 4) = 20.
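Algorithm 3 fits in a short Python sketch (our own rendering; the cutoff n < g^2 and the fallback to sorting follow the threshold idea above):

```python
def mm_select(A, k, g=5):
    """Worst-case Theta(n) selection: the k-th smallest element of A
    (1-indexed), using a median-of-medians pivot with groups of size g."""
    if len(A) < g * g:                   # small input: fall back to sorting
        return sorted(A)[k - 1]
    groups = [A[i:i + g] for i in range(0, len(A), g)]
    M = [sorted(grp)[len(grp) // 2] for grp in groups]   # group medians
    pivot = mm_select(M, (1 + len(M)) // 2, g)           # median of medians
    L = [x for x in A if x < pivot]
    E = [x for x in A if x == pivot]
    G = [x for x in A if x > pivot]
    if k <= len(L):
        return mm_select(L, k, g)
    elif k <= len(L) + len(E):
        return pivot
    else:
        return mm_select(G, k - len(L) - len(E), g)
```

On the n=25 example above, mm_select(A, 13) computes the same pivot 16, recurses into L, and returns 12; mm_select(A, 21) recurses into G and returns 20.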

Analysis of Algorithm 3 for the selection problem:

Two recursive calls:
  pivot = Select (M, (1 + n/g)/2)
  only one of Select (L, k) or Select (G, k - L.size - E.size)
  T(n) = T(M.size) + T(max(L.size, G.size)) + Θ(n)
Recall M.size = n/g. What is an upper bound for L.size and G.size?
Note: the pivot is the median of M.
  So half of the n/g elements in M must be ≤ pivot.
  Half of the n/g groups have medians ≤ pivot.
  Each of these groups has at least g/2 elements ≤ pivot.
  Altogether, at least (1/2)(n/g)(g/2) = n/4 elements are ≤ pivot.
  All these n/4 elements are in L or E (so they're not in G).
  Therefore G.size ≤ 3n/4.
Analogously we can show that L.size ≤ 3n/4, so max(L.size, G.size) ≤ 3n/4.

Intuition: Select (A, n/4) ≤ pivot ≤ Select (A, 3n/4), so the pivot is closer to the median than it is to the min or max elements.

  T(n) = T(n/g) + T(3n/4) + Θ(n)

This does not conform to the master recurrence theorem, but we can solve it easily by another approach:
  T(n) = T(n/g) + T(3n/4) + cn, for some constant c > 0
Guess that T(n) = dn, for some other constant d > 0:
  dn = d(n/g) + d(3n/4) + cn
  d = d/g + 3d/4 + c
  d (1/4 - 1/g) = c
Note: we must have g > 4 for this equation to be solvable with d > 0, so choose group size g=5:
  d (1/4 - 1/5) = c
  d (1/20) = c
  d = 20c
So T(n) = dn = 20cn = Θ(n): Algorithm 3 is a worst-case Θ(n)-time algorithm.