Algorithms Chapter 4 Recurrences


Algorithms Chapter 4: Recurrences. Instructor: Ching-Chi Lin, Assistant Professor (林清池), chingchi.lin@gmail.com, Department of Computer Science and Engineering, National Taiwan Ocean University.

Outline
- The substitution method
- The recursion tree method
- The master method

The purpose of this chapter
When an algorithm contains a recursive call to itself, its running time can often be described by a recurrence. A recurrence is an equation or inequality that describes a function in terms of its value on smaller inputs. For example, the worst-case running time of merge sort satisfies

T(n) = Θ(1)             if n = 1,
T(n) = 2T(n/2) + Θ(n)   if n > 1.

The solution is T(n) = Θ(n lg n).
Three methods for solving recurrences:
- the substitution method
- the recursion tree method
- the master method
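The closed form can be checked numerically. The sketch below is only an illustration under assumed constants: the Θ(1) and Θ(n) terms are taken to be exactly 1 and n, and n is restricted to powers of 2. Under those assumptions the exact solution is n(lg n + 1), consistent with Θ(n lg n).

```python
def merge_sort_cost(n):
    """Evaluate T(n) = 2*T(n/2) + n with T(1) = 1, for n a power of 2.

    The Theta(1) and Theta(n) terms are taken as exactly 1 and n
    (an assumption made only for this illustration).
    """
    if n == 1:
        return 1
    return 2 * merge_sort_cost(n // 2) + n

# For n = 2^k the exact solution is n*(k + 1) = n*(lg n + 1).
for k in range(11):
    n = 1 << k
    assert merge_sort_cost(n) == n * (k + 1)
```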

Technicalities 1/2
The running time T(n) is only defined when n is an integer, since the size of the input is an integer for most algorithms. For example, the running time of merge sort is really

T(n) = Θ(1)                            if n = 1,
T(n) = T(⌈n/2⌉) + T(⌊n/2⌋) + Θ(n)      if n > 1.

Typically, we ignore the boundary conditions. Since the running time of an algorithm on a constant-sized input is a constant, we have T(n) = Θ(1) for sufficiently small n. Thus, we can rewrite the recurrence as T(n) = 2T(n/2) + Θ(n).

Technicalities 2/2
When we state and solve recurrences, we often omit floors, ceilings, and boundary conditions. We forge ahead without these details and later determine whether or not they matter. They usually don't, but it is important to know when they do.

The substitution method 1/3
The substitution method entails two steps:
1. Guess the form of the solution.
2. Use mathematical induction to find the constants and show that the solution works.
This method is powerful, but it can obviously be applied only in cases where it is easy to guess the form of the answer.

The substitution method 2/3
For example, determine an upper bound on the recurrence T(n) = 2T(⌊n/2⌋) + n, where T(1) = 1.
- Guess that the solution is T(n) = O(n lg n).
- Prove that there exist a positive constant c and an n₀ such that T(n) ≤ cn lg n for all n ≥ n₀.
Basis step:
- For n = 1, the bound requires T(1) ≤ c · 1 · lg 1 = 0, which is at odds with T(1) = 1.
- Since the recurrence does not depend directly on T(1) for n > 3, we can replace T(1) by T(2) = 4 and T(3) = 5 as the base cases.
- We need T(2) ≤ c · 2 lg 2 and T(3) ≤ c · 3 lg 3, which hold for any c ≥ 2.
- Thus, we can choose c = 2 and n₀ = 2.

The substitution method 3/3
Inductive step:
- Assume the bound holds for ⌊n/2⌋, that is, T(⌊n/2⌋) ≤ c⌊n/2⌋ lg⌊n/2⌋.
- Then

T(n) ≤ 2(c⌊n/2⌋ lg⌊n/2⌋) + n
     ≤ cn lg(n/2) + n
     = cn lg n − cn lg 2 + n
     = cn lg n − cn + n
     ≤ cn lg n,

where the last step holds as long as c ≥ 1.
- Hence there exist positive constants c = max{2, 1} = 2 and n₀ = 2 such that T(n) ≤ cn lg n for all n ≥ n₀.
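The proof can be sanity-checked by evaluating the recurrence directly. The sketch below assumes the base value T(1) = 1 and verifies the bound T(n) ≤ 2n lg n (with c = 2, n₀ = 2 from the proof) over a range of n:

```python
from functools import lru_cache
from math import log2

@lru_cache(maxsize=None)
def T(n):
    """T(n) = 2*T(floor(n/2)) + n, with T(1) = 1 (base value assumed)."""
    if n == 1:
        return 1
    return 2 * T(n // 2) + n

# The proof uses T(2) = 4 and T(3) = 5 as base cases, c = 2, n0 = 2.
assert T(2) == 4 and T(3) == 5
assert all(T(n) <= 2 * n * log2(n) for n in range(2, 5000))
```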

Making a good guess
Experience: T(n) = 2T(⌊n/2⌋ + 17) + n.
- When n is large, the difference between T(⌊n/2⌋) and T(⌊n/2⌋ + 17) is not that large: both cut n nearly evenly in half.
- We make the guess that T(n) = O(n lg n).
Loose upper and lower bounds: T(n) = 2T(⌊n/2⌋) + n.
- Prove a lower bound of T(n) = Ω(n) and an upper bound of T(n) = O(n²).
- Gradually lower the upper bound and raise the lower bound until we converge on the tight solution T(n) = Θ(n lg n).
Recursion trees: will be introduced later.

Subtleties
Consider the recurrence T(n) = T(⌊n/2⌋) + T(⌈n/2⌉) + 1.
- Guess that the solution is O(n), i.e., T(n) ≤ cn.
- Then T(n) ≤ c⌊n/2⌋ + c⌈n/2⌉ + 1 = cn + 1, which does not imply T(n) ≤ cn for any choice of c.
- New guess: T(n) ≤ cn − b.
- Then T(n) ≤ (c⌊n/2⌋ − b) + (c⌈n/2⌉ − b) + 1 = cn − 2b + 1 ≤ cn − b for b ≥ 1.
- Also, the constant c must be chosen large enough to handle the boundary conditions.
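The subtlety can be made concrete. Assuming the base value T(1) = 1 (not stated in the slide), the exact solution of this recurrence is T(n) = 2n − 1, so T(n) = O(n) really does hold; the failure was in the induction step, not in the guessed bound. The strengthened hypothesis T(n) ≤ cn − b with c = 2, b = 1 is tight:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """T(n) = T(floor(n/2)) + T(ceil(n/2)) + 1, with T(1) = 1 assumed."""
    if n == 1:
        return 1
    return T(n // 2) + T((n + 1) // 2) + 1

# Exact solution: T(n) = 2n - 1, i.e. T(n) <= cn - b with c = 2, b = 1.
assert all(T(n) == 2 * n - 1 for n in range(1, 2000))
```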

Avoiding pitfalls
It is easy to err in the use of asymptotic notation. For example, consider T(n) = 2T(⌊n/2⌋) + n.
- Guess that the solution is O(n), i.e., T(n) ≤ cn.
- Then T(n) ≤ 2(c⌊n/2⌋) + n ≤ cn + n = O(n). ⇐ wrong!
- The error is that we haven't proved the exact form of the inductive hypothesis, that is, that T(n) ≤ cn.

Changing variables
Sometimes a little algebraic manipulation can make an unknown recurrence similar to one you have seen before. For example, consider the recurrence T(n) = 2T(⌊√n⌋) + lg n.
- Renaming m = lg n yields T(2^m) = 2T(2^(m/2)) + m.
- Renaming S(m) = T(2^m) produces S(m) = 2S(m/2) + m, whose solution is S(m) = O(m lg m).
- Changing back from S(m) to T(n), we obtain T(n) = T(2^m) = S(m) = O(m lg m) = O(lg n lg lg n).
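The change of variables can be verified exactly on inputs of the form n = 2^(2^k), where both square roots and halvings stay exact. The base value T(2) = S(1) = 1 below is an assumption for illustration; under it, S(m) = m(lg m + 1) = O(m lg m):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """T(n) = 2*T(sqrt(n)) + lg n, on n = 2^(2^k); base T(2) = 1 assumed."""
    if n == 2:
        return 1
    m = n.bit_length() - 1        # m = lg n, exact since n is a power of two
    return 2 * T(1 << (m // 2)) + m   # 2^(m/2) = sqrt(n), exact for even m

@lru_cache(maxsize=None)
def S(m):
    """S(m) = 2*S(m/2) + m, on m = 2^k; S(1) = T(2) = 1."""
    if m == 1:
        return 1
    return 2 * S(m // 2) + m

# The substitution S(m) = T(2^m) is exact, and S(m) = m*(lg m + 1).
for k in range(6):
    m = 1 << k
    assert T(1 << m) == S(m) == m * (k + 1)
```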

Outline
- The substitution method
- The recursion tree method
- The master method

The recursion tree method
A recursion tree is best used to generate a good guess, which is then verified by the substitution method.
- Tolerating a small amount of "sloppiness", we can use a recursion tree to generate a good guess.
- One can also use a recursion tree as a direct proof of a solution to a recurrence.
Ideas:
- In a recursion tree, each node represents the cost of a single subproblem.
- Sum the costs within each level to obtain a set of per-level costs.
- Sum all the per-level costs to determine the total cost.

An example
For example: T(n) = 3T(⌊n/4⌋) + Θ(n²).
Tolerating the sloppiness:
- ignore the floor in the recurrence
- assume n is an exact power of 4
Rewrite the recurrence as T(n) = 3T(n/4) + cn².

The construction of a recursion tree 1/2
[Figure: step-by-step expansion of the recursion tree for T(n) = 3T(n/4) + cn². (a) The single node T(n). (b) The root costs cn² and has three children T(n/4). (c) Each child costs c(n/4)² and has three children T(n/16), giving nine T(n/16) nodes.]

The construction of a recursion tree 2/2
[Figure (d): the fully expanded tree. The root costs cn²; the second level costs 3 · c(n/4)² = (3/16)cn²; the third level costs (3/16)²cn²; and so on for log₄n levels, ending in 3^(log₄n) = n^(log₄3) leaves T(1) with total cost Θ(n^(log₄3)). Total: O(n²).]

Determine the cost of the tree 1/2
- The subproblem size for a node at depth i is n/4^i. Thus, the tree has log₄n + 1 levels (depths 0, 1, 2, ..., log₄n).
- Each node at depth i, for 0 ≤ i ≤ log₄n − 1, has a cost of c(n/4^i)². So the total cost over all nodes at depth i is 3^i · c(n/4^i)² = (3/16)^i cn².
- The last level, at depth log₄n, has 3^(log₄n) = n^(log₄3) nodes.
- The cost of the entire tree:

T(n) = cn² + (3/16)cn² + (3/16)²cn² + ... + (3/16)^(log₄n − 1) cn² + Θ(n^(log₄3))
     = Σ_{i=0}^{log₄n − 1} (3/16)^i cn² + Θ(n^(log₄3))
     = ((3/16)^(log₄n) − 1) / ((3/16) − 1) · cn² + Θ(n^(log₄3)).

Determine the cost of the tree 2/2
Taking advantage of a small amount of sloppiness, we can bound the sum by an infinite decreasing geometric series:

T(n) = Σ_{i=0}^{log₄n − 1} (3/16)^i cn² + Θ(n^(log₄3))
     < Σ_{i=0}^{∞} (3/16)^i cn² + Θ(n^(log₄3))
     = 1/(1 − 3/16) · cn² + Θ(n^(log₄3))
     = (16/13)cn² + Θ(n^(log₄3))
     = O(n²).

Thus, we have derived the guess T(n) = O(n²).
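The (16/13)cn² bound can be observed numerically. The sketch below assumes c = 1 and a base value T(n) = 1 for n < 4 (choices made only for this illustration), evaluates the recurrence with the floor restored, and checks that the ratio T(n)/n² stays below 16/13 ≈ 1.2308:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """T(n) = 3*T(floor(n/4)) + n^2, with T(n) = 1 for n < 4 (base assumed)."""
    if n < 4:
        return 1
    return 3 * T(n // 4) + n * n

# The recursion-tree analysis predicts T(n) < (16/13) * n^2 for c = 1.
n = 10**6
ratio = T(n) / n**2
assert 1.0 < ratio < 16 / 13
```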

Verify the correctness of our guess
Now we can use the substitution method to verify that our guess is correct: we want to show that T(n) ≤ dn² for some constant d > 0. Using the same constant c > 0 as before, we have

T(n) ≤ 3T(⌊n/4⌋) + cn²
     ≤ 3d⌊n/4⌋² + cn²
     ≤ 3d(n/4)² + cn²
     = (3/16)dn² + cn²
     ≤ dn²,

where the last step holds for d ≥ (16/13)c.

Another example
Another example: T(n) = T(n/3) + T(2n/3) + O(n).
[Figure: the recursion tree. The root costs cn; its children cost c(n/3) and c(2n/3); the next level costs c(n/9), c(2n/9), c(2n/9), c(4n/9); the longest (rightmost) path has height log_{3/2}n. Total: O(n lg n).]

Determine the cost of the tree
- The height of the tree is log_{3/2}n.
- The recursion tree has fewer than 2^(log_{3/2}n) = n^(log_{3/2}2) leaves.
- If the tree were complete, the total cost of all leaves would be Θ(n^(log_{3/2}2)), which is ω(n lg n). But the tree is not complete: not all levels contribute a cost of exactly cn, and levels toward the bottom contribute less.
- Thus, we derive the guess T(n) = O(n lg n).

Verify the correctness of our guess
We can verify the guess T(n) ≤ dn lg n by the substitution method. We have

T(n) ≤ T(n/3) + T(2n/3) + cn
     ≤ d(n/3)lg(n/3) + d(2n/3)lg(2n/3) + cn
     = (d(n/3)lg n − d(n/3)lg 3) + (d(2n/3)lg n − d(2n/3)lg(3/2)) + cn
     = dn lg n − d((n/3)lg 3 + (2n/3)lg(3/2)) + cn
     = dn lg n − d((n/3)lg 3 + (2n/3)lg 3 − (2n/3)lg 2) + cn
     = dn lg n − dn(lg 3 − 2/3) + cn
     ≤ dn lg n,

as long as d ≥ c/(lg 3 − (2/3)).
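A quick numerical check supports the Θ(n lg n) guess. The sketch below assumes c = 1, restores the floors, and takes T(n) = 1 for n < 3 as a base case (all assumptions made only for this illustration); the ratio T(n)/(n lg n) then stays within fixed constant bounds:

```python
from functools import lru_cache
from math import log2

@lru_cache(maxsize=None)
def T(n):
    """T(n) = T(floor(n/3)) + T(floor(2n/3)) + n; T(n) = 1 for n < 3 (assumed)."""
    if n < 3:
        return 1
    return T(n // 3) + T(2 * n // 3) + n

# Bounded above and below by constant multiples of n lg n.
for n in (10**4, 10**5, 10**6):
    ratio = T(n) / (n * log2(n))
    assert 0.5 < ratio < 1.5
```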

Outline
- The substitution method
- The recursion tree method
- The master method

The master method 1/2
The master method provides a "cookbook" method for solving recurrences of the form T(n) = aT(n/b) + f(n), where
- a ≥ 1 and b > 1 are constants, and
- f(n) is an asymptotically positive function.
It requires memorization of three cases, but then the solution of many recurrences can be determined quite easily.

The master method 2/2
The recurrence T(n) = aT(n/b) + f(n) describes the running time of an algorithm that divides a problem of size n into a subproblems, each of size n/b:
- each of the a subproblems is solved recursively in time T(n/b);
- the cost of dividing the problem and combining the results is f(n).
For example, the recurrence arising from the MERGE-SORT procedure has a = 2, b = 2, and f(n) = Θ(n).
Normally, we omit the floor and ceiling functions when writing divide-and-conquer recurrences of this form.

Master theorem
Master theorem: Let a ≥ 1 and b > 1 be constants, let f(n) be a function, and let T(n) be defined on the nonnegative integers by the recurrence

T(n) = aT(n/b) + f(n),

where we interpret n/b to mean either ⌊n/b⌋ or ⌈n/b⌉. Then T(n) can be bounded asymptotically as follows:
1. If f(n) = O(n^(log_b a − ε)) for some constant ε > 0, then T(n) = Θ(n^(log_b a)).
2. If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) lg n).
3. If f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, and if af(n/b) ≤ cf(n) for some constant c < 1 and all sufficiently large n, then T(n) = Θ(f(n)).
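For the common special case f(n) = Θ(n^d), the three cases reduce to comparing d with log_b a, and the theorem can be sketched as a small helper. This is a hypothetical illustration, not part of the chapter: it handles only polynomial driving functions (so a recurrence like T(n) = 3T(n/4) + n lg n is outside its scope), and the float tolerance for "d equals log_b a" is an implementation choice.

```python
from math import log

def master_theorem_polynomial(a, b, d):
    """Solve T(n) = a*T(n/b) + Theta(n^d), for a >= 1, b > 1, d >= 0.

    Simplified 'polynomial' case of the master theorem: compare the
    exponent d of f(n) = n^d against the critical exponent log_b(a).
    Returns the Theta bound as a string.
    """
    crit = log(a) / log(b)            # critical exponent log_b(a)
    if abs(d - crit) < 1e-12:         # f(n) = Theta(n^(log_b a)): case 2
        return f"Theta(n^{d:g} * lg n)"
    if d < crit:                      # f(n) polynomially smaller: case 1
        return f"Theta(n^{crit:g})"
    # d > crit: case 3; regularity a*(n/b)^d <= c*n^d holds automatically
    # with c = a / b^d < 1 whenever d > log_b(a).
    return f"Theta(n^{d:g})"

# Merge sort: a = 2, b = 2, f(n) = n  ->  Theta(n lg n).
assert master_theorem_polynomial(2, 2, 1) == "Theta(n^1 * lg n)"
```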

Intuition behind the master method
Intuitively, the solution to the recurrence is determined by comparing the two functions f(n) and n^(log_b a).
- Case 1: if n^(log_b a) is asymptotically larger than f(n) by a factor of n^ε for some constant ε > 0, then the solution is T(n) = Θ(n^(log_b a)).
- Case 2: if n^(log_b a) is asymptotically equal to f(n), then the solution is T(n) = Θ(n^(log_b a) lg n).
- Case 3: if n^(log_b a) is asymptotically smaller than f(n) by a factor of n^ε, and f(n) satisfies the "regularity condition" af(n/b) ≤ cf(n), then the solution is T(n) = Θ(f(n)).
The three cases do not cover all the possibilities for f(n).

Using the master method 1/3
Example 1: T(n) = 9T(n/3) + n.
- For this recurrence, we have a = 9, b = 3, f(n) = n. Thus, n^(log_b a) = n^(log₃9) = Θ(n²).
- Since f(n) = O(n^(log₃9 − ε)) with ε = 1, we can apply case 1. The solution is T(n) = Θ(n²).
Example 2: T(n) = T(2n/3) + 1.
- For this recurrence, we have a = 1, b = 3/2, f(n) = 1. Thus, n^(log_b a) = n^(log_{3/2}1) = n⁰ = 1.
- Since f(n) = Θ(n^(log_b a)) = Θ(1), we can apply case 2. The solution is T(n) = Θ(lg n).
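Example 1 can also be checked exactly. Taking f(n) = n literally and assuming T(1) = 1 (a base value not given in the slide), the recurrence has the closed form T(n) = (3n² − n)/2 on powers of 3, confirming Θ(n²):

```python
def T(n):
    """T(n) = 9*T(n/3) + n with T(1) = 1 (assumed), for n a power of 3."""
    if n == 1:
        return 1
    return 9 * T(n // 3) + n

# Closed form on powers of 3: T(n) = (3n^2 - n)/2 = Theta(n^2).
for k in range(8):
    n = 3**k
    assert T(n) == (3 * n * n - n) // 2
```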

Using the master method 2/3
Example 3: T(n) = 3T(n/4) + n lg n.
- For this recurrence, we have a = 3, b = 4, f(n) = n lg n. Thus, n^(log_b a) = n^(log₄3) = O(n^0.793).
- f(n) = Ω(n^(log₄3 + ε)) holds with ε ≈ 0.2.
- For sufficiently large n, the regularity condition holds: af(n/b) = 3(n/4)lg(n/4) ≤ (3/4)n lg n = cf(n) for c = 3/4.
- Hence case 3 applies. The solution is T(n) = Θ(n lg n).

Using the master method 3/3
Example 4: T(n) = 2T(n/2) + n lg n.
- For this recurrence, we have a = 2, b = 2, f(n) = n lg n.
- The function f(n) = n lg n is asymptotically larger than n^(log_b a) = n^(log₂2) = n. But it is not polynomially larger, since the ratio f(n)/n^(log_b a) = (n lg n)/n = lg n is asymptotically less than n^ε for every positive constant ε.
- Consequently, the recurrence falls into the gap between case 2 and case 3.
(If g(n) is asymptotically larger than f(n) by a factor of n^ε for some constant ε > 0, then we say g(n) is polynomially larger than f(n).)
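The gap is not just a technicality: the true solution of example 4 is Θ(n lg² n), which neither case 2 nor case 3 would produce. Assuming T(1) = 0 and taking n lg n literally (choices made only for this illustration), the recurrence has the exact closed form T(n) = n · lg n · (lg n + 1)/2 on powers of 2:

```python
def T(n):
    """T(n) = 2*T(n/2) + n*lg(n) with T(1) = 0 (assumed), for n a power of 2."""
    if n == 1:
        return 0
    lg = n.bit_length() - 1          # lg n, exact for powers of two
    return 2 * T(n // 2) + n * lg

# Exact solution: T(n) = n * lg n * (lg n + 1) / 2 = Theta(n lg^2 n).
for k in range(12):
    n = 1 << k
    assert T(n) == n * k * (k + 1) // 2
```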