Recurrences

Objective
- express running time as a recursive function
- solve the recurrence for the order of growth
- methods: substitution; iteration / recursion tree; MASTER method
- prerequisites: mathematical induction, recursive definitions; arithmetic manipulations, series, products

Pipeline: Problem -> Algorithm -> Running-Time Recurrence -> Solve the Recurrence

Running time
We will call it T(n): the number of computational steps required to run the algorithm/program on an input of size n. We are interested in the order of growth, not exact values:
- for example, T(n) = Θ(n^2) means quadratic running time
- T(n) = O(n log n) means T(n) grows no faster than CONST*n*log(n)
For simple problems, we know the answer right away:
- example: finding the MAX of an array
- solution: traverse the array, keeping track of the max encountered so far
- running time: one step for each array element, so n steps for an array of size n
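As a runnable sketch of the linear scan (Python; the function name is our choice):

```python
def linear_max(a):
    """Traverse the array once, keeping track of the max encountered so far:
    one step per element, so n steps for an array of size n."""
    m = a[0]
    for x in a[1:]:
        if x > m:
            m = x
    return m

print(linear_max([3, 1, 4, 1, 5, 9, 2]))  # prints 9
```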

Running time for complex problems
Complex problems involve solving subproblems, usually:
- init/prepare/preprocess, define subproblems
- solve the subproblems
- put the subproblem results together
Thus T(n) cannot be computed straightforwardly; instead, follow the subproblem decomposition.

Running time for complex problems
Often, subproblems are the same problem for a smaller input size. For example, max(array) can be solved as:
- split the array into array_left, array_right
- solve max(array_left), max(array_right)
- combine the results to get the global max

Max(A = [a_1, a_2, ..., a_n])
  if (n == 1) return a_1
  k = n/2
  max_left  = Max([a_1, a_2, ..., a_k])
  max_right = Max([a_{k+1}, a_{k+2}, ..., a_n])
  if (max_left > max_right) return max_left
  else return max_right
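The Max pseudocode above, as a runnable Python sketch (the name `recursive_max` is ours):

```python
def recursive_max(a):
    """Divide-and-conquer MAX, mirroring the Max pseudocode:
    split, solve each half recursively, combine with one comparison."""
    n = len(a)
    if n == 1:
        return a[0]
    k = n // 2
    max_left = recursive_max(a[:k])
    max_right = recursive_max(a[k:])
    return max_left if max_left > max_right else max_right

print(recursive_max([2, 9, 4, 7, 1]))  # prints 9
```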

Running time for complex problems
Many problems can be solved using a divide-and-conquer strategy: prepare, solve subproblems, combine results. The running time can then be written recursively:
- T(n) = time(preparation) + time(subproblems) + time(combine)
- for recursive MAX: T(n) = 2*T(n/2) + O(1)
  - 2 subproblems of size n/2: max(array_left), max(array_right)
  - constant time to pick the maximum out of the two results, max Left and max Right

Recurrence examples
- T(n) = 2T(n/2) + O(1)
- T(n) = 2T(n/2) + O(n): 2 subproblems of size n/2 each, plus O(n) steps to combine results
- T(n) = 4T(n/3) + n: 4 subproblems of size n/3 each, plus n steps to combine results
- T(n) = T(n/4) + T(n/2) + n^2: a subproblem of size n/4, another of size n/2; n^2 to combine
We want to solve such recurrences to obtain the order of growth of the function T.
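These recurrences can also be evaluated numerically to get a feel for the growth. A sketch under assumptions of our own: treat O(1) as exactly 1 step, O(n) as exactly n steps, T(1) = 1, and use integer halving; the function names are hypothetical.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def t_const(n):   # T(n) = 2T(n/2) + O(1)
    return 1 if n <= 1 else 2 * t_const(n // 2) + 1

@lru_cache(maxsize=None)
def t_linear(n):  # T(n) = 2T(n/2) + O(n)
    return 1 if n <= 1 else 2 * t_linear(n // 2) + n

@lru_cache(maxsize=None)
def t_mixed(n):   # T(n) = T(n/4) + T(n/2) + n^2
    return 1 if n <= 1 else t_mixed(n // 4) + t_mixed(n // 2) + n * n
```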

Substitution method
T(n) = 4T(n/2) + n
STEP 1: guess the solution's order of growth, T(n) = O(n^3)
- that means there is a constant c and a starting value n_0 such that T(n) <= c*n^3 for any n >= n_0
STEP 2: verify by induction
- assume T(k) <= c*k^3 for k < n
- induction step: prove that T(n) <= c*n^3, using T(k) <= c*k^3 for k < n

T(n) = 4T(n/2) + n
     <= 4c(n/2)^3 + n
     = (c/2)n^3 + n
     = cn^3 - ((c/2)n^3 - n)
     <= cn^3,  if (c/2)n^3 - n > 0; choose c >= 2

Substitution method
STEP 3: identify the constants; in our case c = 2 works, so we proved T(n) = O(n^3).
That's correct, but the result is too weak:
- technically we say the "cubic" bound O(n^3) is too loose
- we can prove a better, "quadratic" bound: T(n) = O(n^2)
- our guess was wrong! (too big)
Let's try again:
STEP 1: guess T(n) = O(n^2)
STEP 2: verify by induction
- assume T(k) <= c*k^2 for k < n
- induction step: prove that T(n) <= c*n^2, using T(k) <= c*k^2 for k < n

Substitution method
Fallacious argument:
T(n) = 4T(n/2) + n
     <= 4c(n/2)^2 + n
     = cn^2 + n
     = O(n^2)      <- wrong step
     <= cn^2       <- false
We can't prove T(n) = O(n^2) this way: the induction needs the same constant c at the end, and cn^2 + n <= cn^2 is false.
- maybe it's not true? was the guess O(n^2) too low?
- or maybe we don't have the right proof idea
Common trick: if the math doesn't work out, make a stronger assumption (subtract a lower-degree term):
- assume instead T(k) <= c_1*k^2 - c_2*k, for k < n

Substitution method
T(n) = 4T(n/2) + n
     <= 4(c_1*(n/2)^2 - c_2*(n/2)) + n
     = c_1*n^2 - 2c_2*n + n
     = c_1*n^2 - c_2*n - (c_2*n - n)
     <= c_1*n^2 - c_2*n,  for c_2 >= 1
So we can prove T(n) = O(n^2), but is that asymptotically tight? Maybe we can prove a lower upper bound, like O(n log n)? NOPE. To make sure it's the asymptote, prove that it's also a lower bound:
- T(n) = Ω(n^2), i.e., there is a different constant d s.t. T(n) >= d*n^2

Substitution method: lower bound induction step
T(n) = 4T(n/2) + n
     >= 4d(n/2)^2 + n
     = dn^2 + n
     >= dn^2
Now we know the bound is asymptotically tight: T(n) = Θ(n^2).
It is hard to make the initial guess Θ(n^2); we need another method to educate our guess.
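The two bounds we just proved can be sanity-checked numerically. A sketch, not part of the proof, assuming T(1) = 1 and n a power of two:

```python
def T(n):
    """T(n) = 4*T(n/2) + n with T(1) = 1, defined on powers of two."""
    return 1 if n == 1 else 4 * T(n // 2) + n

# Check the upper bound T(n) <= c1*n^2 - c2*n with c1 = 2, c2 = 1,
# and the lower bound T(n) >= d*n^2 with d = 1, on a range of inputs.
for k in range(1, 12):
    n = 2 ** k
    assert T(n) <= 2 * n * n - n
    assert n * n <= T(n)
```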

Iteration method
T(n) = n + 4T(n/2)
     = n + 4(n/2 + 4T(n/4)) = n + 2n + 4^2*T(n/2^2)
     = n + 2n + 4^2*(n/2^2 + 4T(n/2^3)) = n + 2n + 2^2*n + 4^3*T(n/2^3)
     = ...
     = n + 2n + 2^2*n + ... + 2^(k-1)*n + 4^k*T(n/2^k)
     = sum_{i=0..k-1} 2^i*n + 4^k*T(n/2^k)
Stopping condition: T(...) = T(1); solve for k: we want n/2^k = 1, i.e., k = log(n). Then
     = n * sum_{i=0..log(n)-1} 2^i + 4^log(n)*T(1)
     = n*(2^log(n) - 1) + n^2*T(1)
     = n(n-1) + n^2*T(1) = Θ(n^2)
The math can be messy: recap sums, products, series, logarithms. The iteration method is good for a guess, but usually unreliable for an exact result: use iteration for the guess, and substitution for the proof.
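The closed form obtained from the unrolling can be checked numerically (a sketch assuming T(1) = 1 and n a power of two):

```python
def T(n):
    """T(n) = n + 4*T(n/2) with T(1) = 1, defined on powers of two."""
    return 1 if n == 1 else n + 4 * T(n // 2)

# Closed form from the unrolling, with k = log(n):
#   T(n) = n*(2^k - 1) + 4^k * T(1) = n(n-1) + n^2
for k in range(1, 12):
    n = 2 ** k
    assert T(n) == n * (n - 1) + n * n
```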

Iteration method: visual tree
[recursion tree for T(n) = 4T(n/2) + n: the root contributes +n; its 4 children T(n/2) contribute +2n in total; the 16 grandchildren T(n/4) contribute +4n; and so on]
- compute the tree depth, i.e., how many levels until the nodes become leaves T(1): log(n)
- compute the total number of leaves T(1) in the tree (last level): 4^log(n) = n^2
- compute the total additional work across levels: n + 2n + 4n + ... = n(n-1)
- add up the work: 4^log(n) + n(n-1) = Θ(n^2)

Iteration Method: derivation
T(n) = n^2 + T(n/2) + T(n/4)
     = n^2 + (n/2)^2 + T(n/4) + T(n/8) + (n/4)^2 + T(n/8) + T(n/16)
     = n^2 + (5/16)n^2 + T(n/4) + 2T(n/8) + T(n/16)
     = n^2 + (5/16)n^2 + (n/4)^2 + T(n/8) + T(n/16) + 2(n/8)^2 + 2T(n/16) + 2T(n/32) + (n/16)^2 + T(n/32) + T(n/64)
     = n^2 + (5/16)n^2 + (5/16)^2*n^2 + T(n/8) + 3T(n/16) + 3T(n/32) + T(n/64)
     = ...
     = n^2 + (5/16)n^2 + (5/16)^2*n^2 + (5/16)^3*n^2 + T(n/16) + 4T(n/32) + 6T(n/64) + 4T(n/128) + T(n/256)

Iteration Method: tree
[recursion tree for T(n) = n^2 + T(n/2) + T(n/4): each node T(m) has children T(m/2) and T(m/4); the levels contribute +n^2, +(5/16)n^2, +(5/16)^2*n^2, +(5/16)^3*n^2, ...]
- depth: at most log(n)
- leaves: at most 2^log(n) = n; computational cost n*T(1) = O(n)
- work: n^2 + (5/16)n^2 + (5/16)^2*n^2 + (5/16)^3*n^2 + ... <= n^2 * sum_{i=0..inf} (5/16)^i = (16/11)n^2 = O(n^2)
- total: Θ(n^2)
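The geometric-series bound can be sanity-checked numerically. A sketch under assumptions of our own: T(n) = 1 for n <= 1 and integer halving.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """T(n) = n^2 + T(n/2) + T(n/4), with T(n) = 1 for n <= 1."""
    return 1 if n <= 1 else n * n + T(n // 2) + T(n // 4)

# The tree argument bounds the work by n^2 * sum (5/16)^i = (16/11)*n^2,
# plus O(n) for the leaves; the computed values should respect that bound.
for k in range(1, 20):
    n = 2 ** k
    assert T(n) < (16 / 11) * n * n + 2 * n
```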

Master Theorem - simple
Simple general case: T(n) = a*T(n/b) + Θ(n^c)
Let R = a/b^c; compare R with 1, or equivalently c with log_b(a):
- case 1 (R > 1, i.e., c < log_b(a)): T(n) = Θ(n^log_b(a))
- case 2 (R = 1, i.e., c = log_b(a)): T(n) = Θ(n^c * log n)
- case 3 (R < 1, i.e., c > log_b(a)): T(n) = Θ(n^c)
Examples:
- MergeSort: T(n) = 2T(n/2) + Θ(n); a=2, b=2, c=1: case 2, T(n) = Θ(n log n)
- Strassen's: T(n) = 7T(n/2) + Θ(n^2); a=7, b=2, c=2: case 1, T(n) = Θ(n^log_2(7))
- Binary Search: T(n) = T(n/2) + Θ(1); a=1, b=2, c=0: case 2, T(n) = Θ(log n)
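The three-way comparison of c with log_b(a) can be captured in a small helper (a hypothetical `master_simple` of our own; it returns the order of growth as a string):

```python
import math

def master_simple(a, b, c):
    """Simple Master Theorem for T(n) = a*T(n/b) + Theta(n^c):
    compare c with log_b(a) and report the order of growth."""
    crit = math.log(a, b)  # log_b(a)
    if math.isclose(c, crit):
        return f"Theta(n^{c} log n)"   # case 2
    if c < crit:
        return f"Theta(n^{crit:.3g})"  # case 1
    return f"Theta(n^{c})"             # case 3

print(master_simple(2, 2, 1))  # MergeSort: Theta(n^1 log n)
print(master_simple(7, 2, 2))  # Strassen: exponent log2(7) ~ 2.81
print(master_simple(1, 2, 0))  # Binary search: Theta(n^0 log n) = Theta(log n)
```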

Master Theorem - why 3 cases
Summing the work per level of the recursion tree gives a geometric progression with ratio R = a/b^c; it comes down to R being < 1, = 1, or > 1. So three cases.

Master Theorem
General case: T(n) = a*T(n/b) + f(n)
- CASE 1: f(n) = O(n^(log_b(a) - ε)) for some ε > 0  =>  T(n) = Θ(n^log_b(a))
- CASE 2: f(n) = Θ(n^log_b(a) * log^k(n))  =>  T(n) = Θ(n^log_b(a) * log^(k+1)(n))
- CASE 3: f(n) = Ω(n^(log_b(a) + ε)) for some ε > 0, and a*f(n/b)/f(n) < 1 (regularity condition)  =>  T(n) = Θ(f(n))

Master Theorem Example
Recurrence: T(n) = 4T(n/2) + Θ(n^2 * log n)
Master Theorem: a=4, b=2, f(n) = n^2 * log n
- f(n) / n^log_b(a) = f(n)/n^2 = log n, so case 2 with k=1
Solution: T(n) = Θ(n^2 * log^2(n))

Master Theorem Example
T(n) = 4T(n/2) + Θ(n^3)
Master Theorem: a=4, b=2, f(n) = n^3
- f(n) / n^log_b(a) = f(n)/n^2 = n, so case 3
- check the case 3 condition: 4f(n/2)/f(n) = 4(n/2)^3/n^3 = 1/2 < 1-ε
Solution: T(n) = Θ(n^3)
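The case 3 regularity check is easy to script (a small sketch; `regularity_ratio` is our own name):

```python
def regularity_ratio(f, a, b, n):
    """Case 3 regularity check: a*f(n/b)/f(n), which should stay below 1."""
    return a * f(n / b) / f(n)

# For T(n) = 4T(n/2) + n^3 the ratio is 4*(n/2)^3/n^3 = 1/2 for every n.
assert abs(regularity_ratio(lambda n: n ** 3, 4, 2, 1024.0) - 0.5) < 1e-9
```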

NON-Master Theorem Example
T(n) = 4T(n/2) + n^2/log(n); f(n) = n^2/log(n)
f(n) / n^log_b(a) = f(n)/n^2 = 1/log(n)
- case 1: fails, since 1/log(n) is not O(n^(-ε)) for any ε > 0
- case 2: fails, since 1/log(n) is not Θ(log^k(n)) for any k >= 0
- case 3: fails, since f(n) is not Ω(n^(2+ε))
No case applies, so we can't use the Master Theorem. Use the iteration method for a guess, and substitution for a proof - see attached pdf.