COMPUTER ALGORITHMS Athasit Surarerks.

Introduction

EUCLID's GAME Two players move in turn. On each move, a player writes on the board a positive integer equal to the difference of two numbers already on the board; this number must be new. The player who cannot move loses the game.

ADDITION's GAME Players I and II alternately choose integers, each choice being one of the integers 1, 2, ..., k, and each choice being made with knowledge of all preceding choices. As soon as the sum of the chosen integers exceeds N, the last player to choose loses.

WORLD PUZZLE Four people want to cross a bridge; they all begin on the same side. You have 17 minutes to get them all across to the other side. It is night, and they have one flashlight. A maximum of two people can cross the bridge at one time. Any party that crosses must have the flashlight with them. Each person walks at a different speed: 1, 2, 5, and 10 minutes to cross. A pair must walk together at the rate of the slower person.

Problem solving Problem → Abstract model → Transformed model → Solution

Problem solving Buy 10 pens (5$ each) and sell them at 8$ each; at least how many pens have to be sold to recover the 50$ cost? Abstract model: min β such that 8β ≥ 50. Transformed model: β = 7. Solution: 7 pens.

Complex number Given two complex numbers X = a + bj and Y = c + dj, find an algorithm to perform the product XY = (ac - bd) + (ad + bc)j. The cost of a multiplier is 1 USD and the cost of an adder is 0.01 USD, so the schoolbook method (four multiplications, two additions) costs 4.02 USD.

Complex number Given two complex numbers X = a + bj and Y = c + dj, find a cheaper algorithm to perform the product XY. Since (a+b)(c+d) = ac + ad + bc + bd, the imaginary part ad + bc can be computed as (a+b)(c+d) - ac - bd, so three multiplications and five additions suffice: 3.05 USD.
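
The trick can be sketched in Python (a minimal illustration; the function names are ours, not the slides'):

def product_4mul(a, b, c, d):
    # Schoolbook product (a+bj)(c+dj): 4 multiplications, 2 additions -> 4.02 USD.
    return (a*c - b*d, a*d + b*c)

def product_3mul(a, b, c, d):
    # Since (a+b)(c+d) = ac + ad + bc + bd, the imaginary part ad + bc
    # equals (a+b)(c+d) - ac - bd: 3 multiplications, 5 additions -> 3.05 USD.
    ac, bd = a*c, b*d
    s = (a + b) * (c + d)
    return (ac - bd, s - ac - bd)

Both functions return the pair (real part, imaginary part) of XY.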

Graph coloring Chromatic number is 2.

Problem solving Communication: language translation. Function: composition of functions. Algorithms.

Rabbits on an island By Leonardo di Pisa, 13th century. A pair of rabbits does not breed until they are two months old; after that, each pair produces another pair each month. (The slides trace the population month by month, from month 0 through month 5.) Assuming that no rabbits ever die, how many pairs of rabbits are there after n months?
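
The month-by-month counts follow the Fibonacci recurrence F(n) = F(n-1) + F(n-2): the pairs alive last month, plus one newborn pair for every pair at least two months old. A minimal Python sketch (assuming, as the slide does, that no rabbits die):

def rabbit_pairs(n):
    # pairs after n months: 1, 1, 2, 3, 5, 8, ...
    a, b = 1, 1                  # months 0 and 1
    for _ in range(2, n + 1):
        a, b = b, a + b          # newborns come only from pairs aged >= 2 months
    return a if n == 0 else b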

Problem size Traveling salesperson, minimum spanning tree, string matching, independent set, sum of subset, linear partition.

Organization
Introduction: problems, asymptotic notations.
Analysis of algorithm: worst-case analysis, average-case analysis, amortized analysis.
Design of algorithm: divide-and-conquer, dynamic programming, approximation algorithms.
Complexity of algorithm.

ALGOL statements
program-name(variables listed) begin statement(s) end
variable ← expression
if condition then statement(s) else statement(s) endif
while condition do statement(s) enddo
for variable ← initial-value step size until final-value do statement(s) enddo
goto label
return variables listed
input variables listed
output variables listed
any other miscellaneous statement

Maximum-Minimum search
001 begin
002   answer ← A[1]
003   for I ← 2 step 1 until n do
004     if A[I] > answer then answer ← A[I] endif
005   enddo
006   return answer
007 end
Time used: T_A1(n) = t_002 + n·t_003 + (n-1)·t_004 + t_006.

Maximum-Minimum search
011 max-search(A[1..n])
012 begin
013   if n = 1 then return A[1] endif
014   answer ← max-search(A[1..n-1])
015   if A[n] > answer then return A[n] else return answer endif
016 end
Time used: T_A2(n) = T_A2(n-1) + t_013 + t_014 + t_015 = n·t_013 + (n-1)·(t_014 + t_015).

Efficiency The efficiency of an algorithm is the amount of resources used to find an answer. It is usually measured in terms of theoretical computations (such as comparisons or data moves), the memory used, the number of messages passed, the number of disk accesses, etc.

Efficiency Most often we shall be interested in the rate of growth of the time or space required to solve larger and larger instances of a problem. We would like to associate with a problem an integer, called the size of the problem, which is a measure of the quantity of input data.

Efficiency The time needed by an algorithm expressed as a function of the size of a problem is called the time complexity of the algorithm. The limiting behavior of the complexity as size increases is called the asymptotic time complexity. Analogous definitions can be made for space complexity and asymptotic space complexity.

Complexity
Algorithm   Time complexity   Growth rate
A_1         1000n             n
A_2         100·n·log n       n log n
A_3         10n^2             n^2
A_4         n^3               n^3
A_5         2^n               2^n

Growth rate (figure: plot comparing the growth of 2^n, n^2, and n log n)

Growth rate Let f and g be two functions. We say that f grows more slowly than g, written f(n) < g(n), if and only if lim_{n→∞} f(n)/g(n) = 0.

Exercises Compare the growth rates of 0.5^n, 1, log_10 n, n, and 10^n. Answer: 0.5^n < 1 < log n < n < 10^n.

Exercises Compare the growth rates of ln^9 n and n^0.1, and of n^100 and 2^n.

L'Hôpital's rule Let f and g be two functions such that lim_{n→∞} f(n) = ∞ and lim_{n→∞} g(n) = ∞. Then lim_{n→∞} f(n)/g(n) = lim_{n→∞} f'(n)/g'(n).
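
For instance, this settles the first comparison in the exercise above: both ln^9 n and n^0.1 tend to infinity, and one application of the rule gives lim_{n→∞} ln^9 n / n^0.1 = lim_{n→∞} (9 ln^8 n / n) / (0.1·n^{-0.9}) = lim_{n→∞} 90 ln^8 n / n^0.1. Each application lowers the power of the logarithm by one while n^0.1 survives, so after nine applications the limit is 0: ln^9 n grows more slowly than n^0.1. The same repeated argument shows that n^100 grows more slowly than 2^n.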

Asymptotic notations It is the asymptotic complexity of an algorithm which ultimately determines the size of problems that can be solved by the algorithm.

ω A theoretical measure of the execution of an algorithm, usually the time or memory needed, given the problem size n, which is usually the number of items. Informally, saying some equation f(n) = ω(g(n)) means g(n) becomes insignificant relative to f(n) as n goes to infinity. More formally, it means that for any positive constant c, there exists a constant k such that 0 ≤ c·g(n) < f(n) for all n ≥ k. The value of k must not depend on n, but may depend on c. That is, ω(g(n)) = {f(n) | lim_{n→∞} f(n)/g(n) = ∞}.

Θ A theoretical measure of the execution of an algorithm, usually the time or memory needed, given the problem size n, which is usually the number of items. Informally, saying some equation f(n) = Θ(g(n)) means it is within a constant multiple of g(n). More formally, it means there are positive constants c_1, c_2, and k such that 0 ≤ c_1·g(n) ≤ f(n) ≤ c_2·g(n) for all n ≥ k. The values of c_1, c_2, and k must be fixed for the function f and must not depend on n. That is, Θ(g(n)) = {f(n) | lim_{n→∞} f(n)/g(n) = c, c ≠ 0, c ≠ ∞}.

Big-O A theoretical measure of the execution of an algorithm, usually the time or memory needed, given the problem size n, which is usually the number of items. Informally, saying some equation f(n) = O(g(n)) means it is less than some constant multiple of g(n). More formally, it means there are positive constants c and k such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ k. The values of c and k must be fixed for the function f and must not depend on n. That is, O(g(n)) = {f(n) | lim_{n→∞} f(n)/g(n) = c, c ≠ ∞}, and O(g(n)) = o(g(n)) ∪ Θ(g(n)).

Big-O The importance of this measure can be seen in trying to decide whether an algorithm is adequate but may just need a better implementation, or whether the algorithm will always be too slow on a big enough input. For instance, quicksort, which is O(n log n) on average, running on a small desktop computer can beat bubble sort, which is O(n^2), running on a supercomputer if there are a lot of numbers to sort. To sort 1,000,000 numbers, quicksort takes 6,000,000 steps on average, while bubble sort takes 1,000,000,000,000 steps!

Ω A theoretical measure of the execution of an algorithm, usually the time or memory needed, given the problem size n, which is usually the number of items. Informally, saying some equation f(n) = Ω(g(n)) means it is more than some constant multiple of g(n). More formally, it means there are positive constants c and k such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ k. The values of c and k must be fixed for the function f and must not depend on n. Ω(g(n)) = ω(g(n)) ∪ Θ(g(n)).

Some properties
Asymptotic   Transitivity   Reflexivity   Symmetry
Little-o     Yes            No            No
ω            Yes            No            No
Θ            Yes            Yes           Yes
Big-O        Yes            Yes           No
Ω            Yes            Yes           No

Transpose symmetry f(n) = O(g(n)) if and only if g(n) = Ω(f(n)); f(n) = o(g(n)) if and only if g(n) = ω(f(n)).

Asymptotic Notations (figure: f(x) plotted between c_1·g(x) and c_2·g(x)) f(x) = Ω(g(x)): for all x ≥ k_1, f(x) ≥ c_1·g(x). f(x) = O(g(x)): for all x ≥ k_2, f(x) ≤ c_2·g(x). f(x) = Θ(g(x)): for all x ≥ max(k_1, k_2), c_1·g(x) ≤ f(x) ≤ c_2·g(x).

Asymptotic Notations SPECIAL ORDERS OF GROWTH
constant : Θ(1)
logarithmic : Θ(log n)
polylogarithmic : Θ(log^c n), c ≥ 1
sublinear : Θ(n^a), 0 < a < 1
linear : Θ(n)
quadratic : Θ(n^2)
polynomial : Θ(n^c), c ≥ 1
exponential : Θ(c^n), c > 1

Exercises Show that log n! = Θ(n log n) and that log_a n = Θ(log_b n).
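
A sketch for the first claim: log n! = Σ_{1≤i≤n} log i ≤ n log n, so log n! = O(n log n); and log n! ≥ Σ_{n/2≤i≤n} log i ≥ (n/2)·log(n/2) = Ω(n log n). The second claim follows from the change-of-base identity log_a n = log_b n / log_b a, a constant multiple.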

Analysis of algorithm

Al-Khawarizmi An algorithm is a detailed, unambiguous sequence of actions to perform to accomplish some task. The word is named after the Persian mathematician Abu Ja'far Mohammed ibn Mûsâ Al-Khawarizmi, who wrote a book of arithmetic rules dating from about 825 A.D.

CRITERIA Correctness Amount of work done Amount of space used Simplicity Optimality

Maximum-Minimum search
001 begin
002   answer ← A[1]
003   for I ← 2 step 1 until n do
004     if A[I] > answer then answer ← A[I] endif
005   enddo
006   return answer
007 end
Time used: T_A1(n) = t_002 + n·t_003 + (n-1)·t_004 + t_006.
T_A1(n) = Θ(t_002) + n·Θ(t_003) + (n-1)·Θ(t_004) + Θ(t_006) = Θ(n).

Maximum-Minimum search
011 max-search(A[1..n])
012 begin
013   if n = 1 then return A[1] endif
014   answer ← max-search(A[1..n-1])
015   if A[n] > answer then return A[n] else return answer endif
016 end
Time used: T_A2(n) = T_A2(n-1) + t_013 + t_014 + t_015 = n·t_013 + (n-1)·(t_014 + t_015).
T_A2(n) = n·Θ(t_013) + (n-1)·(Θ(t_014) + Θ(t_015)) = Θ(n).

RUNNING TIME ANALYSIS To simplify running time analysis: count only barometer instructions, and use asymptotic analysis.

Basic instruction To analyze the total time used by an algorithm, we usually translate each operation into basic operations, each of which has constant time complexity, Θ(1). Example: which statements are basic?
if a < b then a ← a+1 else a ← a-1 endif : YES
if b < c then e ← c! endif : NO
d ← c^n : NO
goto 005 : YES
call max(A[1..n]) : NO
return subroutine(a,b) : NO

Sorting Algorithm
001 sort(A[1..n])
002 begin
003   for last ← n step -1 until 2 do
004     for i ← 2 step 1 until last do
005       if A[i] < A[i-1] then buffer ← A[i]; A[i] ← A[i-1]; A[i-1] ← buffer endif
006     enddo
007   enddo
008 end
Operation   Times                        Complexity
003         n                            Θ(n)
004         Σ_{2≤last≤n} last            Θ(n(n+1)/2 - 1)
005         Σ_{2≤last≤n} (last-1)        Θ(n(n-1)/2)
Total                                    Θ(n^2)
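
The listing is bubble sort; a runnable Python transcription (0-based indices, same Θ(n^2) comparison count):

def bubble_sort(A):
    n = len(A)
    for last in range(n - 1, 0, -1):      # unsorted prefix is A[0..last]
        for i in range(1, last + 1):      # bubble the maximum of A[0..last] into A[last]
            if A[i] < A[i - 1]:
                A[i], A[i - 1] = A[i - 1], A[i]   # the buffer swap of line 005
    return A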

Sequencing Given two operations S_1 and S_2 such that S_2 is processed sequentially after S_1, the total time complexity depends on the complexity of each operation: Upper bound of S_1;S_2 = max(upper bound of S_1, upper bound of S_2). Lower bound of S_1;S_2 = max(lower bound of S_1, lower bound of S_2).

Sequencing
S_1 (low/up)           S_2 (low/up)             S_1;S_2 (low/up)
Θ(n)   Ω(n)/O(n)       Θ(n)    Ω(n)/O(n)        Ω(n)/O(n) = Θ(n)
Θ(n)   Ω(n)/O(n)       O(n)    unknown/O(n)     Ω(n)/O(n) = Θ(n)
Θ(n)   Ω(n)/O(n)       O(n^2)  unknown/O(n^2)   Ω(n)/O(n^2)
Θ(n^2) Ω(n^2)/O(n^2)   Θ(n)    Ω(n)/O(n)        Ω(n^2)/O(n^2) = Θ(n^2)

If statement Given three commands S_1, S_2 and S_3, let t_1, t_2 and t_3 be the time complexities of S_1, S_2 and S_3 respectively. Consider the statement: if S_1 then S_2 else S_3 endif. Let T be the time complexity of this statement; we have: Lower bound of T = t_1 + min(t_2, t_3). Upper bound of T = t_1 + max(t_2, t_3).

If statement Example: Find the time complexity of if S_1 then S_2 else S_3 endif if t_1 = Θ(2^n), t_2 = O(n) and t_3 = O(n log n). Lower bound of T = Ω(2^n) + min(unknown, unknown) = Ω(2^n). Upper bound of T = O(2^n) + max(O(n), O(n log n)) = O(2^n) + O(n log n) = O(2^n). Time complexity is Θ(2^n).

If statement Exercise: Find the time complexity of
if Θ(n log n) then O(n^2) else Θ(log n) endif : O(n^2)
if O(n^3) then O(n log n) else Θ(2^n) endif : O(2^n)
if O(n!) then Θ(n^2) else Θ(n) endif : O(n!)
if Θ(n) then O(n log n) else Θ(n log n) endif : O(n log n)

For-loop Statement The time complexity can be computed by summing the time used by each iteration. Let t_i be the time used in the i-th iteration. Then the total time used is equal to Σ_{all i} t_i.

For-loop Statement Example: Find the time complexity of
001 for last ← n step -1 until 2 do
002   for i ← 2 step 1 until last do
003     if A[i] < A[i-1] then swap(A[i], A[i-1]) endif
004   enddo
005 enddo
Time complexity = Σ_{2≤last≤n} Σ_{2≤i≤last} Θ(1) = Σ_{2≤last≤n} Θ(last) = Θ(Σ_{2≤last≤n} last) = Θ(n(n+1)/2 - 1) = Θ(n^2).

For-loop Statement Example: Find the time complexity of
001 for i ← 1 step 1 until m do
002   for j ← m step -1 until 1 do
003     for k ← 1 step 1 until j do
004       sum ← sum + i + j + k
005     enddo
006   enddo
007 enddo
Time complexity = Σ_{1≤i≤m} Σ_{1≤j≤m} Σ_{1≤k≤j} Θ(1) = Σ_{1≤i≤m} Σ_{1≤j≤m} Θ(j) = Σ_{1≤i≤m} Θ(m^2) = Θ(m^3).
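
A quick numeric check of that sum in Python: line 004 runs Σ_{i} Σ_{j} j = m·m(m+1)/2 times, which is Θ(m^3).

def triple_loop_count(m):
    count = 0
    for i in range(1, m + 1):
        for j in range(m, 0, -1):
            for k in range(1, j + 1):
                count += 1           # one execution of line 004
    return count

# triple_loop_count(10) == 10 * (10 * 11) // 2 == 550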

While-loop Statement In the same way, we have to count the number of iterations. Example: Find the time complexity of
001 while (n > 0) do
002   n ← n/2
003 enddo
Consider the value of n in each loop: after the i-th iteration, the value of n equals n/2^i. Since the number of iterations equals log_2 n, the time complexity is Θ(log_2 n).
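
The iteration count is easy to verify in Python (integer division stands in for n/2):

def halving_steps(n):
    steps = 0
    while n > 0:
        n //= 2          # n <- n/2
        steps += 1
    return steps

# halving_steps(2**k) == k + 1, consistent with Θ(log_2 n)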

While-loop Statement Example: Find the complexity of
001 for i ← 1 step 1 until n-1 do
002   j ← i
003   while j > 0 and A[j] < A[j-1] do
004     sum ← sum + A[j]
005     j ← j - 1
006   enddo
007 enddo
Time complexity = Θ(n) + Σ_{1≤i≤n-1} O(Σ_{1≤j≤i} 1) = O(n^2).

While-loop Statement Definition: an algorithm to compute the greatest common divisor of two integers: Euclid(a,b) { if (b = 0) then return a; else return Euclid(b, a mod b); }. Algorithm Euclid's GCD (assuming b > a):
001 GCD(a,b)
002 begin
003   while (a > 0) do
004     q ← a
005     a ← b mod a
006     b ← q
007   enddo
008   return b
009 end
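
A runnable Python transcription of the iterative listing (each pass performs q ← a; a ← b mod a; b ← q):

def gcd(a, b):
    while a > 0:
        a, b = b % a, a      # lines 004-006 in one step
    return b

# gcd(12, 30) == 6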

Recursive calls It is important to see that the size of the problem should decrease with each recursive call. We can count all instructions as follows: 1. Set the running time complexity to T(n), where n is the size of the problem. 2. The time complexity of the recursive call statement is T(m), where m is the input size and m < n. 3. Let T(n) = T(m) + the time complexity of the other instructions. 4. Express T(n) in terms of an asymptotic complexity function.

Recursive calls Algorithm Selection Sort
001 SelectionSort(A[1..n])
002 begin
003   if (n ≤ 1) then return endif
004   maxindex ← Max(A[1..n])
005   buffer ← A[n]
006   A[n] ← A[maxindex]
007   A[maxindex] ← buffer
008   SelectionSort(A[1..n-1])
009 end
Time complexity T(n) = T(n-1) + Θ(n) = T(n-2) + Θ(n-1) + Θ(n) = ... = Σ_{1≤i≤n} Θ(i) = Θ(n^2).
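
A runnable Python transcription of the recursive selection sort (Max is taken to return the index of the maximum, as lines 004-007 assume):

def selection_sort(A, n=None):
    if n is None:
        n = len(A)
    if n <= 1:
        return A                                   # line 003
    maxindex = max(range(n), key=A.__getitem__)    # line 004: Θ(n) scan
    A[n - 1], A[maxindex] = A[maxindex], A[n - 1]  # lines 005-007: swap
    return selection_sort(A, n - 1)                # line 008: T(n-1)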

Recursive calls Algorithm Binary Search
001 BinarySearch(A[1..n], x)
002 begin
003   if (n < 1) then return -1 endif
004   mid ← (1 + n)/2
005   if (x = A[mid]) then return mid endif
006   if (x < A[mid]) then return BinarySearch(A[1..mid-1], x)
007   else return BinarySearch(A[mid+1..n], x) endif
008 end
Time complexity T(n) ≤ T(n/2) + Θ(1) ≤ T(n/2^2) + Θ(1) + Θ(1) ≤ ... ≤ T(n/2^k) + Σ_{1≤i≤k} Θ(1) ≤ Θ(1) + Σ_{1≤i≤log n} Θ(1) = Θ(log_2 n) = O(log_2 n).
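
A runnable Python transcription (0-based, with explicit low/high bounds in place of the A[1..n] subarray notation; A must be sorted):

def binary_search(A, x, low=0, high=None):
    if high is None:
        high = len(A) - 1
    if low > high:
        return -1                                 # empty range: not found
    mid = (low + high) // 2
    if x == A[mid]:
        return mid
    if x < A[mid]:
        return binary_search(A, x, low, mid - 1)  # T(n/2)
    return binary_search(A, x, mid + 1, high)     # T(n/2)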

Recursive calls The recursive calls in a given algorithm can be represented by a recursion tree. For example, suppose the running time of a given algorithm is T(n) = 2T(n/2) + Θ(n) for n > 1, and T(1) = Θ(1). Level 0 of the tree costs Θ(n); level 1 costs 2·Θ(n/2) = Θ(n); level 2 costs 4·Θ(n/4) = Θ(n); and the bottom level, at depth lg n, costs 2^{lg n}·Θ(n/2^{lg n}) = Θ(n). Summing the lg n levels gives T(n) = Θ(n log n).
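
Merge sort is the textbook algorithm with exactly this recurrence; a minimal Python sketch (two half-size recursive calls plus a Θ(n) merge):

def merge_sort(A):
    if len(A) <= 1:
        return A                                  # T(1) = Θ(1)
    mid = len(A) // 2
    left = merge_sort(A[:mid])                    # T(n/2)
    right = merge_sort(A[mid:])                   # T(n/2)
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):       # Θ(n) merge of the halves
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]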

Recursive calls T(n) = T(n/2) + T(n/3) for n > 1, and T(1) = Θ(1). In the recursion tree, the root splits into subproblems of sizes n/2 and n/3, those into n/4, n/6, n/6, n/9, and so on. Each node costs Θ(1), the depth is at most log n, and the number of nodes is at most Σ_{0≤i≤log n} 2^i, so T(n) = O(n).

Recursive calls T(n) = T(n/a) + T(n/b) for n > 1 (a > 1 and b > 1), and T(1) = Θ(1). The same recursion tree argument applies: each node costs Θ(1), the depth is at most log n, and the number of nodes is at most Σ_{0≤i≤log n} 2^i, so T(n) = O(n).

Recursive calls T(n) = T(n/a) + T(n/b) + Θ(n) for n > 1 (a > 1 and b > 1), and T(1) = Θ(1). In the recursion tree, the root costs Θ(n), level 1 costs Θ(n/a) + Θ(n/b) = Θ((1/a + 1/b)·n), and in general level i costs Θ((1/a + 1/b)^i·n). Summing over the at most log n levels, T(n) = Θ(Σ_{0≤i≤log n} (1/a + 1/b)^i·n); when 1/a + 1/b < 1 the geometric series is dominated by the root, giving T(n) = Θ(n).

Master method T(n) = a·T(n/b) + f(n), where a ≥ 1 and b > 1. In the recursion tree, level 0 costs f(n), level 1 costs a·f(n/b), level 2 costs a^2·f(n/b^2), and in general level i costs a^i·f(n/b^i); the bottom level, at depth log_b n, costs a^{log_b n}·f(1) = Θ(n^{log_b a}). Hence T(n) = Θ(n^{log_b a}) + Σ_{0≤i<log_b n} a^i·f(n/b^i).

Master method T(n) = a·T(n/b) + f(n), where a ≥ 1 and b > 1. The complexity depends on the three quantities a, b, and f(n):
Case 1: f(n) = O(n^{(log_b a) - ε}) for some fixed ε > 0. The growth rate of f(n) is less than that of n^{log_b a}, so the leaves dominate and the total time complexity is T(n) = Θ(n^{log_b a}).
Case 2: f(n) = Θ(n^{log_b a}). Every level of the tree costs about the same, and it is clear that T(n) = Θ(n^{log_b a}·log n).
Case 3: f(n) = Ω(n^{(log_b a) + ε}) for some fixed ε > 0, and a·f(n/b) ≤ c·f(n) for some constant c < 1 and all n > n_0. The growth rate of f(n) is more than that of n^{log_b a}, and the regularity condition a·f(n/b) ≤ c·f(n) means the level costs decrease geometrically as the tree deepens, so the root dominates and T(n) = Θ(f(n)).

Master method Example: Find the solution of T(n) = 16T(n/4) + n, T(n) = 27T(n/3) + n^3, and T(n) = 3T(n/4) + n log n. For T(n) = 16T(n/4) + n: since log_4 16 = 2 and f(n) = n = O(n^{2-ε}) (take ε = 1 > 0), case 1 applies and the time complexity is Θ(n^2).

Master method For T(n) = 27T(n/3) + n^3: since log_3 27 = 3 and f(n) = n^3 = Θ(n^3), case 2 applies and it is clear that the time complexity is Θ(n^3 log n).

Master method For T(n) = 3T(n/4) + n log n: since log_4 3 < 0.8, f(n) = n log n = Ω(n^{0.8 + 0.2}) = Ω(n^{(log_4 3) + ε}) for a fixed ε > 0. We also have 3f(n/4) = (3n/4)·log(n/4) ≤ (3/4)·n log n = (3/4)·f(n), so case 3 applies and the time complexity is Θ(n log n).

Amortized analysis

Three methods Aggregate method Accounting method Potential method