Data Structures and Algorithms Autumn 2018-2019
Outline
1. Algorithm Analysis (contd.)
Growth Rates of Some Commonly Occurring Functions

Function       Name          Example
c              Constant      Adding two numbers
log log n      Log-log       Sieve of Eratosthenes (Q. 2.21)
log n          Logarithmic   Searching a balanced binary tree
log^2 n       Log-squared
n              Linear        Searching a list
n log n        n log n       Mergesort, quicksort
n^2            Quadratic     Insertion/selection/bubble sort
n^3            Cubic         Multiplying two n x n matrices (the slow way)
2^n, a^n, n!   Exponential   No. of ways to permute n objects (n!);
                             no. of unique patterns of n bits (2^n)
Some Rules for Combining Big-Ohs

Rule 1: If T_1(n) = O(f(n)) and T_2(n) = O(g(n)), then
  (a) T_1(n) + T_2(n) = O(max(f(n), g(n)));
  (b) T_1(n) * T_2(n) = O(f(n) * g(n)).

Rule 2: If T(n) is a polynomial of degree k, then T(n) = Θ(n^k).
  Example: T(n) = 5n^2 + 20n, whose leading term alone is 5n^2.
    T(400) = 808,000 versus 5 * 400^2 = 800,000; ratio 808000/800000 = 1.01
    T(4000) = 80,080,000 versus 5 * 4000^2 = 80,000,000; ratio = 1.001
  So T(n) ≈ 5n^2: the lower-order term matters less and less as n grows.

Rule 3: log^k n = O(n) for any constant k. Logarithms grow very slowly.
Assumptions of Model

When analysing the running time of an algorithm we assume the following computer model:
- Instructions are executed sequentially
- Any of the fundamental mathematical operations takes a constant single unit of time
- We do not care what the units of time are
- No tricky operations are permitted without paying for them
- Our computer has an infinite amount of memory (no paging worries)
Weaknesses of Model

Weaknesses of this model:
- Not all operations take the same time, e.g., addition vs. multiplication vs. reading from disk
What to Measure

Almost always running time. When we quantify running time, it is as a function of the input size (the number of items the algorithm works on), expressed in Big-Oh notation.

Two types of running time are of interest: worst case and average case.
- Worst-case running time is the time the algorithm takes on the worst possible input.
- Average-case running time is the time the algorithm can be expected to take on an average input.

Average-case is by far the most useful but also by far the most difficult to derive, so we satisfy ourselves with finding a bound on the very worst our algorithm will perform. We ignore the O(n) time taken to read the input.
Our First Algorithm Analysis

Compute S = sum_{i=1}^{n} i^2. We would expect S to be about (1/3)n^3, as given in Lect01, but it is running time we care about here.

    int sum(int n) {
        int partial_sum = 0;          // l. 3
        for (int i = 1; i <= n; i++)  // l. 5
            partial_sum += i*i;       // l. 6
        return partial_sum;           // l. 8
    }
Our First Algorithm (contd.)

Counting operations:
- Lines 3 and 8 are executed once each, so they contribute 2 units.
- Line 5 contributes 1, n + 1 and 2n units respectively for its three parts, for a total of 3n + 2.
- Line 6 contributes 1 multiplication, 1 addition and 1 assignment each time it is executed, for a total of 3n.

The full count of operations is T(n) = 6n + 4. Therefore the asymptotic running time of the algorithm is Θ(n), i.e., both O(n) (since 6n + 4 <= 7n for all n >= 4) and Ω(n) (since 6n + 4 > 6n for all n >= 0).