Faculty of Mathematics, Waterloo, Ontario
Centre for Education in Mathematics and Computing

Grade 11/12 Math Circles
Fall 2014 - Nov. 5
Recurrences, Part 2

Running time of algorithms

In computer science, we care about the running time of algorithms in terms of the size of the input. For example, consider the following code:

    define f1(n):
        for i from 1 to n:
            for j from 1 to n:
                print i*j

How many times does the print statement happen? Exactly $n^2$ times.

Discovering asymptotics

Now consider the following code:

    define f2(n):
        for i from 1 to n:
            for j from 1 to i:
                print i*j

How many times does the print statement happen? Exactly $1 + 2 + \cdots + n = \frac{n(n+1)}{2} = \frac{1}{2}n^2 + \frac{1}{2}n$.
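To verify these counts empirically, here is a short Python sketch (the function names count_f1 and count_f2 are ours, not from the notes); it replaces the print statement with a counter and checks the two formulas:

    # Count how many times the innermost statement runs in f1 and f2.
    def count_f1(n):
        count = 0
        for i in range(1, n + 1):
            for j in range(1, n + 1):
                count += 1              # stands in for the print statement
        return count

    def count_f2(n):
        count = 0
        for i in range(1, n + 1):
            for j in range(1, i + 1):
                count += 1
        return count

    for n in [1, 5, 10, 100]:
        assert count_f1(n) == n**2
        assert count_f2(n) == n * (n + 1) // 2
    print("counts match n^2 and n(n+1)/2")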
But we usually don't care so much about the difference between $n^2$ and $\frac{1}{2}n^2 + \frac{1}{2}n$. That is, we would view both of these algorithms as taking the same amount of time to run, roughly. By "roughly", we mean the running time of both of these functions is $\Theta(n^2)$. But what does this mean?

Formally defining $\Theta$

If we have a function $g(n)$, we can define a set of functions
$$\Theta(g(n)) = \{f(n) : \exists\, c_1 > 0,\ c_2 > 0,\ n_0 > 0 \text{ such that } 0 \le c_1 g(n) \le f(n) \le c_2 g(n) \text{ for all } n \ge n_0\}.$$

What does this actually mean? Let's draw a picture: for all $n \ge n_0$, the graph of $f(n)$ stays sandwiched between the graph of $c_1 g(n)$ below and the graph of $c_2 g(n)$ above.
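You can also draw this picture yourself with a few lines of Python (a sketch assuming matplotlib is available; none of this code is from the original notes), using the running-time function of f2 from above:

    # Plot f(n) sandwiched between c1*g(n) and c2*g(n).
    import matplotlib.pyplot as plt

    c1, c2 = 0.5, 1.0                         # constants chosen in the next section
    ns = range(1, 21)
    f  = [0.5 * n**2 + 0.5 * n for n in ns]   # f(n) = (1/2)n^2 + (1/2)n
    lo = [c1 * n**2 for n in ns]              # lower envelope c1*g(n)
    hi = [c2 * n**2 for n in ns]              # upper envelope c2*g(n)

    plt.plot(ns, f, label="f(n)")
    plt.plot(ns, lo, label="c1*g(n)")
    plt.plot(ns, hi, label="c2*g(n)")
    plt.legend()
    plt.show()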
Using the definition on the two code examples

Recall that we said that both $n^2$ and $\frac{1}{2}n^2 + \frac{1}{2}n$ were $\Theta(n^2)$.

Why is $n^2 \in \Theta(n^2)$? Pick, for example, $c_1 = c_2 = 1$ and $n_0 = 1$. Then clearly $n^2 \le n^2 \le n^2$ for all $n \ge 1$.

What about showing $\frac{1}{2}n^2 + \frac{1}{2}n \in \Theta(n^2)$? This needs more care. We need to find constants $c_1, c_2, n_0 > 0$ such that
$$c_1 n^2 \le \tfrac{1}{2}n^2 + \tfrac{1}{2}n \le c_2 n^2$$
for all values of $n$ that are larger than $n_0$. Let's pick $n_0 = 1$, since we can make it work. We can pick $c_1 = \frac{1}{2}$, which satisfies the left half. Pick $c_2 = 1$, and the right half is satisfied, with a bit of algebra: since $n \cdot n \ge 1 \cdot n$ whenever $n \ge 1$,
$$n^2 = \tfrac{1}{2}n^2 + \tfrac{1}{2}n^2 = \tfrac{1}{2}n^2 + \tfrac{1}{2}(n \cdot n) \ge \tfrac{1}{2}n^2 + \tfrac{1}{2}n.$$

Pulling these apart: O and Ω

We do not need to be as tight as $\Theta$ all the time. Sometimes we care only about upper bounds or lower bounds. Notice that $\Theta$ actually has both of these. Let's pull them apart!
$$O(g(n)) = \{f(n) : \exists\, c > 0,\ n_0 > 0 \text{ such that } 0 \le f(n) \le c\,g(n) \text{ for all } n \ge n_0\}$$
$$\Omega(g(n)) = \{f(n) : \exists\, c > 0,\ n_0 > 0 \text{ such that } 0 \le c\,g(n) \le f(n) \text{ for all } n \ge n_0\}$$

Notice that $f(n) \in \Theta(g(n))$ if and only if $f(n) \in O(g(n))$ and $f(n) \in \Omega(g(n))$.
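To see the constants from this argument in action, here is a quick numeric check in Python (our own sketch; the values $c_1 = \frac{1}{2}$, $c_2 = 1$, $n_0 = 1$ come from the proof above):

    # Check c1*n^2 <= (1/2)n^2 + (1/2)n <= c2*n^2 for all n >= n0 (up to 1000).
    c1, c2, n0 = 0.5, 1.0, 1

    for n in range(n0, 1001):
        f = 0.5 * n**2 + 0.5 * n
        assert c1 * n**2 <= f <= c2 * n**2, n
    print("sandwich holds for 1 <= n <= 1000")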
Recursion in programming languages

How does this connect to the lecture from last week? We may have recursive functions! A simple example:

    define r1(n):
        if n == 0:
            print *
        else:
            print *
            r1(n-1)

How many stars get printed?

Computing the running time of recursive algorithms

Let's suppose there are $T(n)$ stars that get printed. Then we know:
$$T(0) = 1,\qquad T(n) = T(n-1) + 1.$$

You should know how to solve this recurrence, especially if we change it into the form
$$a_0 = 1,\qquad a_n = a_{n-1} + 1.$$

But, as we shall see, we don't need to find an exact solution if we are looking only for the $\Theta$ bounds.
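Translating r1 into Python and counting stars instead of printing them (a sketch of ours, not code from the notes) confirms the closed form $T(n) = n + 1$:

    # Count the stars r1 would print instead of actually printing them.
    def r1_stars(n):
        if n == 0:
            return 1                    # one star in the base case
        return 1 + r1_stars(n - 1)      # one star, then recurse

    for n in [0, 1, 5, 10]:
        assert r1_stars(n) == n + 1     # solves T(0) = 1, T(n) = T(n-1) + 1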
Another example:

    define mergesort(A, s, f):
        if s < f:
            mid = (s+f)/2
            t1 = mergesort(A, s, mid)
            t2 = mergesort(A, mid+1, f)
            return merge(t1, t2)
        else:
            return [A[s]]

Here, the algorithm merge merges two sorted lists into one sorted list.

Let's trace this algorithm on A = [7,1,8,3,6,4,5,2].
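Since merge is left as a black box above, here is a self-contained Python version you can actually run on that array (our own sketch, not the trace from class):

    def merge(left, right):
        # Merge two sorted lists into one sorted list.
        out, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                out.append(left[i])
                i += 1
            else:
                out.append(right[j])
                j += 1
        return out + left[i:] + right[j:]

    def mergesort(A, s, f):
        if s < f:
            mid = (s + f) // 2              # integer midpoint
            t1 = mergesort(A, s, mid)       # sort the left half
            t2 = mergesort(A, mid + 1, f)   # sort the right half
            return merge(t1, t2)
        return [A[s]]                       # one element is already sorted

    A = [7, 1, 8, 3, 6, 4, 5, 2]
    print(mergesort(A, 0, len(A) - 1))      # [1, 2, 3, 4, 5, 6, 7, 8]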
The Master Theorem

Let's try to find bounds more generally.

The Master Theorem. Let $a \ge 1$ and $b > 1$ be constants, let $f(n)$ be a function, and let $T(n)$ be a recurrence defined on the non-negative integers as
$$T(n) = a\,T(n/b) + f(n).$$
Then $T(n)$ can be bounded asymptotically as follows:

1. If $f(n) \in O(n^{\log_b a - \epsilon})$ for some constant $\epsilon > 0$, then $T(n) = \Theta(n^{\log_b a})$.
2. If $f(n) \in \Theta(n^{\log_b a})$, then $T(n) = \Theta(n^{\log_b a} \log n)$.
3. If $f(n) \in \Omega(n^{\log_b a + \epsilon})$ for some constant $\epsilon > 0$, and if $a f(n/b) \le c f(n)$ for some constant $c < 1$, then $T(n) = \Theta(f(n))$.

What these cases mean

Observations: we compare $f(n)$ to $n^{\log_b a}$.

- If $f(n)$ is smaller, then the bound is $\Theta(n^{\log_b a})$.
- If $f(n)$ is larger, then the bound is $\Theta(f(n))$.
- If they are asymptotically equal, then we add a logarithmic term.
- The $\epsilon$ term is effectively saying that $f(n)$ must be polynomially slower/faster; in other words, $n^{t-\epsilon} = \frac{n^t}{n^\epsilon}$.
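When $f(n)$ is a simple power $n^d$, the three cases boil down to comparing $d$ with $\log_b a$. The following Python helper (entirely our own sketch, handling only $f(n) = n^d$; the case 3 regularity condition holds automatically here, because $a/b^d < 1$ whenever $d > \log_b a$) mechanizes that comparison:

    import math

    def master(a, b, d):
        # Solve T(n) = a*T(n/b) + n^d by the Master Theorem.
        crit = math.log(a, b)               # the critical exponent log_b(a)
        if math.isclose(d, crit):           # case 2: f matches the critical power
            return f"Theta(n^{d} log n)"
        if d < crit:                        # case 1: f is polynomially smaller
            return f"Theta(n^{crit:g})"
        return f"Theta(n^{d})"              # case 3: f is polynomially larger

    print(master(2, 2, 1))   # mergesort:     Theta(n^1 log n)
    print(master(1, 2, 0))   # binary search: Theta(n^0 log n) = Theta(log n)
    print(master(9, 3, 1))   # last example:  Theta(n^2)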
Using the Master Theorem

Let's use it to solve our mergesort recurrence, which was:
$$T(n) = 2T(n/2) + n.$$

Here $a = 2$, $b = 2$ and $f(n) = n$. Notice that $n^{\log_b a} = n^1 = n$. Which case applies: is $n$ in $\Omega(n)$, $O(n)$, or $\Theta(n)$? Clearly $n \in \Theta(n)$. So the second case applies, and we have $T(n) \in \Theta(n \log n)$.

Binary Search

Suppose that we have a sorted array. We can find an element k using:

    define binsearch(A, k, s, f):
        if s > f:
            return "Not found"
        else:
            mid = (s+f)/2
            if A[mid] == k:
                return "Found"
            if A[mid] > k:
                return binsearch(A, k, s, mid-1)
            if A[mid] < k:
                return binsearch(A, k, mid+1, f)
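Here is the same search in runnable Python (our sketch; the depth parameter is an addition so we can watch the number of recursive calls grow like $\log_2 n$):

    def binsearch(A, k, s, f, depth=0):
        # Search the sorted array A[s..f] for k; also report recursion depth.
        if s > f:
            return ("Not found", depth)
        mid = (s + f) // 2
        if A[mid] == k:
            return ("Found", depth)
        if A[mid] > k:
            return binsearch(A, k, s, mid - 1, depth + 1)
        return binsearch(A, k, mid + 1, f, depth + 1)

    for n in [10, 1000, 100000]:
        A = list(range(n))
        result, depth = binsearch(A, -1, 0, n - 1)   # worst case: k absent
        print(n, result, depth)   # depth is about log2(n), matching Theta(log n)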
Recurrence for binsearch

$$T(1) = 1,\qquad T(n) = T(n/2) + 1.$$

Here $a = 1$, $b = 2$ and $f(n) = 1$. Notice that $1 \in \Theta(n^{\log_2 1}) = \Theta(n^0) = \Theta(1)$. Thus the second case applies and $T(n) \in \Theta(n^0 \log n) = \Theta(\log n)$.

One last example

Suppose we have $T(n) = 9T(n/3) + n$. Then $a = 9$, $b = 3$ and $f(n) = n$, with $n^{\log_b a} = n^{\log_3 9} \in \Theta(n^2)$. Since $f(n) \in O(n^{\log_3 9 - 1})$, we apply case 1 and conclude that $T(n) = \Theta(n^2)$.
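To convince yourself of that last answer numerically, you can evaluate the recurrence exactly at powers of 3 and watch $T(n)/n^2$ settle toward a constant (a sketch of ours; the base case $T(1) = 1$ is an assumption, since the notes do not specify one for this recurrence):

    def T(n):
        # Exact value of T(n) = 9*T(n/3) + n for n a power of 3.
        if n == 1:
            return 1                 # assumed base case
        return 9 * T(n // 3) + n

    for k in range(1, 8):
        n = 3**k
        print(n, T(n) / n**2)        # ratio approaches 3/2, consistent with Theta(n^2)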