The Growth of Functions and Big-O Notation

Big-O Notation

Big-O notation allows us to describe the asymptotic growth of a function without concern for i) constant multiplicative factors, and ii) lower-order additive terms. For example, using big-O notation, the function f(n) = 3n^2 + 6n + 7 is assumed to have the same kind of (quadratic) growth as g(n) = n^2.

Why do we choose to ignore constant factors and lower-order additive terms? One kind of function that we often consider throughout this course is T(n), which represents the worst-case number of steps required by an algorithm to process an input of size n. The function T(n) will vary depending on the computing paradigm that is used to represent the algorithm. For example, one paradigm might represent the algorithm as a C program, while another might represent it as a sequence of random-access machine instructions. Now if T_1(n) measures the number of algorithmic steps for the first paradigm, and T_2(n) measures the number of steps for the second, then, assuming that neither paradigm includes any unnecessary overhead, these two functions will likely be within multiplicative constant factors of one another. In other words, there will exist two constants C_1 and C_2 for which

    C_1 T_2(n) ≤ T_1(n) ≤ C_2 T_2(n).

For this reason, big-O notation allows one to describe the steps of an algorithm in a mostly paradigm-independent manner, yet still give meaningful representations of T(n), since the paradigm-dependent constant factors are ignored.

Let f(n) and g(n) be functions from the set of nonnegative integers to the set of nonnegative real numbers. Then

Big-O: f(n) = O(g(n)) iff there exist constants C > 0 and k ≥ 0 such that f(n) ≤ C g(n) for every n ≥ k.

Big-Ω: f(n) = Ω(g(n)) iff there exist constants C > 0 and k ≥ 0 such that f(n) ≥ C g(n) for every n ≥ k.

Big-Θ: f(n) = Θ(g(n)) iff f(n) = O(g(n)) and f(n) = Ω(g(n)).

little-o: f(n) = o(g(n)) iff lim_{n→∞} f(n)/g(n) = 0.
little-ω: f(n) = ω(g(n)) iff lim_{n→∞} f(n)/g(n) = ∞.

Note that a more succinct way of saying "property P(n) is true for all n ≥ k, for some constant k" is to say "property P(n) holds for sufficiently large n". Although this phrase will be used often throughout the course, when establishing a big-O relationship between two functions the student should nevertheless make the effort to provide the value of k for which the inequality is true.

Given functions f(n) and g(n), to determine the big-O relationship between f and g means establishing which, if any, of the above growth relationships apply to f and g. Note that, if more than one of the above relations is true, then we choose the one that gives the most information. For example, if f(n) = o(g(n)) and f(n) = O(g(n)), then we would simply write f(n) = o(g(n)), since the former relation implies the latter.

Example 1. Determine the big-O relationship between i) f(n) = 6n^2 + 2n + 5 and g(n) = 50n^2, and ii) the same f(n) and g(n) = n^3.

Example 1 Solution. We have

    6n^2 + 2n + 5 ≤ 6n^2 + 2n^2 + 5n^2 = 13n^2 ≤ 50n^2 = g(n),

and so f(n) = O(g(n)) via C = 1 and k = 1. Similarly,

    g(n) = 50n^2 = (50/6) · 6n^2 ≤ (50/6) f(n),

which implies g(n) = O(f(n)) via C = 25/3 and k = 1. Therefore, f(n) = Θ(g(n)) is the strongest possible statement.

In the case g(n) = n^3, we have

    lim_{n→∞} f(n)/g(n) = lim_{n→∞} (6n^2 + 2n + 5)/n^3 = lim_{n→∞} (6/n + 2/n^2 + 5/n^3) = 0,

implying that f(n) = o(g(n)). Moreover, it is an exercise to show that, whenever f(n) = o(g(n)), then g(n) ≠ O(f(n)). Therefore, f(n) = o(g(n)) is the strongest possible statement.
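A quick numeric check of the witness constants can guard against arithmetic slips. The following sketch (our own illustration, not part of the notes) verifies the claimed inequalities from Example 1 over a range of n:

```python
# Sanity-check the witness constants from Example 1:
#   f(n) = 6n^2 + 2n + 5 <= 1 * g(n)      for n >= 1  (so f = O(g))
#   g(n) = 50n^2         <= (25/3) * f(n)  for n >= 1  (so g = O(f))

def f(n):
    return 6*n**2 + 2*n + 5

def g(n):
    return 50*n**2

for n in range(1, 10_000):
    assert f(n) <= 13 * n**2      # intermediate bound: 6n^2 + 2n^2 + 5n^2
    assert f(n) <= 1 * g(n)       # C = 1, k = 1
    assert g(n) <= (25/3) * f(n)  # C = 25/3, k = 1

print("all inequalities hold for n = 1..9999")
```

Of course, a finite check can never prove an asymptotic claim; it only confirms that the chosen constants behave as expected on the range tested.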
Basic Results

In this section we provide some basic results that allow one to determine the big-O growth of a function, or the big-O relationship between two functions, without having to revert to the definitions.

Theorem 1. Let p(n) be a polynomial of degree a and q(n) be a polynomial of degree b. Then

    p(n) = O(q(n)) if and only if a ≤ b
    p(n) = Ω(q(n)) if and only if a ≥ b
    p(n) = Θ(q(n)) if and only if a = b
    p(n) = o(q(n)) if and only if a < b
    p(n) = ω(q(n)) if and only if a > b

Thus, Theorem 1 could have been invoked to prove that f(n) = o(g(n)), where f and g are the functions from part ii of Example 1.

Theorem 2. Let f(n), g(n), h(n), and k(n) be nonnegative functions for sufficiently large n. Then

1. f(n) + g(n) = Θ(max(f(n), g(n)))
2. if f(n) = Θ(h(n)) and g(n) = Θ(k(n)), then (fg)(n) = f(n)g(n) = Θ(h(n)k(n))
3. Transitivity. Let R ∈ {O, o, Θ, Ω, ω} be one of the five big-O relationships. If f(n) = R(g(n)) and g(n) = R(h(n)), then f(n) = R(h(n)). In other words, all five of the big-O relationships are transitive.
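Property 1 of Theorem 2 rests on the two-sided bound max(f, g) ≤ f + g ≤ 2·max(f, g), valid for any nonnegative f and g. A small sketch (with sample functions of our own choosing) shows the ratio trapped in [1, 2]:

```python
import math

# Property 1 of Theorem 2: f(n) + g(n) = Theta(max(f(n), g(n))).
# For nonnegative f, g: max(f,g) <= f + g <= 2*max(f,g), so the
# ratio (f + g)/max(f, g) always lies in [1, 2].

def f(n):
    return n**2

def g(n):
    return n * math.log2(n)

for n in (2, 10, 100, 10_000, 1_000_000):
    ratio = (f(n) + g(n)) / max(f(n), g(n))
    assert 1 <= ratio <= 2
    print(f"n = {n:>9}: (f + g)/max(f, g) = {ratio:.4f}")
```

Here the ratio happens to approach 1 because g(n) = o(f(n)); the constants C_1 = 1 and C_2 = 2 work regardless of which function dominates.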
Example 2. Use the results of Theorems 1 and 2 to give a succinct expression for the big-O growth of (fg)(n) = f(n)g(n), where f(n) = n log(n^4 + 1) + n(log n)^2 and g(n) = n^2 + 2n + 3. Note: by "succinct" we mean that no constants or lower-order additive terms should appear in the answer.

Example 2 Solution. By property 2 of Theorem 2, it suffices to find succinct big-O expressions for the growth of both f and g, and then multiply those expressions to obtain a succinct expression for (fg)(n). Theorem 1 tells us that g(n) = Θ(n^2). As for the growth of f, notice that

    n log(n^4 + 1) ≈ 4n log n = o(n(log n)^2),

where we are using the logarithm property log_a x^b = b log_a x. Thus, by property 1 of Theorem 2, f(n) = Θ(n log^2 n). (Note: log^2 n means take the logarithm of n and then square the answer; it is a neater way of writing (log n)^2.) Therefore,

    (fg)(n) = Θ((n log^2 n) · n^2) = Θ(n^3 log^2 n).

Theorem 3. If lim_{n→∞} f(n)/g(n) = C, for some constant C > 0, then f(n) = Θ(g(n)).

Proof of Theorem 3. Mathematically,

    lim_{n→∞} f(n)/g(n) = C

means that, for every ε > 0, there exists k ≥ 0, such that

    |f(n)/g(n) − C| < ε,

for all n ≥ k. In words, f(n)/g(n) can be made arbitrarily close to C with increasing values of n. Removing the absolute value yields

    C − ε < f(n)/g(n) < C + ε,

which implies

    (C − ε)g(n) < f(n) < (C + ε)g(n).

Since C > 0 and ε > 0 are constants, the latter inequalities imply f(n) = Θ(g(n)) so long as C − ε > 0. Therefore, choosing ε = C/2, the result is proven.
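To make the proof of Theorem 3 concrete, take f(n) = 6n^2 + 2n + 5 and g(n) = n^2 from Example 1, so C = 6. With the proof's choice ε = C/2 = 3, the sandwich 3g(n) < f(n) < 9g(n) should hold for sufficiently large n; a sketch (the cutoff k = 2 below was found by inspection, not taken from the notes):

```python
# Theorem 3 with f(n) = 6n^2 + 2n + 5, g(n) = n^2: f(n)/g(n) -> C = 6.
# Choosing eps = C/2 = 3 gives (C - eps)g(n) < f(n) < (C + eps)g(n),
# i.e. 3g(n) < f(n) < 9g(n), for sufficiently large n (here n >= 2).

def f(n):
    return 6*n**2 + 2*n + 5

def g(n):
    return n**2

assert not f(1) < 9*g(1)          # n = 1 fails the upper bound: f(1) = 13 > 9
for n in range(2, 100_000):
    assert 3*g(n) < f(n) < 9*g(n)
print("3*g(n) < f(n) < 9*g(n) for n = 2..99999")
```

The failure at n = 1 illustrates why the definition of Θ only demands the inequality for n ≥ k, not for all n.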
Example 3. Suppose a > 1 and b ≠ 0 are constants, with |b| < a. Prove that a^n + b^n = Θ(a^n).

Example 3 Solution. We have

    lim_{n→∞} (a^n + b^n)/a^n = lim_{n→∞} (1 + b^n/a^n) = lim_{n→∞} (1 + (b/a)^n) = 1,

since |b| < a. Therefore, by Theorem 3, a^n + b^n = Θ(a^n).

L'Hospital's Rule. Suppose f(n) and g(n) are both differentiable functions with either

1. lim_{n→∞} f(n) = lim_{n→∞} g(n) = ∞, or
2. lim_{n→∞} f(n) = lim_{n→∞} g(n) = 0.

Then

    lim_{n→∞} f(n)/g(n) = lim_{n→∞} f′(n)/g′(n).

Example 4. Prove that for every ε > 0, log n = o(n^ε).

Example 4 Solution. We have

    lim_{n→∞} log n / n^ε = lim_{n→∞} (log n)′ / (n^ε)′ = lim_{n→∞} (1/(n ln 2)) / (ε n^{ε−1}) = (1/(ε ln 2)) lim_{n→∞} 1/n^ε = 0,

and the statement is proved.
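Example 4 is a statement where intuition can mislead: for small ε the ratio log n / n^ε first grows before it decays, so a numeric check needs very large n. A sketch with ε = 0.1 (the sample values of n are our own choice):

```python
import math

# Example 4 empirically: log n / n^eps -> 0 for every eps > 0.
# With eps = 0.1 the ratio peaks near n = e^10 (about 22026) and only
# then decays, which is why the L'Hospital argument is the honest proof.

eps = 0.1
ns = [10**3, 10**6, 10**12, 10**24, 10**48]
ratios = [math.log2(n) / n**eps for n in ns]
for n, r in zip(ns, ratios):
    print(f"n = 10^{len(str(n)) - 1:>2}: log2(n)/n^0.1 = {r:.4f}")

# past the peak, the ratio is strictly decreasing toward 0
assert ratios[1] > ratios[2] > ratios[3] > ratios[4]
assert ratios[4] < 0.01
```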
The following terminology is often used to describe the big-O growth of a function.

    Growth                                   Terminology
    Θ(1)                                     constant growth
    Θ(log n)                                 logarithmic growth
    Θ(log^k n), for some k ≥ 1               polylogarithmic growth
    Θ(n^k), for some positive k < 1          sublinear growth
    Θ(n)                                     linear growth
    Θ(n^2)                                   quadratic growth
    Θ(n^3)                                   cubic growth
    Θ(n log n)                               log-linear growth
    Θ(n log^k n), for some k ≥ 1             polylog-linear growth
    O(n^k), for some k                       polynomial growth
    Ω(n^k), for every k                      superpolynomial growth
    Ω(a^n), for some a > 1                   exponential growth

Example 5. Use the above terminology to describe the growth of the functions from Examples 1 and 2.

Example 5 Solution. Functions f and g from part i of Example 1 both have quadratic growth, while g(n) from part ii has cubic growth. Function f from Example 2 has polylog-linear growth, while g has quadratic growth. Finally, (fg)(n) from Example 2 has polylog-cubic growth.

Theorem 4 (Log Ratio Test). Suppose f and g are positive continuous functions over the interval [1, ∞), and

    lim_{n→∞} log(f(n)/g(n)) = lim_{n→∞} (log f(n) − log g(n)) = L.

1. If L = ∞ then lim_{n→∞} f(n)/g(n) = ∞.
2. If L = −∞ then lim_{n→∞} f(n)/g(n) = 0.
3. If L ∈ (−∞, ∞) is a constant then lim_{n→∞} f(n)/g(n) = 2^L.
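Before the worked example, here is a small numeric illustration of Case 3 of the Log Ratio Test (the functions are our own choices): with f(n) = 4n + 1 and g(n) = n, the difference log f(n) − log g(n) approaches L = 2, and accordingly the ratio f(n)/g(n) approaches 2^L = 4.

```python
import math

# Theorem 4, Case 3: if log2(f(n)) - log2(g(n)) -> L (a constant),
# then f(n)/g(n) -> 2^L.  Here f(n) = 4n + 1, g(n) = n, so L = 2.

def f(n):
    return 4*n + 1

def g(n):
    return n

for n in (10, 1000, 100_000):
    L_n = math.log2(f(n)) - math.log2(g(n))
    print(f"n = {n:>6}: log2(f) - log2(g) = {L_n:.5f}, f/g = {f(n)/g(n):.5f}")

assert abs((math.log2(f(10**6)) - math.log2(g(10**6))) - 2) < 1e-5
assert abs(f(10**6)/g(10**6) - 2**2) < 1e-5
```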
Example 6. Use the log-ratio test to determine the big-O growth of f(n) = n^{log n}.

Example 6 Solution. f(n) has superpolynomial growth, since it is a power-like function whose power (log n) increases as n increases, whereas a polynomial-growth power function has a constant power. Does f have exponential growth? To determine this we consider the ratio n^{log n}/a^n, where a > 1 is arbitrary. Since this ratio is awkward to differentiate, instead of L'Hospital's rule we use the log-ratio test. Indeed,

    lim_{n→∞} log(n^{log n}/a^n) = lim_{n→∞} (log n^{log n} − log a^n) = lim_{n→∞} (log^2 n − n log a) = −∞.

Therefore, by Case 2 of the Log Ratio Test, lim_{n→∞} n^{log n}/a^n = 0, implying that f(n) does not have exponential growth.

Unfortunately, the sum function S(n) for many important series cannot be expressed using a formula. However, the next result shows how we may still determine the big-O growth of S(n), which quite often is our main interest.

Integral Theorem. Let f(x) > 0 be an increasing or decreasing Riemann-integrable function over the interval [1, ∞). Then

    Σ_{i=1}^n f(i) = Θ(∫_1^n f(x) dx),

if f is decreasing. Moreover, the same is true if f is increasing, provided f(n) = O(∫_1^n f(x) dx).

Proof of Integral Theorem. We prove the case when f is decreasing; the case when f is increasing is left as an exercise. The quantity ∫_1^n f(x) dx represents the area under the curve of f(x) from 1 to n. Moreover, for i = 1, ..., n−1, the rectangle R_i whose base is positioned from x = i to x = i + 1, and whose height is f(i + 1), lies under the graph. Therefore,

    Σ_{i=1}^{n−1} Area(R_i) = Σ_{i=2}^{n} f(i) ≤ ∫_1^n f(x) dx.

Adding f(1) to both sides of the last inequality gives

    Σ_{i=1}^n f(i) ≤ ∫_1^n f(x) dx + f(1).
Now, choosing C > 0 so that f(1) ≤ C ∫_1^n f(x) dx (for example, C = f(1)/∫_1^2 f(x) dx works, since the integral only grows with n) gives

    Σ_{i=1}^n f(i) ≤ (1 + C) ∫_1^n f(x) dx,

which proves Σ_{i=1}^n f(i) = O(∫_1^n f(x) dx).

Now, for i = 1, ..., n−1, consider the rectangle R′_i whose base is positioned from x = i to x = i + 1, and whose height is f(i). This rectangle covers all the area under the graph of f from x = i to x = i + 1. Therefore,

    Σ_{i=1}^{n−1} Area(R′_i) = Σ_{i=1}^{n−1} f(i) ≥ ∫_1^n f(x) dx.

Now adding f(n) to the left side of the last inequality gives

    Σ_{i=1}^n f(i) ≥ ∫_1^n f(x) dx,

which proves Σ_{i=1}^n f(i) = Ω(∫_1^n f(x) dx). Therefore,

    Σ_{i=1}^n f(i) = Θ(∫_1^n f(x) dx).

Example 7. Determine the big-O growth of the series 1 + 1/2 + ··· + 1/n.

Example 7 Solution. We need to estimate the growth of the sum Σ_{i=1}^n f(i), where f(i) = 1/i. By the Integral Theorem, this sum grows on the order of

    ∫_1^n (1/x) dx = ln x |_1^n = ln n.

Therefore,

    1 + 1/2 + ··· + 1/n = Θ(log n).
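Example 7's estimate is easy to confirm numerically; the following sketch compares the partial sums H_n = 1 + 1/2 + ··· + 1/n with ln n. (In fact H_n − ln n converges to the Euler-Mascheroni constant, roughly 0.577, which is more than the Θ-bound promises.)

```python
import math

# Example 7: H_n = 1 + 1/2 + ... + 1/n = Theta(log n).
# For n >= 1 the bounds ln n <= H_n <= ln n + 1 hold, matching the
# Integral Theorem's estimate: integral_1^n (1/x) dx = ln n.

def harmonic(n):
    return sum(1.0 / i for i in range(1, n + 1))

for n in (10, 1000, 100_000):
    h = harmonic(n)
    print(f"n = {n:>6}: H_n = {h:9.5f}, ln n = {math.log(n):9.5f}, "
          f"H_n - ln n = {h - math.log(n):.5f}")
    assert math.log(n) <= h <= math.log(n) + 1
```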
Exercises

1. Use the definition of big-Ω to prove that n log n = Ω(n + n log n^2). Provide appropriate C and k constants.
2. Provide the big-O relationship between f(n) = n log n and g(n) = n + n log n^2.
3. Prove that f(n) = O(g(n)) if and only if g(n) = Ω(f(n)).
4. Use the definition of big-Θ to prove that f(n) + g(n) = Θ(max(f(n), g(n))).
5. Prove that (n + a)^b = Θ(n^b), for all real a and b > 0. Explain why Theorem 1 and L'Hospital's rule should be avoided when solving this problem.
6. Prove that if lim_{n→∞} f(n)/g(n) = 0, then f(n) = O(g(n)), but g(n) ≠ O(f(n)).
7. Prove or disprove: 2^{n+1} = O(2^n).
8. Prove or disprove: 2^{2n} = O(2^n).
9. Use any techniques or results from lecture to determine a succinct big-Θ expression for the growth of the function log^50(n) n^2 + log(n^4) n^{2.1} + 1000n^2 + 100000000n.
10. Prove or disprove: if f(n) = O(g(n)), then 2^{f(n)} = O(2^{g(n)}).
11. Prove transitivity of big-O: if f(n) = O(g(n)) and g(n) = O(h(n)), then f(n) = O(h(n)).
12. If g(n) = o(f(n)), then prove that f(n) + g(n) = Θ(f(n)).
13. Use L'Hospital's rule to prove Theorem 1. Hint: assume a and b are nonnegative integers and that a ≥ b.
14. Use L'Hospital's rule to prove that a^n = ω(n^k), for every real a > 1 and integer k.
15. Prove that log_a n = Θ(log_b n) for all a, b > 1.
16. Determine the big-O growth of the function f(n) = 2^{√(2 log n)}. Explain and show work.
17. Prove that n! = ω(2^n).
18. Determine the big-O relationship between 2^{2n} and n!.
19. Determine the big-O growth of the function f(n) = (log n)^{log n}. Explain and show work.
20. Determine the big-O growth of the function f(n) = (√2)^{log n}. Explain and show work.
21. Determine the big-O growth of the function f(n) = (log n)!.
22. Determine the big-O growth of the function f(n) = n^{1/log n}.
23. Suppose g(n) ≥ 1 for all n ≥ 1, and that f(n) ≤ g(n) + L, for some constant L ≥ 0 and all n ≥ 1. Prove that f(n) = O(g(n)).
24. Give an example that shows that the statement of Exercise 23 may not be true if we no longer assume g(n) ≥ 1.
25. Use the Integral Theorem to establish that 1^k + 2^k + ··· + n^k = Θ(n^{k+1}), where k is a positive integer constant.
26. Use the Integral Theorem to prove that log 1 + log 2 + ··· + log n = Θ(n log n).
27. Show that log(n!) = Θ(n log n).
28. Determine the big-O growth of n + n/2 + n/3 + ··· + 1.
Exercise Hints and Solutions

1. Answers may vary. We have, for n ≥ 2,

    n + n log n^2 ≤ n log n + 2n log n = 3n log n.

Thus, n log n ≥ (1/3)(n + n log n^2), for all n ≥ 2. So C = 1/3 and k = 2.

2. From the previous exercise we have f(n) = Ω(g(n)). But also f(n) = O(g(n)) with C = k = 1. Thus, f(n) = Θ(g(n)).

3. Set up the inequality for big-O and divide both sides by C.

4. Two inequalities must be established, and C_1 = 0.5, k_1 = 1, C_2 = 2, k_2 = 1 are adequate constants (why?).

5. Use Theorem 3.

6. Since lim_{n→∞} f(n)/g(n) = 0, we know that f(n) ≤ g(n) for sufficiently large n. Thus, f(n) = O(g(n)). Now suppose it were true that g(n) ≤ C f(n) for some constant C > 0 and n sufficiently large. Then dividing both sides by C g(n) yields f(n)/g(n) ≥ 1/C for sufficiently large n. But since lim_{n→∞} f(n)/g(n) = 0, we know that f(n)/g(n) < 1/C for sufficiently large n, which is a contradiction. Therefore, g(n) ≠ O(f(n)).

7. True, since 2^{n+1} = 2 · 2^n. C = 2, k = 1.

8. False. 2^{2n} = 4^n and

    lim_{n→∞} 2^n/4^n = lim_{n→∞} 1/2^n = 0.

Now use Exercise 6.

9. Θ(n^{2.1} log n).

10. False. Consider f(n) = 2n and g(n) = n.

11. Assume f(n) ≤ C_1 g(n), for all n ≥ k_1, and g(n) ≤ C_2 h(n), for all n ≥ k_2. Therefore, f(n) ≤ C_1 g(n) ≤ C_1 C_2 h(n) for all n ≥ max(k_1, k_2). C = C_1 C_2 and k = max(k_1, k_2).

12. Use Theorem 2 and the fact that g(n) < f(n) for n sufficiently large.
13. Let f(n) be a nonnegative-valued polynomial of degree a, and g(n) be a nonnegative-valued polynomial of degree b, with a ≥ b. Since the k-th derivative (as a function of n) of f(n) is equal to Θ(n^{a−k}) and that of g(n) is equal to Θ(n^{b−k}), it follows that L'Hospital's rule applies to the ratio f^{(k)}(n)/g^{(k)}(n) for all k < b. Moreover, upon applying the rule for the b-th time, the numerator will equal a polynomial of degree a − b, while the denominator will equal a constant. Thus f(n) = Θ(g(n)) if a = b and f(n) = ω(g(n)) if a > b.

14. Since the derivative (as a function of n) of a^n equals (ln a)a^n, it follows that the k-th derivative of a^n divided by the k-th derivative of n^k equals (ln^k a)a^n/k!, which tends to infinity. Therefore, a^n = ω(n^k).

15. By the Change of Base formula, log_a n = log_b n / log_b a, and so log_a n = C log_b n, where C = 1/log_b a. Therefore, log_a n = Θ(log_b n).

16. Sublinear growth. Use the Log-ratio test with g(n) = n.

17. Use the Log-ratio test, and the fact that Σ_{i=1}^n log i = Θ(n log n).

18. Use the Log-ratio test to show that the first function is little-o of the second.

19. Superpolynomial but not exponential growth. Use the Log-ratio test against an arbitrary polynomial function n^k. Then use it again against an arbitrary exponential function a^n, where a > 1.

20. Sublinear. Use logarithm identities.

21. Superpolynomial. First analyze the growth of log(f(n)). Then consider the growth of 2^{log f(n)}, which has the same growth as f(n).

22. Constant growth. Take the logarithm of f(n) and analyze its growth.

23. f(n) ≤ g(n) + L ≤ g(n) + Lg(n) ≤ (1 + L)g(n), for all n ≥ 1, where the second inequality is true since g(n) ≥ 1. C = 1 + L and k = 1. The result still holds so long as g(n) is bounded away from zero.

24. Consider f(n) = 1/n and g(n) = 1/n^2.

25. By the Integral Theorem,

    Σ_{i=1}^n i^k = Θ(∫_1^n x^k dx) = Θ(x^{k+1}/(k+1) |_1^n) = Θ(n^{k+1}).

26. By the Integral Theorem,

    Σ_{i=1}^n ln i = Θ(∫_1^n ln x dx).

Moreover,

    ∫_1^n ln x dx = x ln x |_1^n − ∫_1^n dx = n ln n − n + 1 = Θ(n ln n).

27. Since log(ab) = log a + log b, we have

    log(n!) = log(n(n−1)(n−2) ··· 1) = log n + log(n−1) + log(n−2) + ··· + log 1.

Therefore, from the previous exercise, we have log(n!) = Θ(n log n).

28. We have

    n + n/2 + n/3 + ··· + 1 = n(1 + 1/2 + 1/3 + ··· + 1/n).

Therefore, by Example 7, n + n/2 + n/3 + ··· + 1 = Θ(n log n).
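The bound in Exercise 27 can be spot-checked numerically; the sketch below uses math.lgamma (the natural log of the gamma function, with lgamma(n + 1) = ln(n!)) to avoid computing huge factorials:

```python
import math

# Spot-check Exercise 27: log2(n!) = Theta(n log2 n).
# ln(n!) = lgamma(n + 1), so log2(n!) = lgamma(n + 1) / ln 2.

def log2_factorial(n):
    return math.lgamma(n + 1) / math.log(2)

for n in (10, 100, 10_000):
    ratio = log2_factorial(n) / (n * math.log2(n))
    print(f"n = {n:>6}: log2(n!) / (n log2 n) = {ratio:.4f}")
    assert 0.5 <= ratio <= 1.0  # consistent with Theta(n log n)
```

The upper bound holds because log(n!) ≤ log(n^n) = n log n; the ratio creeping upward toward 1 reflects the lower-order term in log(n!), visible in the solution to Exercise 26.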