Big-oh stuff


You should know this definition by heart and be able to give it, if asked.

Definition. Let f and g both be functions from R+ to R+. Then f is O(g) (pronounced "big-oh") if and only if there exist positive constants C1 and C2 such that f(n) ≤ C1·g(n) whenever n > C2.

This is the definition of what it means to say f is O(g). So, a real proof that f is O(g) requires providing the constants C1 and C2 and proving the inequality above. Furthermore, this definition also applies to functions defined only on the natural numbers, only on the positive integers, etc. A small modification of the definition is made when the functions can take negative values (e.g., f(n) = 3n² − 500). Most of the time you won't be asked to provide the constants, but rather to guess intelligently (and back up your guess if asked) whether a function is big-oh of another function.

Comment 1: In some very mathematically oriented presentations, O(g) is defined to be the set of functions f such that there exist constants C1, C2 ≥ 0 with f(n) ≤ C1·g(n) for all n > C2. Therefore, you may find statements like "f ∈ O(g)" instead of "f is O(g)". Do not be surprised; these mean the same thing.

Comment 2: Often people write "f(n) is O(g(n))" instead of "f is O(g)". Technically, f(n) refers to the value of the function f on input n, rather than the function itself. However, by this point in time, both are acceptable ways of communicating the same point. Just don't be surprised when you see either one, as they mean the same thing.

Finding the positive constants C1 and C2. Now that you know the definition of big-oh, you can see that the following statements are true.

Technique 1: Suppose there exists C ≥ 0 such that

    lim_{n→∞} f(n)/g(n) ≤ C.

Then f is O(g). Furthermore, for any C1 > C, there exists C2 > 0 such that f(n) ≤ C1·g(n) for all n > C2.

Technique 2: Suppose there exists C ≥ 0 such that

    lim_{n→∞} (log2 f(n) − log2 g(n)) ≤ C.

Then f is O(g). Furthermore, for any C1 > C, there exists C2 > 0 such that f(n) ≤ 2^{C1}·g(n) for all n > C2.
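These two techniques lend themselves to a quick numeric sanity check before you commit to constants. Below is a minimal Python sketch (the helper names are my own invention, not part of the notes): it evaluates the Technique 1 ratio at one large n, then spot-checks candidate constants C1 and C2 directly against the definition.

```python
def ratio_at(f, g, n):
    """Evaluate f(n)/g(n) at one large n to estimate lim f(n)/g(n)."""
    return f(n) / g(n)

def witnesses_hold(f, g, C1, C2, upto=10_000):
    """Spot-check the definition: f(n) <= C1*g(n) for all C2 < n <= upto.
    Passing this check is evidence, not a proof."""
    return all(f(n) <= C1 * g(n) for n in range(C2 + 1, upto + 1))

# f(n) = n^2 + 1 versus g(n) = n^2: the ratio tends to 1 from above.
f = lambda n: n**2 + 1
g = lambda n: n**2

print(ratio_at(f, g, 10**6))             # just above 1
print(witnesses_hold(f, g, C1=2, C2=1))  # True: n^2 + 1 <= 2n^2 once n > 1
print(witnesses_hold(f, g, C1=1, C2=1))  # False: must pick C1 strictly above the limit
```

Note that the check only gathers evidence over a finite range of n; a real proof still needs the algebra described below.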

Comments:

1. Note that we require C1 > C. Why do we do this? The reason is that the limit may approach C from above, making it necessary to pick some constant bigger than whatever the limit is. For example, if f(n) = n² + 1 and g(n) = n², then f(n) > g(n) for all n, and yet lim_{n→∞} f(n)/g(n) = 1. If we set C1 = 1, we definitely would not be able to find C2 and prove that f(n) ≤ C1·g(n) for n > C2. On the other hand, for any C1 > 1, we could find a corresponding value for C2. For example, if we pick C1 = 2, then we would need C2 to satisfy n² + 1 ≤ 2n² whenever n > C2. This is equivalent to n² ≥ 1, and so C2 = 1 works.

2. For Technique 1, if lim_{n→∞} f(n)/g(n) = C > 0, then f and g are both big-oh of each other! (Note that saying that the limit is C > 0 means that C is a real number, and so by definition is not ∞.) This situation is described by saying that f is Θ(g) (from which it follows that g is Θ(f)). The precise definition of Θ(g) is the set of functions f such that there exist positive real numbers C1, C2, N such that whenever n > N, C1·g(n) ≤ f(n) ≤ C2·g(n). Thus we may also say f ∈ Θ(g) instead of saying f is Θ(g).

3. For Technique 1, if lim_{n→∞} f(n)/g(n) = 0, then f is O(g), but the converse does not hold: g is not O(f).

4. For Technique 1, if lim_{n→∞} f(n)/g(n) = ∞, then f is not O(g), but g is O(f).

5. Note that for Technique 2, you can use any base you want for the log; this means that you can use ln (the natural log, which is log_e), or log base 2 (written log2), or any base you like.

These two results may help you find the constant C1, or determine that f is not O(g)! However, you will still need to find the constant C2, and this will probably require that you know how to use logarithms, among other things. To use these techniques you should be able to compute limits, something you may not be comfortable with right now. Furthermore, just knowing that the limit exists (which it may not) doesn't make it straightforward to pick the constant.
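Comments 2 through 4 suggest a rough numeric heuristic: evaluate f(n)/g(n) at two well-separated values of n and see whether the ratio is shrinking toward 0, blowing up, or settling at a positive constant. The sketch below is my own illustration, and it assumes the limit actually exists (Example #4 shows that it may not); it is a sanity check, not a proof.

```python
def guess_relationship(f, g, n1=10**3, n2=10**6):
    """Heuristic version of Comments 2-4. Compares f/g at two points:
    still shrinking -> limit looks like 0; still growing -> looks infinite;
    otherwise -> a finite positive limit, i.e. f is Theta(g)."""
    r1, r2 = f(n1) / g(n1), f(n2) / g(n2)
    if r2 < r1 / 10:
        return "f is O(g) but g is not O(f)"                 # Comment 3
    if r2 > r1 * 10:
        return "g is O(f) but f is not O(g)"                 # Comment 4
    return "f is Theta(g): each is big-oh of the other"      # Comment 2

print(guess_relationship(lambda n: n**2 + 1, lambda n: n**2))
print(guess_relationship(lambda n: 100 * n**2, lambda n: n**3 + 3))
print(guess_relationship(lambda n: n**3, lambda n: n**2))
```

The factor-of-10 thresholds are arbitrary; they just distinguish "still moving" from "settled" at these two sample points.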
Example #1: Consider the following pair of functions:

    f(n) = n²
    g(n) = n³

Let's use the techniques we've described to find the constants C1 and C2 to establish that f is O(g). If you compute lim_{n→∞} f(n)/g(n), you will get 0. But setting C1 = 0 won't work. The reason is that the ratio approaches its limit from above. So you need to pick some constant greater than the limit, not equal to the limit. (Conversely, if the ratio approaches its limit from below, you can pick the constant C1 to be that limit, but setting it to be bigger is always safe.)

So pick C1 = 1, and then solve for C2. This one is easy: C2 = 1 works just fine.

Example #2: Here's another example, where it's a bit harder to figure out the answer:

    f(n) = 3^√n
    g(n) = 2^n

For this pair of functions, we wish to determine whether f is O(g) and whether g is O(f).

Part 1: Determining if f is O(g). Trying to figure out whether f is O(g) using lim_{n→∞} f(n)/g(n) gives you something harder to compute. You'd need to use L'Hôpital's rule, but that may be something you are not comfortable with in this context. Let's try the second approach, which is based on logarithms. (Even this you may not be comfortable with!) Let's take logs using base 2. Then we get

    log2 f(n) = log2(3^√n) = √n · log2(3).

Note that 1 < log2(3) < 2, and so log2 f(n) < 2√n. Also,

    log2 g(n) = log2(2^n) = n.

We continue:

    log2 f(n) − log2 g(n) = √n · log2(3) − n < 2√n − n.

Note that when n > 4, √n > 2. Furthermore, when n > 4, it is easy to see that 2√n < n. Hence, for n > 4,

    log2 f(n) − log2 g(n) < 2√n − n < 0.

How do we use this? The analysis given above shows that if log2 f(n) − log2 g(n) ≤ log2 C1 for large enough n, then f is O(g). We have shown that log2 f(n) − log2 g(n) < 0 for n > 4. Therefore, if we can find a constant C1 such that log2 C1 ≥ 0, we will have established that f is O(g). What values of C1 satisfy log2 C1 ≥ 0? Answer: all C1 ≥ 1. Setting C1 = 1 then makes sense. What value would you give for C2? The analysis here shows that C2 = 4 works. Thus we have shown that f is O(g).

Part 2: Determining if g is O(f). We will see if we can find constants C1 and C2 such that log2 g(n) − log2 f(n) < log2 C1 for all n > C2. Recall that 2 > log2(3) > 1, and that if n > 9 then n > 3√n. Therefore, for n > 9,

    log2 g(n) − log2 f(n) = n − √n · log2(3) > n − 2√n > √n.

Since lim_{n→∞} √n = ∞, it follows that

    lim_{n→∞} [log2 g(n) − log2 f(n)] ≥ lim_{n→∞} √n = ∞.

Thus, g is not O(f).

Example #3: Consider the following pair of functions:

    f(n) = 100n²
    g(n) = n³ + 3

From your training, you know that f is O(g) but not vice-versa. We will use the first approach to find the constants C1 and C2 to prove that f is O(g). Using L'Hôpital's Rule (applied twice!), we find that

    lim_{n→∞} f(n)/g(n) = lim_{n→∞} f′(n)/g′(n) = lim_{n→∞} 200n/(3n²) = lim_{n→∞} 200/(6n) = 0.

Hence, we can let C1 be any positive real number. Let's pick C1 = 100 just to make life easy. Now we want to solve for C2. That is, we want to find C2 > 0 such that f(n) ≤ C1·g(n) for all n > C2. Substituting, we see that we want to find C2 such that

    100n² ≤ 100(n³ + 3) = 100n³ + 300 for all n > C2.

It is easy to see that C2 = 1 works. Note that we could have picked C1 = 1 and then solved for C2, but finding C2 would have been a more intricate piece of algebra.

Suppose we had tried to prove that g is O(f). By the same analysis as done above,

    lim_{n→∞} g(n)/f(n) = lim_{n→∞} g′(n)/f′(n) = lim_{n→∞} 3n²/(200n) = lim_{n→∞} 6n/200 = ∞.

(Note that we applied L'Hôpital's Rule twice in this derivation.) Hence, g(n)/f(n) is not bounded from above as n → ∞, and so g is not O(f).

Example #4: Now let's look at a more complicated example. Let g(n) = n, and let f be defined as follows:

    f(n) = 5 if n is even
    f(n) = n if n is odd

Thus, f(1) = 1, f(2) = 5, f(3) = 3, f(4) = 5, f(5) = 5, f(6) = 5, f(7) = 7, f(8) = 5, etc. We want to know if f is O(g). It is challenging to apply

either technique to this pair of functions, since the value of f(n) depends on the parity of n. For example, if we try to apply Technique 1, we find

    f(n)/g(n) = 5/n if n is even
    f(n)/g(n) = 1 if n is odd

In other words, lim_{n→∞} f(n)/g(n) does not exist. When we look at Technique 2, we get

    log2 f(n) = log2(5) if n is even
    log2 f(n) = log2(n) if n is odd

Thus,

    log2 f(n) − log2 g(n) = log2(5) − log2(n) if n is even
    log2 f(n) − log2 g(n) = 0 if n is odd

Hence, once again, the limit does not exist. In other words, these two techniques only work when the limit exists! This does not mean that f is not O(g), but only that a different technique must be used to determine whether f is O(g). However, note that for all n ≥ 5, f(n) ≤ g(n). Hence, even though the limit does not exist, setting C1 = 1 and C2 = 5 allows us to prove that f is O(g).

What about the converse? Is g big-oh of f?

    g(n)/f(n) = n/5 if n is even
    g(n)/f(n) = 1 if n is odd

Since lim_{n→∞, n even} n/5 = ∞, it follows that g is not O(f).

Proving that a function f is O(g). Let f and g be functions from R+ to R+. To prove that f is O(g), you need to do one of the following:

- Find constants C1 and C2 such that f(n) ≤ C1·g(n) whenever n > C2, and prove that the inequality holds. Proving that the inequality holds might in turn require calculus or induction, or at a minimum some algebra, so it's not enough to just write down the constants.

- Use one of the techniques given above (which provide implicit proofs that these constants exist).

Obviously it's easier to use the techniques than to find the constants and prove the inequality holds. Therefore, it's a good idea to learn how to use the techniques. The skills you need to use these techniques are mostly from pre-calculus and calculus, and you are probably rusty. Please practice!

- You will need to be able to compute logarithms using any base.

- You will need to use L'Hôpital's Rule.

- You will need to be able to compute limits.

- You will need to compute derivatives of potentially complicated functions (such as n^{3n} or (1 + n)^n).

Practice questions. Try to answer each of the following questions, any of which could appear on the examlet. You can expect questions like these, even if they are not identical.

1. Provide the definition of the set of functions that are O(n²).

2. Provide the definition of the set of functions that are Θ(n²).

3. Provide the constants C1 and C2 proving that 3^n is O(3^n − 2^n).

4. Solve for H(n): 3^{n²−1} = 4^{H(n)}.

5. Compute lim_{n→∞} (ln n)/n.

6. Compute log2(3^n).

7. Compute log3(5n² · 4^n).

8. Determine (no proof requested), for each pair of functions below, whether (a) f is O(g) but not vice-versa, (b) g is O(f) but not vice-versa, (c) both are big-oh of each other, or (d) neither is big-oh of the other. You should only concern yourself with values n ≥ 1.

   - f(n) = n² and g(n) = log(n^n)
   - f(n) = (log n)^n and g(n) = n
   - f(n) = log(n^500) and g(n) = 100
   - f(n) = (log n)^500 and g(n) = 100
   - f(n) = 100 + 3^n and g(n) = 5^n

9. Sally says f is O(g), but Bob notes that f(n) > g(n) for all n, and so says f cannot be O(g). What do you think of this argument? Assuming that f(n) > g(n) is true, can Sally possibly be right? Or is Bob always right?
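As practice with Technique 2, the log computation from Example #2 can be checked numerically. This is a sketch of my own, using the same pair f(n) = 3^√n and g(n) = 2^n from that example: working with base-2 logs sidesteps the astronomically large values of 2^n themselves, and the log difference visibly heads to −∞, confirming that f is O(g).

```python
import math

def log2_f(n):
    """log2 of f(n) = 3**sqrt(n), computed without forming the huge value."""
    return math.sqrt(n) * math.log2(3)

def log2_g(n):
    """log2 of g(n) = 2**n."""
    return float(n)

for n in [16, 100, 10_000]:
    # The difference log2 f(n) - log2 g(n) = sqrt(n)*log2(3) - n
    # becomes increasingly negative, so f is O(g) (Technique 2).
    print(n, log2_f(n) - log2_g(n))
```

Note the contrast with Technique 1: forming f(n)/g(n) directly would require evaluating 2^10000, while the log difference stays a small float.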