Growth of Functions


Subjects to be Learned: big oh, max function, big omega, big theta, little oh, little omega

Introduction

One of the important criteria in evaluating algorithms is the time it takes to complete a job. To have a meaningful comparison of algorithms, the estimate of computation time must be independent of the programming language, compiler, and computer used; must reflect the size of the problem being solved; and must not depend on specific instances of the problem being solved. The quantities most often used for the estimate are the worst-case and average execution times of an algorithm, represented by the number of executions of some key operation needed to perform the required computation. As an example of such an estimate, let us consider the sequential search algorithm.

Example: Sequential Search

Algorithm SeqSearch(L, n, x)
Input: L is an array with n entries indexed 1, ..., n, and x is the key to be searched for in L.
Output: if x is in L, its index; otherwise 0.

    index := 1;
    while ( index ≤ n and L[index] ≠ x )
        index := index + 1;
    if ( index > n ) then index := 0;
    return index.

The worst-case time of this algorithm can be estimated as follows. The key operation is the comparison of keys, that is, comparing L[index] with x; most search algorithms, if not all, need such a comparison. The largest number of executions of this comparison is n, which occurs when x is not in L or when x is at L[n], so that the while loop is executed n times. This quantity n is used as an estimate of the worst-case time of this sequential search algorithm. Note that each iteration of the while loop performs two comparisons and one addition, so 3n could serve as an estimate just as well. Note also that the first line and the last two lines of the algorithm are not counted.
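The pseudocode above maps directly to code. A minimal Python sketch (using 0-based Python lists underneath while reporting the 1-based index the pseudocode uses), with a counter added so the worst-case comparison count of n is visible:

```python
def seq_search(L, x):
    """Return (1-based index of x in L, number of key comparisons); index 0 if absent."""
    n = len(L)
    comparisons = 0
    index = 1
    while index <= n:
        comparisons += 1              # the key operation: compare L[index] with x
        if L[index - 1] == x:         # Python lists are 0-based
            break
        index += 1
    if index > n:
        index = 0
    return index, comparisons

# Worst case: x absent, or x in the last slot -> exactly n comparisons.
print(seq_search([7, 2, 9, 4], 4))   # (4, 4)
print(seq_search([7, 2, 9, 4], 5))   # (0, 4)
```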
There are two reasons for this. First, differences in implementation details such as languages, compilers, and machines make constant factors meaningless. Second, for large values of n the highest-degree term in n dominates the estimate, and since we are mostly interested in the behavior of algorithms for large n, the lower-order terms can be ignored in comparison with the highest term. The concept used to address these issues is called big oh, and it is what we study here.
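Both points can be seen numerically: the counts n and 3n differ only by a constant factor, while even n²/100, despite its small constant, eventually dwarfs both. A quick sketch:

```python
# Compare n, 3n, and n^2/100 as n grows: the constant factor 3 never changes
# the ranking for large n, but the higher-degree term n^2/100 overtakes both.
print(f"{'n':>6} {'3n':>8} {'n^2/100':>10}")
for n in (10, 100, 1_000, 10_000):
    print(f"{n:>6} {3*n:>8} {n*n // 100:>10}")
```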

Big Oh

The following example gives the idea of one function growing more rapidly than another; we will use it to introduce the concept of big oh.

Example: Let f(n) = 100n² and g(n) = n⁴. The following table shows that g(n) grows faster than f(n) when n > 10. We say f is big oh of g.

    n      f(n) = 100n²    g(n) = n⁴
    10     10,000          10,000
    50     250,000         6,250,000
    100    1,000,000       100,000,000
    150    2,250,000       506,250,000

Definition (big oh): Let f and g be functions from the set of integers (or the set of real numbers) to the set of real numbers. Then f(x) is said to be O(g(x)), read "f(x) is big oh of g(x)", if and only if there are constants C and n₀ such that f(x) ≤ C·g(x) whenever x > n₀.

Note that big oh is a binary relation on a set of functions. (What properties does it have? Is it reflexive? Symmetric? Transitive?)

For example, 5x + 10 is big oh of x², because 5x + 10 < 5x² + 10x² = 15x² for x > 1; that is, for C = 15 and n₀ = 1, 5x + 10 ≤ C·x². Similarly, 3x² + 2x + 4 < 9x² for x > 1, so 3x² + 2x + 4 is O(x²). In general, we have the following theorem:

Theorem 1: aₙxⁿ + ... + a₁x + a₀ is O(xⁿ) for any real numbers aₙ, ..., a₀ and any nonnegative integer n.

Note: Let f(x) = 3x² + 2x + 4 and g(x) = x². From the above we have that f(x) is O(g(x)). Also, since x² < 3x² + 2x + 4, g(x) is O(f(x)). In this case we say the two functions are of the same order; see big theta below for more information.

Growth of Combinations of Functions

Big oh has some useful properties, some of which are listed as theorems here. Let us start with the definition of the max function.
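The witnesses named in the examples above (C = 15, n₀ = 1 for 5x + 10 against x², and C = 9, n₀ = 1 for 3x² + 2x + 4) can be spot-checked numerically. A finite sample is not a proof, but it catches wrongly chosen witnesses quickly; a sketch:

```python
def witnesses_hold(f, g, C, n0, upto=10_000):
    """Spot-check f(x) <= C*g(x) for all integers n0 < x <= upto (not a proof)."""
    return all(f(x) <= C * g(x) for x in range(n0 + 1, upto + 1))

# 5x + 10 is O(x^2) with witnesses C = 15, n0 = 1
print(witnesses_hold(lambda x: 5 * x + 10, lambda x: x * x, C=15, n0=1))            # True
# 3x^2 + 2x + 4 is O(x^2) with witnesses C = 9, n0 = 1
print(witnesses_hold(lambda x: 3 * x * x + 2 * x + 4, lambda x: x * x, C=9, n0=1))  # True
```

A failing check, e.g. trying to bound x² by 100x, returns False as soon as some sampled x exceeds the candidate bound.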

Definition (max function): Let f₁(x) and f₂(x) be functions from a set A to a set B of real numbers. Then max(f₁(x), f₂(x)) is the function from A to B whose value at each point x is the larger of f₁(x) and f₂(x).

Theorem 2: If f₁(x) is O(g₁(x)) and f₂(x) is O(g₂(x)), then (f₁ + f₂)(x) is O(max(g₁(x), g₂(x))). From this theorem it follows that if f₁(x) and f₂(x) are both O(g(x)), then (f₁ + f₂)(x) is O(g(x)), and that (f₁ + f₂)(x) is O(max(f₁(x), f₂(x))).

Theorem 3: If f₁(x) is O(g₁(x)) and f₂(x) is O(g₂(x)), then (f₁ · f₂)(x) is O(g₁(x) · g₂(x)).

Big Omega and Big Theta

Big oh concerns the "less than or equal to" relation between functions for large values of the variable. It is also possible to consider the "greater than or equal to" and "equal to" relations in a similar way: big omega is for the former and big theta for the latter.

Definition (big omega): Let f and g be functions from the set of integers (or the set of real numbers) to the set of real numbers. Then f(x) is said to be Ω(g(x)), read "f(x) is big omega of g(x)", if there are constants C > 0 and n₀ such that f(x) ≥ C·g(x) whenever x > n₀.

Definition (big theta): Let f and g be functions from the set of integers (or the set of real numbers) to the set of real numbers. Then f(x) is said to be Θ(g(x)), read "f(x) is big theta of g(x)", if f(x) is O(g(x)) and f(x) is Ω(g(x)). We also say that f(x) is of order g(x). (We saw this under Big Oh when introducing the concept of same order.)

For example, 3x² − 3x − 5 is Ω(x²), because 3x² − 3x − 5 ≥ x² for integers x > 2 (C = 1, n₀ = 2). Since it is also O(x²) by Theorem 1, it is Θ(x²). In general, we have the following theorem:

Theorem 4: aₙxⁿ + ... + a₁x + a₀ is Θ(xⁿ) for any real numbers aₙ, ..., a₀ and any nonnegative integer n.

Little Oh and Little Omega

If f(x) is O(g(x)) but not Θ(g(x)), then f(x) is said to be o(g(x)), read "f(x) is little oh of g(x)". Little omega (ω) is defined similarly with the roles of the two functions reversed. For example, x is o(x²), x² is o(2ˣ), 2ˣ is o(x!), etc.

Calculation of Big Oh

Subjects to be Learned: calculation of big oh relations using limits, L'Hospital's rule

Basic knowledge of limits and derivatives of functions from calculus is needed here. Big oh relationships between functions can be tested using limits of functions, as follows. Let f(x) and g(x) be functions from a set of real numbers to a set of real numbers. Then:

1. If lim_{x→∞} f(x)/g(x) = 0, then f(x) is o(g(x)). Note that if f(x) is o(g(x)), then f(x) is O(g(x)).
2. If lim_{x→∞} f(x)/g(x) = ∞, then g(x) is o(f(x)).
3. If lim_{x→∞} f(x)/g(x) = c for some nonzero constant c, then f(x) is Θ(g(x)).
4. If lim_{x→∞} f(x)/g(x) = c for some constant c (that is, the limit exists), then f(x) is O(g(x)).
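These limit rules can be probed numerically by sampling f(x)/g(x) at increasingly large x. A finite sample never establishes a limit, but it often suggests which rule applies; a sketch:

```python
def ratio_samples(f, g, xs=(1e3, 1e4, 1e5)):
    """Sample f(x)/g(x) at a few large x to suggest the limiting behavior."""
    return [f(x) / g(x) for x in xs]

# Shrinks toward 0: suggests x is o(x^2), hence also O(x^2)   (rule 1)
print(ratio_samples(lambda x: x, lambda x: x * x))
# Approaches the nonzero constant 3: suggests Theta(x^2)      (rule 3)
print(ratio_samples(lambda x: 3 * x * x + 2 * x + 4, lambda x: x * x))
```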

For example, lim_{x→∞} (4x³ + 3x² + 5)/(x⁴ − 3x³ − 5x − 4) = lim_{x→∞} (4/x + 3/x² + 5/x⁴)/(1 − 3/x − 5/x³ − 4/x⁴) = 0, so 4x³ + 3x² + 5 is o(x⁴ − 3x³ − 5x − 4), or equivalently, x⁴ − 3x³ − 5x − 4 is ω(4x³ + 3x² + 5).

Let us see why these rules hold. Here we give a proof for rule 4; the others can be proven similarly.

Proof: Suppose lim_{x→∞} f(x)/g(x) = c. By the definition of limit, for every ε > 0 there is an n₀ such that |f(x)/g(x) − c| < ε whenever x > n₀. In particular, taking ε = 1, there is an n₀ such that f(x)/g(x) < c + 1 whenever x > n₀. Since we are interested in nonnegative functions f and g, this means that f(x) ≤ C·g(x) with C = c + 1 whenever x > n₀, hence f(x) is O(g(x)).

L'Hospital (L'Hôpital)'s Rule

The limit of f(x)/g(x) is not always easy to calculate. Take, for example, x²/3ˣ: both x² and 3ˣ go to ∞ as x goes to ∞, and there is no apparent factor common to both, so the calculation of the limit is not immediate. One tool we may be able to use in such cases is L'Hospital's rule, given as a theorem below.

Theorem 5 (L'Hospital): If lim_{x→∞} f(x) = ∞ and lim_{x→∞} g(x) = ∞, and f(x) and g(x) have first derivatives f′(x) and g′(x) respectively, then lim_{x→∞} f(x)/g(x) = lim_{x→∞} f′(x)/g′(x).

This also holds when lim_{x→∞} f(x) = 0 and lim_{x→∞} g(x) = 0, instead of both limits being ∞. For example, lim_{x→∞} x/eˣ = lim_{x→∞} 1/eˣ = 0, because (eˣ)′ = eˣ, where e is the base of the natural logarithm. Similarly, lim_{x→∞} (ln x)/x = lim_{x→∞} (1/x)/1 = 0. Note that this rule can be applied repeatedly as long as its conditions are satisfied; for example, lim_{x→∞} x²/eˣ = lim_{x→∞} 2x/eˣ = lim_{x→∞} 2/eˣ = 0.

Summary of Big Oh

It is often necessary to compare the orders of commonly used functions, including the following:

    1, log n, n, n log n, n², 2ⁿ, n!, nⁿ

Using the concept of big oh and the calculation methods above, it can be shown that each function in this list is big oh of every function that follows it. The accompanying figure (not reproduced here) displays the graphs of these functions, using a scale for the function values that doubles at each successive marking.
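The ordering of these functions can be made concrete by tabulating a few values (exact integers where possible, base-2 logarithms for the rest); a sketch:

```python
import math

def growth_values(n):
    """Values of 1, log n, n, n log n, n^2, 2^n, n!, n^n at a given n (log base 2)."""
    return [1, math.log2(n), n, n * math.log2(n), n ** 2,
            2 ** n, math.factorial(n), n ** n]

for n in (2, 4, 8, 16):
    print(n, growth_values(n))
```

Already at n = 16 the last three entries separate dramatically (2¹⁶ = 65,536 while 16! exceeds 2 × 10¹³), matching the big-oh ordering above.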
