Comparison of Functions (COMP3600/6466 Algorithms, Lecture 3)


Comparison of functions

Reading: Cormen et al., Section 3 (for slide 7 and later slides).

Transitivity:
- f(n) = Θ(g(n)) and g(n) = Θ(h(n)) imply f(n) = Θ(h(n)).
- f(n) = O(g(n)) and g(n) = O(h(n)) imply f(n) = O(h(n)).
- f(n) = Ω(g(n)) and g(n) = Ω(h(n)) imply f(n) = Ω(h(n)).
- f(n) = o(g(n)) and g(n) = o(h(n)) imply f(n) = o(h(n)).
- f(n) = ω(g(n)) and g(n) = ω(h(n)) imply f(n) = ω(h(n)).

Examples: n = O(n ln n) and n ln n = O(n³) imply n = O(n³). 3n³ + n = Θ(3n³) and 3n³ = Θ(n³) imply 3n³ + n = Θ(n³).

Reflexivity:
- f(n) = Θ(f(n)), f(n) = O(f(n)), f(n) = Ω(f(n)).
- Reflexivity does not hold for o() and ω(). E.g., ln n ≠ o(ln n).

Symmetry:
- f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n)).
- Symmetry does not hold for O(), Ω(), o(), and ω(). For example, ln ln n = O(ln n), but ln n ≠ O(ln ln n).

Transpose symmetry:
- f(n) = O(g(n)) if and only if g(n) = Ω(f(n)).
- f(n) = o(g(n)) if and only if g(n) = ω(f(n)).

Proof for transitivity

Claim: f(n) = O(g(n)) and g(n) = O(h(n)) imply f(n) = O(h(n)).

f(n) = O(g(n)) means: there are constants c1 > 0 and n1 > 0 such that 0 ≤ f(n) ≤ c1 g(n) for all n ≥ n1. Likewise, g(n) = O(h(n)) means: there are constants c2 > 0 and n2 > 0 such that 0 ≤ g(n) ≤ c2 h(n) for all n ≥ n2. Thus 0 ≤ f(n) ≤ c1 g(n) ≤ c1 (c2 h(n)) for all n ≥ max{n1, n2}. Since c = c1 c2 and max{n1, n2} are positive constants, f(n) = O(h(n)) follows by the definition of the O-notation.

Proof for reflexivity

Claim: f(n) = Ω(f(n)).

f(n) = Ω(f(n)) means: there are constants c > 0 and n0 > 0 such that 0 ≤ c f(n) ≤ f(n) for all n ≥ n0. This is true because f(n) ≤ f(n) for all positive integers n, and f(n) is assumed to be asymptotically nonnegative. More precisely, c = 1 and any large enough n0 satisfy the definition, so f(n) = Ω(f(n)) follows.
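For concreteness, here is a small Python spot check of the transitivity argument (not part of the original slides; the sample functions and the witness constants c1 = 9, c2 = 1, n1 = n2 = 1 are my own illustrative choices, and a finite check is of course not a proof):

```python
# Finite spot check of the transitivity argument above (illustrative only):
# with witnesses c1, n1 for f = O(g) and c2, n2 for g = O(h), the combined
# witnesses c1*c2 and max(n1, n2) should also work for f = O(h).
def bounded(f, g, c, n0, n_max=10_000):
    """Check 0 <= f(n) <= c*g(n) for n0 <= n <= n_max (a sample, not a proof)."""
    return all(0 <= f(n) <= c * g(n) for n in range(n0, n_max + 1))

f = lambda n: 3 * n**2 + 5 * n + 1   # sample function with f(n) = O(g(n))
g = lambda n: n**2                   # sample function with g(n) = O(h(n))
h = lambda n: n**3

c1, n1 = 9, 1   # witnesses for f = O(g): 3n^2 + 5n + 1 <= 9n^2 for n >= 1
c2, n2 = 1, 1   # witnesses for g = O(h): n^2 <= n^3 for n >= 1

assert bounded(f, g, c1, n1)
assert bounded(g, h, c2, n2)
assert bounded(f, h, c1 * c2, max(n1, n2))   # combined witnesses from the proof
print("transitivity witnesses hold on the sampled range")
```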

Proof for symmetry

Claim: f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n)).

We show that f(n) = Θ(g(n)) implies g(n) = Θ(f(n)); the reverse direction can be proven similarly. f(n) = Θ(g(n)) means: there are constants c1 > 0, c2 > 0, and n0 > 0 such that 0 ≤ c1 g(n) ≤ f(n) ≤ c2 g(n) for all n ≥ n0. Since c1 g(n) ≤ f(n), we have g(n) ≤ (1/c1) f(n). Similarly, since f(n) ≤ c2 g(n), we have g(n) ≥ (1/c2) f(n). We thus have 0 ≤ (1/c2) f(n) ≤ g(n) ≤ (1/c1) f(n), i.e., there are constants c' = 1/c2 > 0, c'' = 1/c1 > 0, and n0 > 0 such that 0 ≤ c' f(n) ≤ g(n) ≤ c'' f(n) for all n ≥ n0. Thus g(n) = Θ(f(n)), by the definition of the Θ-notation.

Proof for transpose symmetry

Claim: f(n) = o(g(n)) if and only if g(n) = ω(f(n)).

Proof. Suppose that f(n) = o(g(n)). Hence, for any positive constant c, there exists a positive constant n0 such that 0 ≤ f(n) < c g(n) for all n ≥ n0. Now let d be any positive constant. Then there exists a positive constant n0 such that 0 ≤ f(n) < (1/d) g(n) for all n ≥ n0. That is, 0 ≤ d f(n) < g(n) for all n ≥ n0. Hence g(n) = ω(f(n)). The converse is similar. So we can, in fact, define o() using ω(), and conversely.

Standard notations and common functions

Monotonicity: A function f(n) is monotonically increasing if m < n implies f(m) ≤ f(n), and it is strictly increasing if m < n implies f(m) < f(n). Similarly, f(n) is monotonically decreasing if m < n implies f(m) ≥ f(n), and strictly decreasing if m < n implies f(m) > f(n).

Floor and ceiling: For any real x, the floor of x, denoted ⌊x⌋, is the greatest integer less than or equal to x. The ceiling of x, denoted ⌈x⌉, is the least integer greater than or equal to x. We have x - 1 < ⌊x⌋ ≤ x ≤ ⌈x⌉ < x + 1. E.g., ⌊4.9⌋ = 4 and ⌈4.1⌉ = 5.

Polynomials: Given a nonnegative integer d, a polynomial in n is a function of the form p(n) = Σ_{i=0..d} a_i n^i, where each a_i is a constant, called a coefficient of the polynomial. The degree of a polynomial is the highest power that occurs. E.g., p(n) = 3n^7 - 4n^5 + 2.1n - 100 is a polynomial of degree 7.

Exponentials: For all real a > 0 and all m and n,
  (a^m)^n = a^(mn),  a^m a^n = a^(m+n),
  lim_{n→∞} (1 + x/n)^n = e^x,  e^(ln x) = ln(e^x) = x.

Factorials: Recall the definition of n factorial: n! = 1 · 2 · 3 · · · (n-1) · n. Stirling's approximation:
  n! = √(2πn) (n/e)^n (1 + Θ(1/n)),
from which it follows that n! = o(n^n), n! = ω(2^n), and lg(n!) = Θ(n lg n).
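As an illustration that is not part of the slides, the following Python sketch tabulates the ratio of n! to Stirling's approximation for a few arbitrarily chosen values of n; the ratios drift toward 1, in line with the 1 + Θ(1/n) factor above:

```python
# Tabulate n! against Stirling's approximation sqrt(2*pi*n) * (n/e)^n.
# The ratio should drift toward 1 (roughly 1 + 1/(12n)), consistent with
# the 1 + Theta(1/n) correction factor quoted above.
import math

def stirling(n: int) -> float:
    return math.sqrt(2 * math.pi * n) * (n / math.e) ** n

for n in (1, 5, 10, 20, 50):
    print(f"n = {n:2d}: n!/stirling(n) = {math.factorial(n) / stirling(n):.6f}")
```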

Standard notations and common functions (continued)

Logarithms: For all a > 0, b > 0, c > 0, and log bases ≠ 1,
  a = b^(log_b a),  log_b a = (log_c a) / (log_c b).
(The base of the logarithm is not important asymptotically.) Notation: lg n = log_2 n, ln n = log_e n, lg^k n = (lg n)^k, and lg lg n = lg(lg n).

A useful result from calculus (Apostol, Calculus, Volume 1, second edition, page 30): if a > 0 and b > 0, then
  lim_{x→∞} (ln^b x) / x^a = 0   [so ln^b n = o(n^a)]
and
  lim_{x→∞} x^b / e^(ax) = 0   [so n^b = o(e^(an)) and n^b = o(c^n), for c > 1].

Iterated logarithm

The iterated logarithm lg* is a very slowly growing function:
  lg* 2 = 1,  lg* 4 = 2,  lg* 16 = 3,  lg* 65536 = 4,  lg* (2^65536) = 5,  lg* (2^(2^65536)) = 6.
(That is, applying lg six times to 2^(2^65536) brings the value down to 1.)

Define lg^(0) n = n, lg^(1) n = lg n, lg^(2) n = lg lg n, lg^(3) n = lg lg lg n, and in general lg^(k+1) n = lg(lg^(k) n). The iterated logarithm function is
  lg* n = min{ i ≥ 0 : lg^(i) n ≤ 1 }.

Example: Determine the value of lg* 65536. Now lg* 65536 is the smallest integer i such that lg^(i) 65536 ≤ 1. We have
  lg^(0) 65536 = 65536 > 1,
  lg^(1) 65536 = lg 65536 = 16 > 1,
  lg^(2) 65536 = lg(lg^(1) 65536) = lg 16 = 4 > 1,
  lg^(3) 65536 = lg(lg^(2) 65536) = lg 4 = 2 > 1,
  lg^(4) 65536 = lg(lg^(3) 65536) = lg 2 = 1 ≤ 1.
It follows that lg* 65536 = 4.

Comparison of growth rates

Common functions fit into a hierarchy in which the growth rate increases from left to right (so, e.g., n lg n = O(n^(3/2))):
  1, ..., lg* n, ..., lg lg n, ..., lg n, ..., lg^2 n, ..., √n, ..., n, ..., n lg n, ..., n^(3/2), ..., n^2, ..., n^3, ..., 2^n, ..., e^n, ..., 3^n, ..., n!, ..., n^n, ...

Most of these facts we accept (and remember!) without proof; others we will prove:
- We will use (without proof) that exponential functions grow faster than powers.
- We do not prove that logarithmic functions grow slower than powers.
- We do, however, often show simpler relationships, for example, that higher powers grow faster than lower powers.
- We will prove results like n lg n = o(n^(3/2)), using the fact that logarithms grow slower than powers.
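As a side illustration (not from the lecture), lg* can be computed directly from this definition by repeated application of lg; the test values below mirror the ones quoted above:

```python
# Compute lg*(n) = min{ i >= 0 : lg^(i)(n) <= 1 } directly from the definition,
# by repeatedly applying lg = log2 until the value drops to 1 or below.
import math

def lg_star(n) -> int:
    i = 0
    while n > 1:
        n = math.log2(n)   # math.log2 also accepts very large Python ints
        i += 1
    return i

for x in (2, 4, 16, 65536, 2**65536):
    print(lg_star(x), end=" ")   # expected: 1 2 3 4 5, matching the table above
print()
```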

Working with asymptotic notations

To simplify a function using asymptotic notation, take the fastest-growing additive term and ignore its constant coefficient. Examples:
  3n^2 + 5n + 1 = Θ(n^2)
  n + 4(n + lg n) n^2 = Θ(n^3)
  4n + (ln n)/n + 1 = Θ(n)
  n^n + O(n^3) = Θ(n^n)
  6 ln n + n^100 = Θ(n^100)
  5n^2 + Θ(n) = Θ(n^2)   (careful!)

Further asymptotic properties
- O(g1(n) + g2(n)) = O(max{g1(n), g2(n)}), e.g., O(n^5 + log n) = O(n^5).
- O(g1(n)) + O(g2(n)) = O(max{g1(n), g2(n)}), e.g., O(n^3) + O(n^2) = O(n^3).
- O(g1(n)) · O(g2(n)) = O(g1(n) g2(n)), e.g., O(n) · O(log n) = O(n log n).
- O(g1(n)) / Ω(g2(n)) = O(g1(n)/g2(n)), e.g., O(n^2)/Ω(n) = O(n). (g2(n) assumed asymptotically positive.)
- Ω(g1(n)) / O(g2(n)) = Ω(g1(n)/g2(n)), e.g., Ω(n^2)/O(n) = Ω(n). (g2(n) assumed asymptotically positive.)

Exercise: Try proving some of the above.
Exercise: Show that O(n^2) = O(n) is not valid.

Sample proof

Claim: O(g1(n)) / Ω(g2(n)) = O(g1(n)/g2(n)), where g2(n) is assumed asymptotically positive.

Let f1(n) = O(g1(n)), which means: there are constants n1 > 0 and c1 > 0 such that 0 ≤ f1(n) ≤ c1 g1(n) for n ≥ n1. Let f2(n) = Ω(g2(n)), which means: there are constants n2 > 0 and c2 > 0 such that 0 ≤ c2 g2(n) ≤ f2(n) for n ≥ n2. We obtain
  0 ≤ f1(n)/f2(n) ≤ c1 g1(n)/f2(n) ≤ c1 g1(n)/(c2 g2(n)) = (c1/c2) · g1(n)/g2(n),
for n ≥ max(n1, n2). Since c = c1/c2 and n0 = max(n1, n2) are positive constants, the statement follows.

Solution to Exercise

Exercise: Show that O(n^2) = O(n) is not valid.

If O(n^2) = O(n) were true, then, for any choice of f(n) from the set O(n^2), there would be a function g(n) in O(n) such that f(n) = g(n). In other words, O(n^2) would be a subset of O(n). (Here, it is precise to use "subset", because on both sides of the equation we have well-defined sets.) However, f(n) = n^2 is a member of O(n^2), but it is not a member of O(n). Thus O(n^2) is not a subset of O(n), that is, O(n^2) ≠ O(n). (Note that there are many other functions in O(n^2) that are not in O(n).)
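To make the sample proof concrete, here is a minimal Python spot check of the division rule (the functions f1, f2 and the witnesses c1, c2 are my own illustrative choices, not from the slides):

```python
# Finite spot check of the division rule O(g1)/Omega(g2) = O(g1/g2):
# if f1 <= c1*g1 and f2 >= c2*g2 > 0, then f1/f2 <= (c1/c2) * g1/g2.
f1 = lambda n: 2 * n**2      # sample member of O(n^2), witness c1 = 2
f2 = lambda n: n / 2 + 1     # sample member of Omega(n), witness c2 = 1/2
g1 = lambda n: n**2
g2 = lambda n: n

c1, c2 = 2, 0.5
ok = all(f1(n) / f2(n) <= (c1 / c2) * g1(n) / g2(n) for n in range(1, 10_001))
print("f1/f2 bounded by (c1/c2) * g1/g2 on the sampled range:", ok)
```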

Solution to Exercise

Exercise: Prove that n + n · O(ln n) = O(n ln n).

We have to prove that, for any choice of f(n) from the set O(ln n), there is a function g(n) in O(n ln n) such that n + n f(n) = g(n); that is, n + n f(n) is an element of O(n ln n).

Let f(n) be an arbitrary element of O(ln n). This means that there is a constant c > 0 such that 0 ≤ f(n) ≤ c ln n, and so 0 ≤ n f(n) ≤ c n ln n, for all sufficiently large values of n. Thus
  0 ≤ n + n f(n) ≤ n + c n ln n ≤ n ln n + c n ln n ≤ (c + 1) n ln n,
for sufficiently large n. By the definition of O, this means that n + n f(n) = O(n ln n), as required.

Solution to Exercise

Exercise: Prove that n^2 ≠ ω(n^2).

Suppose that n^2 = ω(n^2). By the definition of the ω-notation, for any constant c > 0 we have 0 ≤ c n^2 < n^2 for sufficiently large n. (*) However, for c = 1, this gives the false statement that 0 ≤ n^2 < n^2 for sufficiently large n. That is, statement (*) is not true for all c > 0, which is a contradiction. Thus the assumption that n^2 = ω(n^2) is false.
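As an optional numeric sanity check (an illustrative sketch, not part of the solution), the bound from the first solution can be verified on a sample range by fixing a representative f(n) in O(ln n):

```python
# Spot check of the derived bound n + n*f(n) <= (c + 1) * n * ln(n) for
# sufficiently large n, with f(n) = 2*ln(n) as a representative member of
# O(ln n) (so c = 2); "sufficiently large" here means n >= 3, where ln n >= 1.
import math

c = 2
f = lambda n: c * math.log(n)
ok = all(n + n * f(n) <= (c + 1) * n * math.log(n) for n in range(3, 10_001))
print("bound holds for 3 <= n <= 10000:", ok)
```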