
Chapter 3. Growth of Functions

Outline
- Study the asymptotic efficiency of algorithms
- Give several standard methods for simplifying the asymptotic analysis of algorithms
- Present several notational conventions used throughout this book
- Review the behavior of functions that commonly arise in the analysis of algorithms

Asymptotic Notation (1)
- Asymptotic efficiency of algorithms: we are concerned with how the running time of an algorithm increases with the size of the input in the limit, as the size of the input increases without bound.
- The notations we use to describe the asymptotic running time of an algorithm are defined in terms of functions whose domain is the set of natural numbers N = {0, 1, 2, ...}, e.g. T(n).
- It is important to understand the precise meaning of the notation, so that when it is sometimes abused, it is not misused.

Θ-notation
- Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }
- For a given function g(n), Θ(g(n)) is a set of functions. Abusing notation, we write f(n) = Θ(g(n)) instead of f(n) ∈ Θ(g(n)) to indicate that f(n) is a member of Θ(g(n)).
- Θ gives an asymptotically tight bound for sufficiently large n. Figure 3.1(a) gives an intuitive picture of f(n) = Θ(g(n)): for all n ≥ n0, the function f(n) is equal to g(n) to within a constant factor. We say g(n) is an asymptotically tight bound for f(n).

Θ-notation (continued)
- We introduced an informal notion of Θ-notation: throwing away low-order terms and ignoring the coefficient of the highest-order term. We can justify this intuition formally, e.g. (1/2)n² − 3n = Θ(n²).
- To do so, we must determine positive constants c1, c2, and n0 such that c1·n² ≤ (1/2)n² − 3n ≤ c2·n² for all n ≥ n0.
- Dividing through by n² gives c1 ≤ 1/2 − 3/n ≤ c2. The right-hand inequality holds for any n ≥ 1 with c2 = 1/2; the left-hand inequality holds for any n ≥ 7 with c1 = 1/14. Thus we may choose c1 = 1/14, c2 = 1/2, and n0 = 7.
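The constants derived above can be sanity-checked numerically. The sketch below (a spot-check over a finite range, not a proof) verifies that c1 = 1/14, c2 = 1/2, n0 = 7 witness (1/2)n² − 3n = Θ(n²); the helper name `theta_witness_holds` is illustrative, not from the text.

```python
# Spot-check (not a proof) of the constants derived above:
# for f(n) = n^2/2 - 3n, the claim f(n) = Theta(n^2) holds with
# c1 = 1/14, c2 = 1/2, n0 = 7.

def f(n):
    return n * n / 2 - 3 * n

def theta_witness_holds(n, c1=1/14, c2=1/2):
    """Does 0 <= c1*g(n) <= f(n) <= c2*g(n) hold for g(n) = n^2?"""
    g = n * n
    return 0 <= c1 * g <= f(n) <= c2 * g

assert all(theta_witness_holds(n) for n in range(7, 10_000))
# Below the threshold the bound fails, e.g. at n = 1 where f(1) < 0.
assert not theta_witness_holds(1)
```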

Continued
- Intuitively, the lower-order terms of an asymptotically positive function can be ignored in determining asymptotically tight bounds, because they are insignificant for large n.
- The coefficient of the highest-order term can likewise be ignored, since it only changes c1 and c2 by a constant factor equal to the coefficient.
- Example: for constants a, b, c with a > 0, an² + bn + c = Θ(n²); witnesses are c1 = a/4, c2 = 7a/4, and n0 = 2·max(|b|/a, √(|c|/a)).
- In general, for any polynomial p(n) = Σ_{i=0}^{d} a_i n^i with a_d > 0, we have p(n) = Θ(n^d). In particular, any constant function is Θ(n^0) = Θ(1).

Asymptotic Notation (2)
- The Θ-notation asymptotically bounds a function from above and below.
- When we have only an asymptotic upper bound, we use O-notation: it gives an upper bound on a function to within a constant factor (Fig. 3.1(b)).
- We write f(n) = O(g(n)) to indicate that f(n) is a member of the set O(g(n)). Note that f(n) = Θ(g(n)) implies f(n) = O(g(n)).

O-notation (1)
- O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }
- g(n) is an asymptotic upper bound for f(n).
- Example: 2n² = O(n³), with c = 1 and n0 = 2. Also, 2n² = O(n²), with c = 2 and n0 = 0.
- Examples of functions in O(n²): n², n² + n, n² + 1000n, 1000n² + 1000n; also n, n/1000, n^1.9999, n²/lg lg lg n.
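The witnesses in the examples above can be spot-checked the same way. This sketch (a finite scan, not a proof; the helper `is_bounded` is illustrative) confirms the pairs (c, n0) given for 2n² = O(n³) and 2n² = O(n²):

```python
# Spot-check (not a proof) of the O-notation witnesses above:
# 2n^2 = O(n^3) with c = 1, n0 = 2, and 2n^2 = O(n^2) with c = 2.

def is_bounded(f, g, c, n0, upto=10_000):
    """Does 0 <= f(n) <= c*g(n) hold for all sampled n in [n0, upto)?"""
    return all(0 <= f(n) <= c * g(n) for n in range(n0, upto))

assert is_bounded(lambda n: 2 * n * n, lambda n: n ** 3, c=1, n0=2)
assert is_bounded(lambda n: 2 * n * n, lambda n: n * n, c=2, n0=1)
# A single failed candidate pair proves nothing by itself: c = 1 does not
# work for 2n^2 = O(n^2), yet c = 2 does, so the membership still holds.
assert not is_bounded(lambda n: 2 * n * n, lambda n: n * n, c=1, n0=1)
```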

O-notation (2)
- In the literature, O-notation is sometimes used informally to describe asymptotically tight bounds; however, distinguishing asymptotic upper bounds from asymptotically tight bounds has now become standard.
- Since O-notation describes an upper bound, when we use it to bound the worst-case running time of an algorithm, we have a bound on the running time of the algorithm on every input. Θ-notation cannot guarantee this. (Note, e.g., that n = O(n²).)
- When we say "the running time is O(n²)", we mean that there is a function f(n) that is O(n²) such that for any value of n, no matter what particular input of size n is chosen, the running time on that input is bounded from above by the value f(n).

Asymptotic Notation (3)
- Just as O-notation provides an asymptotic upper bound on a function, Ω-notation provides an asymptotic lower bound. The intuition behind Ω-notation is shown in Fig. 3.1(c).
- When we have only an asymptotic lower bound, we use Ω-notation.
- Example: for a > 0, an² + bn + c = O(n²) and an² + bn + c = Ω(n²).
- Theorem: f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)).

Ω-notation (1)
- Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }
- g(n) is an asymptotic lower bound for f(n).
- Example: n = Ω(lg n), with c = 1 and n0 = 16.
- Examples of functions in Ω(n²): n², n² + n, n² − n, 1000n² + 1000n, 1000n² − 1000n; also n³, n^2.0001, n² lg lg lg n, ...
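A spot-check (not a proof) of two Ω memberships from the lists above. The witness c = 1, n0 = 16 for n = Ω(lg n) is from the text; the witness c = 1/2, n0 = 2 for n² − n = Ω(n²) is an assumed choice for illustration, as is the helper name `omega_holds`:

```python
import math

# Spot-check (not a proof): n = Omega(lg n) with c = 1, n0 = 16 (from the
# text), and n^2 - n = Omega(n^2) with c = 1/2, n0 = 2 (assumed witness).

def omega_holds(f, g, c, n0, upto=10_000):
    """Does 0 <= c*g(n) <= f(n) hold for all sampled n in [n0, upto)?"""
    return all(0 <= c * g(n) <= f(n) for n in range(n0, upto))

assert omega_holds(lambda n: n, math.log2, c=1, n0=16)
assert omega_holds(lambda n: n * n - n, lambda n: n * n, c=0.5, n0=2)
```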

Ω-notation (2)
- Since Ω-notation describes a lower bound, when we use it to bound the best-case running time of an algorithm, by implication we also bound the running time of the algorithm on arbitrary inputs. (E.g., the running time of insertion sort is Ω(n).)
- For insertion sort, the running time falls between Ω(n) and O(n²); moreover, these bounds are asymptotically as tight as possible.
- When we say that the running time of an algorithm is Ω(g(n)), we mean that no matter what particular input of size n is chosen for each value of n, the running time on that input is at least a constant times g(n), for sufficiently large n.

Asymptotic notation in equations and inequalities
- Example: 2n² + 3n + 1 = 2n² + Θ(n), where Θ(n) stands for some anonymous function f(n) in the set Θ(n) (here, f(n) = 3n + 1).
- Such equations can be chained: 2n² + Θ(n) = Θ(n²).
- Interpretation: no matter how the anonymous functions are chosen on the left of the equal sign, there is a way to choose the anonymous functions on the right of the equal sign to make the equation valid.
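The anonymous Θ(n) in 2n² + 3n + 1 = 2n² + Θ(n) stands for the concrete remainder 3n + 1, and c1 = 3, c2 = 4, n0 = 1 are one assumed choice of witnesses (not given in the text) showing that remainder is indeed Θ(n):

```python
# The anonymous Theta(n) above stands for the remainder 3n + 1.
# Spot-check (not a proof) that 3n + 1 = Theta(n), with assumed
# witnesses c1 = 3, c2 = 4, n0 = 1.

def anonymous_part(n):
    # what is left of 2n^2 + 3n + 1 after peeling off the 2n^2 term
    return (2 * n * n + 3 * n + 1) - 2 * n * n

assert all(3 * n <= anonymous_part(n) <= 4 * n for n in range(1, 10_000))
```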

Asymptotic Notation (4)
- The upper bound given by O-notation may or may not be asymptotically tight. We use o-notation to denote an upper bound that is not asymptotically tight.
- o(g(n)) = { f(n) : for every constant c > 0, there exists a constant n0 > 0 such that 0 ≤ f(n) < c·g(n) for all n ≥ n0 }
- Example: 2n = o(n²), but 2n² ≠ o(n²). Also n^1.9999 = o(n²) and n²/lg n = o(n²), whereas n² ≠ o(n²) and n²/1000 ≠ o(n²).
- Contrast with O: f(n) = O(g(n)) requires 0 ≤ f(n) ≤ c·g(n) for some constant c > 0, while f(n) = o(g(n)) requires 0 ≤ f(n) < c·g(n) to hold for every constant c > 0.
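The key difference is the quantifier over c. The sketch below (illustrative, not a proof; `find_threshold` and the sampled values of c are assumptions of this example) shows that for 2n = o(n²) a threshold n0 exists for every c we try, while for 2n² vs n² it already fails at c = 1:

```python
# o-notation quantifies over EVERY c > 0: for each c there must be a
# threshold n0 past which f(n) < c*g(n).

def find_threshold(f, g, c, limit=10 ** 6):
    """First n < limit with f(n) < c*g(n), else None. For these monotone
    examples the inequality persists past the returned threshold."""
    for n in range(1, limit):
        if f(n) < c * g(n):
            return n
    return None

# 2n = o(n^2): a threshold exists for every sampled c (roughly n0 > 2/c).
for c in (1.0, 0.1, 0.01):
    assert find_threshold(lambda n: 2 * n, lambda n: n * n, c) is not None

# 2n^2 != o(n^2): already fails for c = 1, since 2n^2 < n^2 never holds.
assert find_threshold(lambda n: 2 * n * n, lambda n: n * n, 1.0,
                      limit=10_000) is None
```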

o-notation
- Intuitively, in o-notation the function f(n) becomes insignificant relative to g(n) as n approaches infinity: lim_{n→∞} f(n)/g(n) = 0 (assuming g(n) ≠ 0 for sufficiently large n).

Asymptotic Notation (5)
- We use ω-notation to denote a lower bound that is not asymptotically tight. By analogy, ω-notation is to Ω-notation as o-notation is to O-notation.
- Definition: ω(g(n)) = { f(n) : for every constant c > 0, there exists a constant n0 > 0 such that 0 ≤ c·g(n) < f(n) for all n ≥ n0 }
- Equivalently, f(n) ∈ ω(g(n)) if and only if g(n) ∈ o(f(n)), i.e. lim_{n→∞} f(n)/g(n) = ∞ (if the limit exists, and g(n) ≠ 0 for sufficiently large n).
- Example: n²/2 = ω(n), but n²/2 ≠ ω(n²).
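The two limit characterizations can be sampled numerically. This is a heuristic sketch only (a single large n suggests, but does not prove, the limiting behavior; the helper `ratio` and the sampled point are assumptions of this example):

```python
# Sampling the limit characterizations: f/g near 0 suggests f = o(g),
# f/g growing without bound suggests f = omega(g). Heuristic, not proof.

def ratio(f, g, n):
    return f(n) / g(n)

# 2n = o(n^2): the ratio shrinks toward 0.
assert ratio(lambda n: 2 * n, lambda n: n * n, 10 ** 6) < 1e-5

# n^2/2 = omega(n): the ratio grows without bound.
assert ratio(lambda n: n * n / 2, lambda n: n, 10 ** 6) > 1e5
```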

Growth of functions
- Asymptotic notation is a way to describe the behavior of functions in the limit: asymptotic efficiency.
- It lets us focus on what's important by abstracting away low-order terms and constant factors, and gives us a way to indicate the running times of algorithms.
- It also gives a way to compare the "sizes" of functions: O ≈ ≤, Ω ≈ ≥, Θ ≈ =, o ≈ <, ω ≈ >.

Comparisons of functions
Related properties:
- Transitivity: f(n) = Θ(g(n)) and g(n) = Θ(h(n)) imply f(n) = Θ(h(n)). The same holds for O, Ω, o, and ω.
- Reflexivity: f(n) = Θ(f(n)). The same holds for O and Ω.
- Symmetry: f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n)).
- Transpose symmetry: f(n) = O(g(n)) if and only if g(n) = Ω(f(n)); f(n) = o(g(n)) if and only if g(n) = ω(f(n)).
Comparisons:
- f(n) is asymptotically smaller than g(n) if f(n) = o(g(n)).
- f(n) is asymptotically larger than g(n) if f(n) = ω(g(n)).

Standard notations and common functions (1)
- Monotonicity:
  - f(n) is monotonically increasing if m ≤ n implies f(m) ≤ f(n).
  - f(n) is monotonically decreasing if m ≤ n implies f(m) ≥ f(n).
  - f(n) is strictly increasing if m < n implies f(m) < f(n).
  - f(n) is strictly decreasing if m < n implies f(m) > f(n).
- Floors and ceilings: x − 1 < ⌊x⌋ ≤ x ≤ ⌈x⌉ < x + 1
- Modular arithmetic: a mod n = a − ⌊a/n⌋·n
- Polynomials: p(n) = Σ_{i=0}^{d} a_i n^i with a_d ≠ 0 is a polynomial in n of degree d.
- Exponentials: for a > 0, a^0 = 1, a^1 = a, a^(−1) = 1/a, (a^m)^n = a^(mn), a^m · a^n = a^(m+n).
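The floor/ceiling inequalities and the modular-arithmetic identity above can be scanned over sample values (a spot-check, not a proof; the sample points are arbitrary choices for illustration):

```python
import math

# Spot-check of x - 1 < floor(x) <= x <= ceil(x) < x + 1, and of
# a mod n = a - floor(a/n)*n (which matches Python's % for n > 0).

for x in (0.0, 0.5, 1.0, 2.75, -1.3, 1e6 + 0.1):
    assert x - 1 < math.floor(x) <= x <= math.ceil(x) < x + 1

for a in range(-20, 20):
    for n in range(1, 10):
        assert a % n == a - math.floor(a / n) * n
```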

Standard notations and common functions (2)
- Exponentials: any exponential function with a base strictly greater than 1 grows faster than any polynomial function: lim_{n→∞} n^b / a^n = 0 for all a > 1, hence n^b = o(a^n).
- e^x = 1 + x + x²/2! + x³/3! + ... = Σ_{i=0}^{∞} x^i/i!; in particular e^x ≥ 1 + x, with equality only when x = 0.
- Logarithms: any positive polynomial function grows faster than any polylogarithmic function: lim_{n→∞} (lg^b n) / n^a = 0 for all a > 0, hence lg^b n = o(n^a).
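Both limits can be illustrated at a single large n (a numeric illustration, not a proof; the exponents b and a and the sample points are arbitrary choices for this sketch):

```python
import math

# Illustration of n^b / a^n -> 0 for a > 1, and lg^b n / n^a -> 0
# for a > 0. Single large samples only; not a proof.

n = 200
assert n ** 10 / 2 ** n < 1e-20          # polynomial vs exponential (b=10, a=2)

# The polylog limit converges much more slowly, so a far larger n is needed.
n = 10 ** 16
assert math.log2(n) ** 3 / n ** 0.5 < 1e-2   # polylog vs polynomial (b=3, a=1/2)
```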

Standard notations and common functions (3)
- Factorials: n! = 1 · 2 · 3 ··· n. We have n! = o(n^n), n! = ω(2^n), and lg(n!) = Θ(n lg n).
- Function iteration: f^(0)(n) = n, and f^(i)(n) = f(f^(i−1)(n)) for i > 0.
- Iterated logarithm: lg* n = min { i ≥ 0 : lg^(i) n ≤ 1 }. The iterated logarithm is a very slowly growing function.
- Fibonacci numbers: F_0 = 0, F_1 = 1, and F_i = F_(i−1) + F_(i−2) for i ≥ 2.
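The two recursive definitions above translate directly into code. A minimal sketch (function names `lg_star` and `fib` are illustrative, not from the text):

```python
import math

# Direct implementations of the iterated logarithm lg* and the
# Fibonacci recurrence as defined above.

def lg_star(n):
    """min { i >= 0 : lg^(i) n <= 1 }, by repeatedly applying lg."""
    i = 0
    while n > 1:
        n = math.log2(n)
        i += 1
    return i

def fib(k):
    """F_0 = 0, F_1 = 1, F_i = F_(i-1) + F_(i-2) for i >= 2."""
    a, b = 0, 1
    for _ in range(k):
        a, b = b, a + b
    return a

# lg* grows extremely slowly: lg* 2 = 1, lg* 16 = 3, lg* 65536 = 4.
assert lg_star(2) == 1 and lg_star(16) == 3 and lg_star(2 ** 16) == 4
assert [fib(i) for i in range(8)] == [0, 1, 1, 2, 3, 5, 8, 13]
```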

Homework: Exercises 3.1-1, 3.1-7, 3.2-5; Problem 3-3 (*)