Big O (Asymptotic Upper Bound)


Big O (Asymptotic Upper Bound)

Linear search takes O(n) time. Binary search takes O(lg n) time (lg means log base 2). Bubble sort takes O(n²) time.

n² + 2n + 1 ∈ O(n²), but n² + 2n + 1 ∉ O(n).

Definition: f(n) ∈ O(g(n)) iff there exist real c > 0 and natural n₀ such that for all natural n ≥ n₀, 0 ≤ f(n) ≤ c·g(n). O(g(n)) is the set of such functions.

Help! What is it saying?! Let me tell some revisionist history...
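The definition can be spot-checked numerically. A minimal sketch, not a proof: the witnesses c = 4 and n₀ = 1 are my own choices, not from the slide.

```python
# Spot-check the Big O definition for f(n) = n^2 + 2n + 1 and g(n) = n^2,
# using the hypothetical witnesses c = 4, n0 = 1: verify
# 0 <= f(n) <= c*g(n) for every n in a finite range n >= n0.

def f(n):
    return n * n + 2 * n + 1

def g(n):
    return n * n

c, n0 = 4, 1
ok = all(0 <= f(n) <= c * g(n) for n in range(n0, 10_000))
print(ok)  # True
```

A finite check like this can only fail to find a counterexample; the "for all n ≥ n₀" part still needs the algebra.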

How much time does this take?

    e = false;
    for (i = 0; i < n; i++) {
        for (j = i+1; j < n; j++) {
            if (a[i] == a[j]) { e = true; }
        }
    }

                   Platform A           Platform B
    read i, j, n   2 ns                 1 ns
    read a[i]      3 ns                 5 ns
    write i, j     3 ns                 1 ns
    arithmetic     1 ns                 1 ns
    branch         1 ns if continue,    0 ns if continue,
                   2 ns if exit         2 ns if exit
    loop back      1 ns                 1 ns
    Total          12n² + 14n + 10      10n² + 5n + 6
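Rather than timing, one can count operations. A sketch of mine in Python: count the equality comparisons the nested loop performs. The count is n(n−1)/2 on any platform, which is why both totals are quadratic polynomials; only the constants differ.

```python
# Count the a[i] == a[j] comparisons in the nested loop: the count is
# n*(n-1)/2 regardless of per-operation costs.

def count_comparisons(n):
    count = 0
    for i in range(n):
        for j in range(i + 1, n):
            count += 1          # one comparison per inner iteration
    return count

for n in (0, 1, 5, 10):
    assert count_comparisons(n) == n * (n - 1) // 2
print(count_comparisons(10))  # 45
```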

How to be both Coarse and Rigorous?

12n² + 14n + 10 versus 10n² + 5n + 6: how do we unify them and ignore machine differences? What is common about them? Both are quadratic polynomials. The poster boy for quadratic polynomials is n². How do we rigorously say that the functions above are "like n²"? And that cubic polynomials are "like n³"? And that 4n·lg(n) + 2n + 10 is "like n·lg(n)"? Etc., etc.

How to be both Coarse and Rigorous!

Watch this mathemagic. For all natural n ≥ 10:

    12n² + 14n + 10 ≤ 12n² + 14n² + 10 = 26n² + 10    (since n ≤ n²)
    26n² + 10 ≤ 26n² + n                              (since 10 ≤ n)
    26n² + n ≤ 26n² + n² = 27n²                       (since n ≤ n²)

Therefore: there exists natural n₀ such that for all natural n ≥ n₀, 0 ≤ 12n² + 14n + 10 ≤ 27n². In other words: there exist real c > 0 and natural n₀ such that for all natural n ≥ n₀, 0 ≤ 12n² + 14n + 10 ≤ c·n².
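A finite sanity check of the final inequality, with the slide's witnesses c = 27 and n₀ = 10 (a spot-check, not a proof; the proof is the algebra on the slide):

```python
# Check 0 <= 12n^2 + 14n + 10 <= 27n^2 for every n from 10 up to a
# large finite bound, matching the witnesses c = 27, n0 = 10.
ok = all(0 <= 12*n*n + 14*n + 10 <= 27*n*n for n in range(10, 100_000))
print(ok)  # True
```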

The Rise of Big O

There exist real c > 0 and natural n₀ such that for all natural n ≥ n₀, 0 ≤ 12n² + 14n + 10 ≤ c·n². That is:

    12n² + 14n + 10 ∈ O(n²)
    10n² + 5n + 6 ∈ O(n²)

Ways to say it: "My algorithm takes time in O(n²)." "My algorithm takes O(n²) time." "My algorithm takes time on the order of n²." Similarly, 4n·lg(n) + 2n + 10 ∈ O(n·lg(n)).

The definition looks scarily sophisticated because it has to drop some information and carefully preserve the rest. It drops only: the leading coefficient, small-n scenarios, and minor terms.

More or Less

But wait, these are also true: n ∈ O(n²), 3 ∈ O(n²). O(n²) includes quadratic functions as well as lesser functions. We need another definition to exclude lesser functions.

n ∈ O(n²) because the definition only requires "there exists... for all... n ≤ c·n²". It's only a one-sided inequality. The definition on the next slide uses a two-sided inequality to rule this out.

Big Θ (Asymptotic Tight Bound)

Θ is the Greek capital theta. Definition: f(n) ∈ Θ(g(n)) iff there exist real b > 0, real c > 0, and natural n₀ such that for all natural n ≥ n₀, 0 ≤ b·g(n) ≤ f(n) ≤ c·g(n). Θ(g(n)) is the set of such functions.

Example: 12n² + 14n + 10 ∈ Θ(n²), because for all n ≥ 10, 1·n² ≤ 12n² + 14n + 10 ≤ 27n².

Example: n ∉ Θ(n²), because (sketch) you can't make b·n² ≤ n work for all large n.
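The two-sided inequality, spot-checked on a finite range with the slide's witnesses b = 1, c = 27, n₀ = 10 (again a sanity check, not a proof):

```python
# Spot-check the Theta inequality b*g(n) <= f(n) <= c*g(n) for
# f(n) = 12n^2 + 14n + 10 and g(n) = n^2, witnesses b = 1, c = 27, n0 = 10.
b, c, n0 = 1, 27, 10
ok = all(b*n*n <= 12*n*n + 14*n + 10 <= c*n*n for n in range(n0, 100_000))
print(ok)  # True
```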

Big Ω (Asymptotic Lower Bound): Big O Flipped

Ω is the Greek capital omega. Definition: f(n) ∈ Ω(g(n)) iff there exist real b > 0 and natural n₀ such that for all natural n ≥ n₀, 0 ≤ b·g(n) ≤ f(n). Ω(g(n)) is the set of such functions.

Equivalently: f(n) ∈ Ω(g(n)) iff g(n) ∈ O(f(n)).

Example: 12n² + 14n + 10 ∈ Ω(n²). Example: n² ∈ Ω(n). Example: n ∉ Ω(n²).
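The duality, spot-checked numerically on a finite range (witnesses b = 1 and c = 1 are my own choices):

```python
# n^2 in Omega(n) with witness b = 1, and equivalently n in O(n^2)
# with witness c = 1, both from n0 = 1: the same inequality read
# from the two sides.
ok_omega = all(0 <= 1 * n <= n * n for n in range(1, 10_000))
ok_bigo  = all(0 <= n <= 1 * n * n for n in range(1, 10_000))
print(ok_omega and ok_bigo)  # True
```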

Asymptotics of Several Variables

What if your function takes multiple parameters? I'll illustrate with 2 parameters.

f(m, n) ∈ O(g(m, n)) iff there exist real c > 0 and naturals m₀, n₀ such that for all natural m ≥ m₀ and n ≥ n₀, 0 ≤ f(m, n) ≤ c·g(m, n).

f(m, n) ∈ Θ(g(m, n)) iff there exist real b > 0, real c > 0, and naturals m₀, n₀ such that for all natural m ≥ m₀ and n ≥ n₀, 0 ≤ b·g(m, n) ≤ f(m, n) ≤ c·g(m, n).

f(m, n) ∈ Ω(g(m, n)) iff there exist real b > 0 and naturals m₀, n₀ such that for all natural m ≥ m₀ and n ≥ n₀, 0 ≤ b·g(m, n) ≤ f(m, n).

Interlude: Functions and Lambda

"f(n) ∈ O(g(n))", "n ∈ O(n²)" is standard but lousy notation: f(n) is not a function, f is. Clever people are not confused by this (they invented it themselves). Enlightened people would not use it to confuse. There is a better way, if you know lambda (Haskell, Scheme, Python, Javascript, Scala, Clojure, F#, C++, Java now too...):

f ∈ O(g) iff there exist real c > 0 and natural n₀ such that for all natural n ≥ n₀, 0 ≤ f(n) ≤ c·g(n).

Example: (λn. n + 5) ∈ O(λn. n²). Also (λx. x + 5) ∈ O(λx. x²).
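The lambda view made executable. A hypothetical helper of my own naming: it takes the functions f and g themselves (not expressions), plus explicit witnesses c and n₀, and spot-checks the Big O inequality on a finite range.

```python
# seems_big_o is a finite sanity check, not a proof: it can reveal a
# counterexample in [n0, limit) but cannot certify "for all n".

def seems_big_o(f, g, c, n0, limit=10_000):
    return all(0 <= f(n) <= c * g(n) for n in range(n0, limit))

# (lambda n: n + 5) in O(lambda n: n^2), with witnesses c = 2, n0 = 3:
print(seems_big_o(lambda n: n + 5, lambda n: n * n, c=2, n0=3))  # True
```

Note that the checker receives functions, matching the slide's point: membership in O(g) is a property of f, not of the expression "f(n)".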

Using Limits to Prove Big O

(Assume: there exists n₀ such that for all n ≥ n₀, f(n) ≥ 0 and g(n) ≥ 0.)

Theorem: If lim_{n→∞} f(n)/g(n) exists and is finite, then f(n) ∈ O(g(n)).

Example: Prove n(n+1)/2 ∈ O(n²).

    lim_{n→∞} (n(n+1)/2) / n² = 1/2

Therefore n(n+1)/2 ∈ O(n²).

Example: Prove ln(n) ∈ O(n). By L'Hôpital's rule:

    lim_{n→∞} ln(n)/n = lim_{n→∞} (1/n)/1 = 0

Therefore ln(n) ∈ O(n).
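The limits can be estimated numerically (an approximation, not a proof): evaluate f(n)/g(n) at a large n and compare with the analytic values 1/2 and 0.

```python
# Evaluate both ratios at n = 10^6; they should be close to the
# analytic limits 1/2 and 0.
import math

n = 10**6
ratio1 = (n * (n + 1) / 2) / n**2   # tends to 1/2
ratio2 = math.log(n) / n            # tends to 0
print(round(ratio1, 4), round(ratio2, 4))  # 0.5 0.0
```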

Using Limits to Disprove Big O

(Assume: there exists n₀ such that for all n ≥ n₀, f(n) ≥ 0 and g(n) ≥ 0.)

Theorem: If lim_{n→∞} f(n)/g(n) = ∞, then f(n) ∉ O(g(n)).

Example: Disprove n² ∈ O(n).

    lim_{n→∞} n²/n = lim_{n→∞} n = ∞

Therefore n² ∉ O(n).

Example: Disprove n ∈ O(ln(n)). By L'Hôpital's rule:

    lim_{n→∞} n/ln(n) = lim_{n→∞} 1/(1/n) = lim_{n→∞} n = ∞

Therefore n ∉ O(ln(n)).
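The diverging ratios can also be observed numerically: n²/n and n/ln(n) keep growing without bound as n increases, consistent with the limits being infinity.

```python
# Sample both ratios at n = 10^2, 10^4, 10^6 and confirm each sequence
# is strictly increasing (suggestive of divergence, not a proof).
import math

samples = [10**k for k in (2, 4, 6)]
r1 = [n**2 / n for n in samples]        # equals n itself
r2 = [n / math.log(n) for n in samples]
assert r1[0] < r1[1] < r1[2] and r2[0] < r2[1] < r2[2]
print(r1[-1])  # 1000000.0
```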

When Limits Don't Help

Theorem: If lim_{n→∞} f(n)/g(n) exists and is finite, then... Theorem: If lim_{n→∞} f(n)/g(n) = ∞, then... Which case has not been covered? If lim_{n→∞} f(n)/g(n) does not exist and is not ∞, then no conclusion. Hopefully this happens rarely.

Example: Define drunk(n) = if n is even then 1 else n. Then drunk(n) ∈ O(n) and drunk(n) ∉ O(1), but

    lim_{n→∞} drunk(n)/n does not exist and is not ∞
    lim_{n→∞} drunk(n)/1 does not exist and is not ∞
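The drunk function is easy to play with directly: its ratio against n oscillates between (nearly) 0 and exactly 1 forever, so the limit does not exist, yet drunk(n) ≤ 1·n for all n ≥ 1, which is why drunk is in O(n) anyway.

```python
# drunk(n)/n alternates between roughly 0 (even n) and exactly 1 (odd n),
# so no limit exists; the O(n) bound holds with witnesses c = 1, n0 = 1.

def drunk(n):
    return 1 if n % 2 == 0 else n

ratios = [drunk(n) / n for n in range(10**6, 10**6 + 4)]
print(ratios)  # alternates: near 0, then 1.0, near 0, then 1.0
ok = all(drunk(n) <= 1 * n for n in range(1, 10_000))
print(ok)  # True
```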

Using Limits for Θ

Theorem: f(n) ∈ Θ(g(n)) iff f(n) ∈ O(g(n)) and g(n) ∈ O(f(n)). Handy when you want to use limits. (Not so handy when you want to prove the Θ directly.)

Example: n² + n^(3/2) ∈ Θ(n²): n² + n^(3/2) ∈ O(n²) by a limit, and n² ∈ O(n² + n^(3/2)) by a similar limit.

Example: ln(n) ∉ Θ(n): you saw n ∉ O(ln(n)) by a limit.

Big O, Big Θ May Miss Something

10¹⁰⁰·n ∈ Θ(n). n + 10¹⁰⁰ ∈ Θ(n). We can't say these are practical algorithm running times, but O and Θ can't detect that. This is a price we pay for ignoring machine differences. Such pathological cases are rare; O and Θ are usually informative.

Big O, Big Θ Gain Simplification

    e = false;
    for (i = 0; i < n; i++) {
        for (j = i+1; j < n; j++) {
            if (a[i] == a[j]) { e = true; }
        }
    }

The exact count, 12n² + 14n + 10 ns, took me 30 minutes to figure out (and I got it wrong the first time). Instead: the inner loop takes O(n − i − 1) time, which is O(n). The outer loop takes n iterations. Altogether: O(n²) time. This only takes 30 seconds to figure out. It is one of the reasons we use O and Θ.

O vs Θ

My friend says: "My algorithm takes O(n²) time." It may actually be 9n² + 4n + 13. It may actually be 3n + 2; 3n + 2 ∈ O(n²) is still true. It may actually be 45; 45 ∈ O(n²) is still true.

My opinion: you should say Θ as much as you can. Popular opinion: just say O.

A good use of O: an assignment or contract says "write an O(n²)-time algorithm". It allows you to do better.

Worst Case, Best Case

    e = false;
    for (i = 0; i < n; i++) {
        for (j = i+1; j < n; j++) {
            if (a[i] == a[j]) { e = true; break; }
        }
        if (e) { break; }
    }

Depending on what's in a: anywhere from Θ(1) time to Θ(n²) time. The best case is Θ(1) time; the worst case is Θ(n²) time. We mostly look at worst cases in this course.
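Best versus worst case can be made concrete by counting comparisons in the early-exit version (a Python sketch of mine mirroring the loop above): a duplicate at the front exits after one comparison, the Θ(1) shape, while an all-distinct array forces all n(n−1)/2 comparisons, the Θ(n²) shape.

```python
# Count comparisons until the first duplicate is found (or the loop ends).

def comparisons_until_duplicate(a):
    n = len(a)
    count = 0
    for i in range(n):
        for j in range(i + 1, n):
            count += 1
            if a[i] == a[j]:
                return count    # duplicate found: stop early
    return count                # no duplicate: did all n*(n-1)/2

print(comparisons_until_duplicate([7, 7, 1, 2, 3]))  # 1  (best-case shape)
print(comparisons_until_duplicate([1, 2, 3, 4, 5]))  # 10 (worst case: 5*4/2)
```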

Myth Buster

Myth: O means worst-case time, Ω means best-case. Truth: O, Ω, Θ classify functions; they do not say what the functions stand for. 9n² + 4n + 13 may be a best-case time, a worst-case time, a best-case space, a worst-case space, or just a polynomial from nowhere; 9n² + 4n + 13 ∈ O(n²) is true regardless.

"The best-case time is in O(n²)" is allowed. It means: the best-case time is some function, and that function is in O(n²). Clearly a sensible statement and a possible scenario. O, Ω, Θ work for any function from the naturals to the non-negative reals.

Assumptions on Computational Time

This course assumes the following to take O(1) time each:

    64-bit integer arithmetic
    64-bit floating-point arithmetic
    64-bit pointer comparison
    read/write of 64 bits (data or pointer), if the address is also 64 bits (note: this limits memory size)
    branching, jumping

Food for thought: What if your numbers have many digits and exceed 64 bits? What if you want/have unlimited memory? I run mergesort, but only ever on 2 GB arrays, so O(n lg n) time but n ≤ 2³¹ ever. Is this O(1) time?