Lecture 1: Asymptotic Complexity. These slides include material originally prepared by Dr. Ron Cytron, Dr. Jeremy Buhler, and Dr. Steve Cole.


Announcements: TA office hours officially start this week; see the web site. Lab 1 is released this Wednesday and due 2/8 at 11:59 PM (work on your own; it's a lab). There is no coding for this lab, just the written part. Please review and follow the ehomework guidelines for this and future lab writeups. Read the Gradescope turn-in guide at the bottom of the ehomework guidelines.

Announcements, Cont'd: If you joined the class on or after last Thursday, 1/17, you must make up Studio 0 by showing your writeup to a TA in office hours by 1/31. See the website for office hours times and locations. Please check that you have a Gradescope account: those who joined by the first day of class should have gotten an invite email. If you did not, or if you cannot see CSE 247, go to https://www.gradescope.com, create an account if needed, and register for class code M7DRK3.

Things You Saw in Studio 0: Ticks are a useful way to measure complexity: count the number of times we reach a specific place in the code. Growing an array by doubling takes time linear in the number of elements added (the naïve approach took quadratic time! a sketch follows below). We can reason about the number of ticks (≈ running time) of a program analytically, without actually running it.
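
The studio's actual code is not reproduced in this transcript. As a rough illustration of the doubling claim, here is a hypothetical Java sketch (all names invented) in which copying one element is the unit of work:

    import java.util.Arrays;

    class GrowthSketch {
        static long copies = 0;  // count element copies, our unit of work

        // Append n elements, doubling capacity whenever the array is full.
        static int[] appendWithDoubling(int n) {
            int[] a = new int[1];
            int size = 0;
            for (int x = 0; x < n; x++) {
                if (size == a.length) {
                    copies += size;                     // copyOf moves 'size' elements
                    a = Arrays.copyOf(a, 2 * a.length);
                }
                a[size++] = x;
            }
            return a;  // total copies: 1 + 2 + 4 + ... < 2n, linear in n
        }
        // Growing capacity by 1 per append instead would copy
        // 1 + 2 + ... + (n-1) = n(n-1)/2 elements: quadratic in n.
    }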

Today's Agenda: Counting the number of ticks exactly. Asymptotic complexity. Big-O notation: being sloppy, but in a very precise way. Big-Ω notation: the opposite (?) of Big-O. Big-Θ notation: how to say "about a constant times f(n)".

How Many Times Do We Tick? Let's take an example from the studio (sketched below): how many times do we call tick()?
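
The loop appears only as a screenshot in the original slides; the following minimal Java reconstruction of the kind of code being analyzed is an assumption, not the original:

    class TickDemo {
        static long ticks = 0;
        static void tick() { ticks++; }  // counts how often control reaches here

        // The single loop under analysis: one tick per iteration.
        static void singleLoop(int n) {
            for (int i = 0; i < n; i++) {
                tick();  // runs for i = 0, 1, ..., n-1
            }
        }
    }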

How Many Times Do We Tick? Let's take an example from the studio: once for each value of i in the loop.

How Many Times Do We Tick? Let's take an example from the studio: so, for i = 0, 1, 2, ...?

How Many Times Do We Tick? Let's take an example from the studio: so, for i = 0, 1, 2, ..., n-1.

How Many Times Do We Tick? Let's take an example from the studio: so, for i = 0, 1, 2, ..., n-1 (not n, because the loop test is <).

Accounting: One tick per loop iteration. Total tick count is therefore Σ_{i=0}^{n-1} 1, where n-1 (the upper limit) is the greatest value of i in the loop and 0 (the lower limit) is the least.

Accounting: One tick per loop iteration. Total tick count is therefore Σ_{i=0}^{n-1} 1 = (n-1) - 0 + 1 = n.

Accounting: One tick per loop iteration. Total tick count is therefore Σ_{i=0}^{n-1} 1 = (n-1) - 0 + 1 = n. First rule of counting: a loop from i = LO to i = HI runs HI - LO + 1 times (e.g., i = 3 to 7 runs 7 - 3 + 1 = 5 times).

Let's Try a Doubly-Nested Loop. Now consider this code (a hypothetical version is sketched below): how many times do we call tick()?
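
Again the code is a screenshot in the original; a hypothetical Java version consistent with the analysis on the next slides (where the inner loop runs i times) would be:

    // Doubly-nested loop: the inner bound depends on the outer index.
    static void nestedLoop(int n) {
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < i; j++) {  // j = 0 .. i-1: i iterations
                tick();
            }
        }
    }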

Let's Work from the Inside Out. Innermost loop runs for j from 0 to ...?

Let's Work from the Inside Out. Inner loop runs for j from 0 to i-1. Hence, we tick (i-1) - 0 + 1 = i times each time we execute the inner loop.

Let's Work from the Inside Out. Outer loop runs for i from 0 to ...? (the inner loop contributes i ticks)

Let's Work from the Inside Out. Outer loop runs for i from 0 to n-1 (the inner loop contributes i ticks). But this time, the number of ticks is different for each i!

Accounting: i ticks per outer loop iteration. Total tick count is therefore Σ_{i=0}^{n-1} i.

Accounting: i ticks per outer loop iteration. Total tick count is therefore Σ_{i=0}^{n-1} i = n(n-1)/2. Remember this from last time? We'll use it a lot!

Accounting: i ticks per outer loop iteration. Total tick count is therefore Σ_{i=0}^{n-1} i = n(n-1)/2. Second rule of counting: when loops are nested, work inside-out and form a summation.
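
A quick reminder of where that closed form comes from (this derivation is not in the original slides): write the sum forwards and backwards and add termwise,

    \sum_{i=0}^{n-1} i
      = \frac{1}{2}\sum_{i=0}^{n-1}\bigl[\, i + (n-1-i) \,\bigr]
      = \frac{1}{2}\sum_{i=0}^{n-1}(n-1)
      = \frac{n(n-1)}{2}.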

One More Time. Instead of Java, let's do pseudocode:

    for j in 1..n
        tick()
        for k in 0..j
            tick()
            tick()
            tick()

One More Time (cont'd): for j from 1 to n, inclusive.

One More Time (cont'd): the inner loop runs for k from 0 to j and ticks 3 times per iteration.

One More Time (cont'd): the inner loop runs j - 0 + 1 = j+1 times and ticks 3 times per iteration.

One More Time (cont'd): so the inner loop contributes 3(j+1) ticks per outer iteration.

One More Time (cont'd): the outer loop runs for j from 1 to n and ticks ...? times on iteration j.

One More Time (cont'd): the outer loop runs for j from 1 to n and ticks 1 + 3(j+1) = 3j+4 times on iteration j. (A Java version of the pseudocode appears below.)
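
A direct Java translation of the pseudocode, reusing the hypothetical tick() counter from the earlier sketch:

    // Outer loop: 1 tick, plus an inner loop of j+1 iterations x 3 ticks.
    static void oneMoreTime(int n) {
        for (int j = 1; j <= n; j++) {      // j = 1 .. n, inclusive
            tick();
            for (int k = 0; k <= j; k++) {  // k = 0 .. j: j+1 iterations
                tick();
                tick();
                tick();
            }
        }                                   // 3j+4 ticks on iteration j
    }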

Accounting: 3j+4 ticks per outer loop iteration. Total tick count is therefore Σ_{j=1}^{n} (3j+4).

Accounting: 3j+4 ticks per outer loop iteration. Total tick count is therefore Σ_{j=1}^{n} (3j+4) = 3·n(n+1)/2 + 4n = (3n² + 11n)/2.
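
As a sanity check on the algebra, one can run the loops and compare the observed count against the closed form. This little test harness is mine, not the slides', and assumes the oneMoreTime sketch above:

    public static void main(String[] args) {
        for (int n = 1; n <= 100; n++) {
            ticks = 0;
            oneMoreTime(n);
            long expected = (3L * n * n + 11L * n) / 2;  // (3n^2 + 11n)/2
            if (ticks != expected)
                throw new AssertionError("mismatch at n = " + n);
        }
        System.out.println("tick counts match (3n^2 + 11n)/2 for n = 1..100");
    }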

Do We Really Care? Seriously, (3n² + 11n)/2? Do we need this much detail to understand our code's running time?

How Do We Actually Use Running Times? Predict exact time to complete a task.

How Do We Actually Use Running Times? Predict exact time to complete a task (yeah, we need the precise count for this).

How Do We Actually Use Running Times? Predict exact time to complete a task (yeah, we need the precise count for this). Compare running times of different algorithms.

How Do We Actually Use Running Times? Predict exact time to complete a task (yeah, we need the precise count for this). Compare running times of different algorithms: 1000·n·log n, n², 3n².

How Do We Actually Use Running Times? Compare running times of different algorithms: 1000·n·log n, n², 3n². The difference between n² and 3n² is a constant factor (solved by using a bigger computer).

How Do We Actually Use Running Times? Compare running times of different algorithms: 1000·n·log n, n², 3n². But 1000·n·log n vs. n² is qualitatively different!

[Plot: the three running times above as functions of n]

Desirable Properties of Running Time Estimates: Distinguish "get a bigger computer" vs. "qualitatively different": order of growth matters (constant factors don't).

Desirable Properties of Running Time Estimates: Distinguish "get a bigger computer" vs. "qualitatively different": order of growth matters (constant factors don't). Ignore transient effects for small input sizes n.

Desirable Properties of Running Time Estimates: Distinguish "get a bigger computer" vs. "qualitatively different": order of growth matters (constant factors don't). Ignore transient effects for small input sizes n. Standard assumption: we care what happens as the input becomes large (grows without bound). In other words, we care about the asymptotic behavior of an algorithm's running time!

How do we reason about asymptotic behavior? Time for Theory!

Definition of Big-O Notation. Let f(n), g(n) be positive functions for n > 0.

Definition of Big-O Notation. Let f(n), g(n) be positive functions for n > 0. [e.g. running times!]

Definition of Big-O Notation. Let f(n), g(n) be positive functions for n > 0. [e.g. running times!] We say that f(n) = O(g(n)) if there exist constants c > 0, n₀ > 0 such that for all n ≥ n₀, f(n) ≤ c·g(n).

There exist constants c > 0, n₀ > 0 such that for all n ≥ n₀, f(n) ≤ c·g(n).

There exist constants c > 0, n₀ > 0 such that for all n ≥ n₀, f(n) ≤ c·g(n). For small n, f(n) can behave strangely if it wants.

There exist constants c > 0, n₀ > 0 such that for all n ≥ n₀, f(n) ≤ c·g(n). But by some point n₀, f(n) settles down...

There exist constants c > 0, n₀ > 0 such that for all n ≥ n₀, f(n) ≤ c·g(n). ...and it stays ≤ c·g(n) forever after.
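
The definition can also be explored numerically; spot-checking is evidence, never a proof, since the claim quantifies over all n ≥ n₀. A small sketch of my own (not from the slides):

    import java.util.function.LongUnaryOperator;

    // Spot-check f(n) <= c * g(n) for n0 <= n <= nMax.
    static boolean boundHolds(LongUnaryOperator f, LongUnaryOperator g,
                              long c, long n0, long nMax) {
        for (long n = n0; n <= nMax; n++) {
            if (f.applyAsLong(n) > c * g.applyAsLong(n)) return false;
        }
        return true;
    }

    // e.g. boundHolds(n -> n, n -> n * n, 1, 1, 1_000_000) returns true,
    // consistent with n = O(n^2) using witnesses c = 1, n0 = 1.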

Does Big-O Have the Properties We Desire? Explicitly ignores behavior of functions for small n (we get to decide what "small" is). Allows a constant c in front of g(n) for the upper bound. Does that make Big-O insensitive to constants?

Big-O Ignores Constants, as Desired. Lemma: if f(n) = O(g(n)), then f(n) = O(a·g(n)) for any a > 0. Pf: f(n) = O(g(n)) means that for some c > 0, n₀ > 0, if n ≥ n₀, then f(n) ≤ c·g(n). But then for n ≥ n₀, f(n) ≤ (c/a)·(a·g(n)). Conclude that f(n) = O(a·g(n)), with constant c/a. QED

Big-O Ignores Constants, as Desired (cont'd): When specifying running times, never write a constant inside the O(). It is unnecessary.

Does Big-O Match our Intuition? Which function grows faster, n or n²?

Does Big-O Match our Intuition? Which function grows faster, n or n²? [quadratic beats linear] So does n = O(n²)? Set c = ...?, n₀ = ...? [many options here]

Does Big-O Match our Intuition? Which function grows faster, n or n²? [quadratic beats linear] So does n = O(n²)? Set c = 1, n₀ = 1 [many options here]. When n ≥ 1, is 1·n² ≥ n?

Does Big-O Match our Intuition? So does n = O(n²)? Set c = 1, n₀ = 1 [many options here]. When n ≥ 1, is 1·n² ≥ n? Yes! Multiply both sides of n ≥ 1 by n. QED

General Strategy for Proving f(n) = O(g(n)): 1. Pick c > 0, n₀ > 0. [choose to make the next steps easier] 2. Write down the desired inequality f(n) ≤ c·g(n). 3. Prove that the inequality holds whenever n ≥ n₀.

Another Example. Does 3n² + 11n = O(n²)?

Another Example. Does 3n² + 11n = O(n²)? [what does your intuition say?] Let's prove it. Set c = ...?, n₀ = ...?

Another Example. Does 3n² + 11n = O(n²)? [what does your intuition say?] Let's prove it. Set c = 33, n₀ = 1 [again, many possible choices]. For n ≥ 1, the difference 33n² - (3n² + 11n) = (11n² - 3n²) + (11n² - 11n) + (11n² - 0) > 0. Conclude that the claim is true. QED

Generalization of Previous Proof. Thm: any polynomial of the form s(n) = Σ_{j=0}^{k} a_j·n^j is O(n^k). Pf: pick c to be (k+1) times the largest (most positive) a_j; pick n₀ = 1. Write c·n^k - s(n) as Σ_{j=0}^{k} ((c/(k+1))·n^k - a_j·n^j), each term of which is ≥ 0 for n ≥ 1. QED

Generalization of Previous Proof (cont'd): When specifying running times, never write lower-order terms inside the O(). It is unnecessary.

Generalization of Previous Proof (cont'd): Based on these two examples, we can prove:

Generalization of Previous Proof (cont'd): Polynomial terms other than the highest do not impact asymptotic complexity!

One More Example. Does 1000·n·log n = O(n²)?

[Plot: 1000·n·log n and n² as functions of n]

One More Example. Does 1000·n·log n = O(n²)? Set c = ...?, n₀ = ...?

One More Example. Does 1000·n·log n = O(n²)? Set c = 1000, n₀ = 1. When n = 1, 1000·n² - 1000·n·log n = 1000 > 0. Moreover, this difference only grows with increasing n > 1. QED

One More Example. Does 1000·n·log n = O(n²)? Set c = 1000, n₀ = 1. When n = 1, 1000·n² - 1000·n·log n = 1000 > 0. Moreover, this difference only grows with increasing n > 1. QED (Oh really? Are you sure?)

One More Example. Well, the derivative of the difference is d/dn [1000n² - 1000·n·log n] = 2000n - 1000·log n - 1000 (log here is the natural log), which is 1000 > 0 for n = 1. But does it stay that way for n > 1?

One More Example. Furthermore, d²/dn² [1000n² - 1000·n·log n] = 2000 - 1000/n, which is > 0 for n ≥ 1. Hence, the derivative remains positive, and so the difference increases for n ≥ 1, as claimed.

Moral: You can use calculus to show that one function remains greater than another past a certain point, even if the functions are not algebraic. This is often a crucial step in proving f(n) = O(g(n)). (Next time, we'll use this idea to derive a general test for comparing the asymptotic behavior of two functions.) Big-O makes precise our intuition about when one function effectively upper-bounds another, ignoring constant factors and small input sizes.
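
Formalizing the calculus step just used (this statement is my paraphrase, not the slides'): for a differentiable function h,

    h(n_0) > 0 \ \text{and}\ h'(n) \ge 0 \ \text{for all}\ n \ge n_0
      \;\Longrightarrow\; h(n) > 0 \ \text{for all}\ n \ge n_0.

The slides apply this twice: once to h'(n) (using h''(n) ≥ 0 for n ≥ 1), and then to h(n) = 1000n² - 1000·n·log n itself.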

Extensions of Big-O Notation: Ω and Θ

More Ways to Bound Running Times. When comparing numbers, we would not be happy if we could say x ≤ y but not x ≥ y or x = y. Big-O is analogous to ≤ for functions [an upper bound on growth rate]. What are the statements analogous to ≥ and =?

More Ways to Bound Running Times. Big-O is analogous to ≤ for functions [an upper bound on growth rate]. What are the statements analogous to ≥ and =? Ω and Θ.

Definition of Big-Ω Notation. Let f(n), g(n) be positive functions for n > 0. [e.g. running times!] We say that f(n) = Ω(g(n)) if there exist constants c > 0, n₀ > 0 such that for all n ≥ n₀, f(n) ≥ c·g(n).

There exist constants c > 0, n₀ > 0 such that for all n ≥ n₀, f(n) ≥ c·g(n).

How Do You Prove f(n) = Ω(g(n))? Lemma: f(n) = O(g(n)) iff g(n) = Ω(f(n)). So if we want to prove, say, n² = Ω(n·log n), we just prove n·log n = O(n²).

(Proof of Lemma) If f(n) = O(g(n)), there are c > 0, n₀ > 0 s.t. for n ≥ n₀, f(n) ≤ c·g(n). Set d = 1/c. Then for n ≥ n₀, g(n) ≥ d·f(n). Conclude that with constants d, n₀, we have proved g(n) = Ω(f(n)). A similar argument proves the other direction of the iff. QED

Definition of Big-Θ Notation. Let f(n), g(n) be positive functions for n > 0. [e.g. running times!] We say that f(n) = Θ(g(n)) if there exist constants c₁, c₂ > 0, n₀ > 0 such that for all n ≥ n₀, c₁·g(n) ≤ f(n) ≤ c₂·g(n).

There exist constants c₁, c₂ > 0, n₀ > 0 s.t. for all n ≥ n₀, c₁·g(n) ≤ f(n) ≤ c₂·g(n). Upper and lower bounds on f(n) (they might not use the same constant).

How Do You Prove f(n) = Θ(g(n))? Lemma: f(n) = Θ(g(n)) iff f(n) = O(g(n)) and f(n) = Ω(g(n)). So if we want to prove, say, 3n² + 11n = Θ(n²), we just prove 3n² + 11n = O(n²) and 3n² + 11n = Ω(n²). (The Ω direction is sketched below.)
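
For instance, the Ω direction here takes one line; this little computation is mine, not the slides'. With c = 3 and n₀ = 1:

    3n^2 + 11n \;\ge\; 3 \cdot n^2 \quad\text{for all } n \ge 1
    \quad\Longrightarrow\quad 3n^2 + 11n = \Omega(n^2).

The O direction was already proved two slides back with c = 33, n₀ = 1; together they give Θ.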

How Do You Prove f(n) = Θ(g(n))? (cont'd) So if we want to prove, say, 3n² + 7 = Θ(n²), we just prove 3n² + 7 = O(n²) and 3n² + 7 = Ω(n²). You should be able to prove this lemma from the definitions of O, Ω, and Θ.

Conclusion (so far). We now have a precise way to bound the behavior of functions when n gets large, ignoring constant factors. We can replace ugly, precise running times by much simpler expressions with the same asymptotic behavior. You will see O, Ω, and Θ frequently for the rest of 247!

Next Time: A quick, uniform proof strategy for O, Ω, and Θ statements. Review of linked lists for Studio 2. More practice applying asymptotic complexity.

End of Asymptotic Complexity, Part 1 (continued next lecture).