
CSE332: Data Structures & Parallelism
Lecture 2: Algorithm Analysis
Ruth Anderson, Winter 2019

Today
- Algorithm Analysis: What do we care about? How to compare two algorithms
- Analyzing Code
- Asymptotic Analysis
- Big-Oh Definition

What do we care about?
- Correctness: Does the algorithm do what is intended?
- Performance: speed (time complexity) and memory (space complexity)
Why analyze? To make good design decisions, and to enable you to look at an algorithm (or code) and identify the bottlenecks, etc.

Q: How should we compare two algorithms?

A: How should we compare two algorithms? Uh, why NOT just run the program and time it?
Too much variability; timing is not reliable or portable:
- Hardware: processor(s), memory, etc.
- OS, Java version, libraries, drivers
- Other programs running
- Implementation dependent
- Choice of input: testing (inexhaustive) may miss worst-case input, and timing does not explain relative timing among inputs (what happens when n doubles in size)
Often we want to evaluate an algorithm, not an implementation, even before creating the implementation ("coding it up").

Comparing algorithms
When is one algorithm (not implementation) better than another? There are various possible answers (clarity, security, ...), but a big one is performance: for sufficiently large inputs, it runs in less time (our focus) or less space.
- We focus on large inputs (n) because probably any algorithm is plenty good for small inputs (if n is 10, probably anything is fast enough).
- The answer will be independent of CPU speed, programming language, coding tricks, etc.
- The answer is general and rigorous, complementary to coding it up and timing it on some test cases.
- We can do analysis before coding!

Today
- Algorithm Analysis: What do we care about? How to compare two algorithms
- Analyzing Code: How to count different code constructs; Best Case vs. Worst Case; Ignoring Constant Factors
- Asymptotic Analysis
- Big-Oh Definition

Analyzing code ("worst case")
Basic operations take some amount of constant time: arithmetic, assignment, accessing one Java field or array index, etc. (This is an approximation of reality: a very useful "lie".)
- Consecutive statements: sum of the times of each statement
- Loops: number of iterations * time for loop body
- Conditionals: time of condition plus time of the slower branch
- Function calls: time of the function's body
- Recursion: solve a recurrence equation

Examples

    b = b + 5;
    c = b / a;
    b = c + 100;

    for (i = 0; i < n; i++) {
      sum++;
    }

    if (j < 5) {
      sum++;
    } else {
      for (i = 0; i < n; i++) {
        sum++;
      }
    }
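
Working these out from the counting rules above (the slide leaves them as in-class exercises, so this is one reasonable reading): the three assignments are consecutive constant-time statements, so roughly 1 + 1 + 1 = O(1); the loop does n iterations of a constant-time body, so roughly n * 1 = O(n); and the conditional costs the constant test plus the slower branch, roughly 1 + max(1, n) = O(n) in the worst case.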

Another Example

    int coolfunction(int n, int sum) {
      int i, j;
      for (i = 0; i < n; i++) {
        for (j = 0; j < n; j++) {
          sum++;
        }
      }
      print "This program is great!"
      for (i = 0; i < n; i++) {
        sum++;
      }
      return sum
    }
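
One worked count (again, a reasonable reading of the exercise): the nested loops execute sum++ exactly n * n = n^2 times, the print is constant time, and the final loop adds n more increments, so the total is about n^2 + n + c steps, which is O(n^2).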

Using Summations for Loops

    for (i = 0; i < n; i++) {
      sum++;
    }
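
Filling in the summation the slide asks for: each of the n iterations does some constant amount of work c, so the total running time is

    Σ_{i=0}^{n-1} c = c * n

which is O(n).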

Complexity cases
We'll start by focusing on two cases:
- Worst-case complexity: max # steps the algorithm takes on the most challenging input of size N
- Best-case complexity: min # steps the algorithm takes on the easiest input of size N

Example: Find an integer in a sorted array

    2  3  5  16  37  50  73  75  126

    // requires array is sorted
    // returns whether k is in array
    boolean find(int[] arr, int k) {
      ???
    }

Linear search: Best Case & Worst Case
Find an integer in a sorted array:

    2  3  5  16  37  50  73  75  126

    // requires array is sorted
    // returns whether k is in array
    boolean find(int[] arr, int k) {
      for (int i = 0; i < arr.length; ++i)
        if (arr[i] == k)
          return true;
      return false;
    }

Best case: ?
Worst case: ?

Linear search: Running Times
Find an integer in a sorted array (same code as above):

    // requires array is sorted
    // returns whether k is in array
    boolean find(int[] arr, int k) {
      for (int i = 0; i < arr.length; ++i)
        if (arr[i] == k)
          return true;
      return false;
    }

Best case: 6-ish steps = O(1)
Worst case: 5-ish * arr.length steps = O(arr.length)

Remember a faster search algorithm?
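
The faster algorithm being alluded to is binary search. The slides don't show the code, so here is a minimal iterative Java sketch for the same find problem:

    // Requires: arr is sorted. Returns whether k is in arr.
    // Each iteration halves the remaining search range,
    // so the loop runs O(log n) times.
    boolean find(int[] arr, int k) {
      int lo = 0, hi = arr.length - 1;
      while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;  // written this way to avoid int overflow
        if (arr[mid] == k)
          return true;
        else if (arr[mid] < k)
          lo = mid + 1;   // k can only be in the upper half
        else
          hi = mid - 1;   // k can only be in the lower half
      }
      return false;
    }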

Ignoring constant factors
So binary search is O(log n) and linear search is O(n), but which will actually be faster? Depending on constant factors and the size of n, in a particular situation linear search could be faster:
- It could depend on constant factors: how many assignments, additions, etc. there are for each n
- It could depend on the size of n
But there exists some n₀ such that for all n > n₀, binary search wins. Let's play with a couple of plots to get some intuition.

Example: Let's try to help linear search
- Run it on a computer 100x as fast (say, a 2018 model vs. 1990)
- Use a new compiler/language that is 3x as fast
- Be a clever programmer and eliminate half the work
So each iteration is 600x as fast as an iteration of binary search. (Note: 600x is still helpful for problems without logarithmic algorithms!)
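
To make the 600x concrete (my arithmetic, not the slide's): the sped-up linear search takes about n/600 steps while binary search takes about log₂ n steps, so binary search wins once n/600 > log₂ n. At n = 2^13 = 8192, n/600 ≈ 13.7 while log₂ n = 13, so the crossover is already around n ≈ 8000; for n in the millions, the 600x head start is irrelevant.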

Logarithms and Exponents
Since so much is binary in CS, "log" almost always means log base 2.
Definition: log₂ x = y if x = 2^y.
So log₂ 1,000,000 is a little under 20 (since 2^20 = 1,048,576).
Just as exponents grow very quickly, logarithms grow very slowly.
See the Excel file for plot data; play with it!

Aside: Log base doesn't matter (much)
Any base-B log is equivalent to a base-2 log within a constant factor, and we are about to stop worrying about constant factors! In particular, log₂ x ≈ 3.32 log₁₀ x. In general, we can convert log bases via a constant multiplier; to convert from base B to base A:

    log_B x = (log_A x) / (log_A B)

Review: Properties of logarithms
- log(A*B) = log A + log B, so log(N^k) = k log N
- log(A/B) = log A - log B
- log(log x) is written "log log x". It grows as slowly as 2^(2^y) grows fast.
  Ex: log₂ log₂ 4 billion ≈ log₂ log₂ 2^32 = log₂ 32 = 5
- (log x)(log x) is written "log² x". It is greater than log x for all x > 2.

Logarithms and Exponents (three slides of growth-rate plots; the figures did not survive transcription)

Today
- Algorithm Analysis: What do we care about? How to compare two algorithms
- Analyzing Code
- Asymptotic Analysis
- Big-Oh Definition

Asymptotic notation
About to show the formal definition, which amounts to saying:
1. Eliminate low-order terms
2. Eliminate coefficients
Examples:
- 4n + 5
- 0.5 n log n + 2n + 7
- n^3 + 2^n + 3n
- n log(10 n^2)
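
Worked answers (the slide leaves these as exercises; each follows from the two rules): 4n + 5 is O(n); 0.5 n log n + 2n + 7 is O(n log n); n^3 + 2^n + 3n is O(2^n), since the exponential term dominates n^3; and n log(10 n^2) = n (log 10 + 2 log n) is O(n log n).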

Big-Oh relates functions
We use O on a function f(n) (for example, n^2) to mean the set of functions with asymptotic behavior less than or equal to f(n). So (3n^2 + 17) is in O(n^2): 3n^2 + 17 and n^2 have the same asymptotic behavior. Confusingly, we also say/write:
- (3n^2 + 17) is O(n^2)
- (3n^2 + 17) = O(n^2)
But we would never say O(n^2) = (3n^2 + 17).

Formally Big-Oh
Definition: g(n) is in O(f(n)) iff there exist positive constants c and n₀ such that

    g(n) ≤ c f(n) for all n ≥ n₀

Note: n₀ ≥ 1 (and a natural number) and c > 0.

Formally Big-Oh (continued)
Definition: g(n) is in O(f(n)) iff there exist positive constants c and n₀ such that g(n) ≤ c f(n) for all n ≥ n₀.
To show g(n) is in O(f(n)), pick a c large enough to cover the constant factors and an n₀ large enough to cover the lower-order terms.
Example: Let g(n) = 3n + 4 and f(n) = n. Then c = 4 and n₀ = 5 is one possibility: for all n ≥ 5, 3n + 4 ≤ 3n + n = 4n = c f(n). So 3n + 4 is O(n); it is also O(n^5) and O(2^n), etc., since O gives only an upper bound.

What's with the c?
To capture this notion of similar asymptotic behavior, we allow a constant multiplier (called c). Consider:

    g(n) = 7n + 5
    f(n) = n

These have the same asymptotic behavior (linear), so g(n) is in O(f(n)) even though g(n) is always larger. There is no positive n₀ such that g(n) ≤ f(n) for all n ≥ n₀; the c in the definition allows for that: g(n) ≤ c f(n) for all n ≥ n₀. To show g(n) is in O(f(n)), take c = 12 and n₀ = 1: for all n ≥ 1, 7n + 5 ≤ 7n + 5n = 12n.

An Example
To show g(n) is in O(f(n)), pick a c large enough to cover the constant factors and an n₀ large enough to cover the lower-order terms.
Example: Let g(n) = 4n^2 + 3n + 4 and f(n) = n^3.
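
One possible solution (the slide leaves the blank for in-class work): take c = 11 and n₀ = 1. For n ≥ 1 we have n ≤ n^2 ≤ n^3, so 4n^2 + 3n + 4 ≤ 4n^3 + 3n^3 + 4n^3 = 11n^3 = c f(n), and hence 4n^2 + 3n + 4 is in O(n^3). The same style of argument with f(n) = n^2 shows it is also in O(n^2), the tight polynomial bound.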

Examples: True or false?
1. 4 + 3n is O(n)
2. n + 2 log n is O(log n)
3. log n + 2 is O(1)
4. n^50 is O(1.1^n)
Notes: Do NOT ignore constants that are not multipliers: "n^3 is O(n^2)" is FALSE, and "3^n is O(2^n)" is FALSE. When in doubt, refer to the definition.
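
Answers, worked from the definition (the slide leaves them blank): 1 is TRUE (4 + 3n ≤ 4n for all n ≥ 4, so c = 4, n₀ = 4 works). 2 is FALSE (n grows strictly faster than log n, so no constant c gives n + 2 log n ≤ c log n for all large n). 3 is FALSE (log n + 2 is unbounded, so it cannot stay ≤ c * 1 for all large n). 4 is TRUE (every fixed-degree polynomial is eventually dominated by every exponential with base > 1).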

What you can drop
- Eliminate coefficients because we don't have units anyway: 3n^2 versus 5n^2 doesn't mean anything when we cannot count operations very accurately.
- Eliminate low-order terms because they have vanishingly small impact as n grows.
- Do NOT ignore constants that are not multipliers: n^3 is not O(n^2); 3^n is not O(2^n).
(This all follows from the formal definition.)

Big-Oh: Common Categories
From fastest to slowest:
- O(1)        constant (same as O(k) for constant k)
- O(log n)    logarithmic
- O(n)        linear
- O(n log n)  "n log n"
- O(n^2)      quadratic
- O(n^3)      cubic
- O(n^k)      polynomial (where k is any constant > 1)
- O(k^n)      exponential (where k is any constant > 1)
Usage note: "exponential" does not mean "grows really fast"; it means "grows at a rate proportional to k^n for some k > 1".

More Asymptotic Notation
- Upper bound: O(f(n)) is the set of all functions asymptotically less than or equal to f(n). g(n) is in O(f(n)) if there exist constants c and n₀ such that g(n) ≤ c f(n) for all n ≥ n₀.
- Lower bound: Ω(f(n)) is the set of all functions asymptotically greater than or equal to f(n). g(n) is in Ω(f(n)) if there exist constants c and n₀ such that g(n) ≥ c f(n) for all n ≥ n₀.
- Tight bound: Θ(f(n)) is the set of all functions asymptotically equal to f(n). It is the intersection of O(f(n)) and Ω(f(n)) (the two bounds can use different c values).
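
A quick example using all three bounds (standard, not from the slide): 3n^2 + 17 is in O(n^2), since 3n^2 + 17 ≤ 4n^2 once 17 ≤ n^2, i.e. with c = 4, n₀ = 5; and it is in Ω(n^2), since 3n^2 + 17 ≥ n^2 for all n ≥ 1 (c = 1, n₀ = 1). Therefore it is in Θ(n^2).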

Regarding use of terms
A common error is to say O(f(n)) when you mean Θ(f(n)); people often say O() to mean a tight bound. Say we have f(n) = n. We could say f(n) is in O(n), which is true, but that only conveys the upper bound. Since f(n) = n is also O(n^5), it's tempting to say "this algorithm is exactly O(n)". That is somewhat incomplete; instead say it is Θ(n), which also tells us it is not, for example, O(log n).
Less common notation:
- "little-oh": like big-Oh, but strictly less than. Example: sum is o(n^2) but not o(n).
- "little-omega": like big-Omega, but strictly greater than. Example: sum is ω(log n) but not ω(n).

What we are analyzing
The most common thing to do is give an O or Θ bound to the worst-case running time of an algorithm.
Example: true statements about the binary-search algorithm:
- Common: Θ(log n) running time in the worst case
- Less common: Θ(1) in the best case (item is in the middle)
- Less common: the algorithm is Ω(log log n) in the worst case (it is not really, really, really fast asymptotically)
- Less common (but very good to know): the find-in-sorted-array problem is Ω(log n) in the worst case; no algorithm can do better (without parallelism)
A problem cannot be O(f(n)), since you can always find a slower algorithm; for a problem, O(f(n)) can only mean "there exists an O(f(n)) algorithm".

Other things to analyze
- Space instead of time (remember we can often use space to gain time)
- Average case: sometimes only makes sense if you assume something about the distribution of inputs (see CSE312 and STAT391); sometimes uses randomization in the algorithm (we will see an example with sorting; also see CSE312)
- Sometimes we want an amortized guarantee (will discuss in a later lecture)

Summary
Analysis can be about:
- The problem or the algorithm (usually the algorithm)
- Time or space (usually time), or power, or dollars, or ...
- Best-, worst-, or average-case (usually worst)
- Upper-, lower-, or tight-bound (usually upper or tight)

Big-Oh Caveats
Asymptotic complexity (Big-Oh) focuses on behavior for large n and is independent of any computer or coding trick, but you can abuse it and be misled about trade-offs.
Example: n^(1/10) vs. log n. Asymptotically, n^(1/10) grows more quickly, but the cross-over point is around 5 * 10^17; so if your input size is less than about 2^58, prefer n^(1/10).
Comparing O() bounds for small n values can be misleading:
- Quicksort: O(n log n) (expected)
- Insertion sort: O(n^2) (expected)
Yet in reality insertion sort is faster for small n. We'll learn about these sorts later.

Addendum: Timing vs. Big-Oh?
At the core of CS is a backbone of theory and mathematics: examine the algorithm itself, mathematically, not the implementation; reason about performance as a function of n; be able to mathematically prove things about performance.
Yet timing has its place. In the real world we do want to know whether implementation A runs faster than implementation B on data set C (e.g., benchmarking graphics cards).
- Evaluating an algorithm? Use asymptotic analysis.
- Evaluating an implementation of hardware/software? Timing can be useful.
