CSE 421: Intro Algorithms. 2: Analysis. Winter 2012 Larry Ruzzo


Efficiency

Our correct TSP algorithm was incredibly slow: basically slow no matter what computer you have. We want a general theory of efficiency that is:
- Simple
- Objective
- Relatively independent of changing technology
- But still predictive: theoretically bad algorithms should be bad in practice, and vice versa (usually)

Defining Efficiency

"Runs fast on typical real problem instances."
- Pro: sensible, bottom-line-oriented
- Con: a moving target (different computers, compilers, Moore's law), and highly subjective (how fast is "fast"? what's "typical"?)

Computational Complexity

The time complexity of an algorithm associates with each problem size n a number T(n), the worst-case time the algorithm takes. Mathematically, T: N⁺ → R⁺, i.e., T is a function mapping positive integers (problem sizes) to positive real numbers (number of steps). We use reals so we can say, e.g., sqrt(n) instead of ⌈sqrt(n)⌉.
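To make T(n) concrete, here is a minimal sketch (the names `linear_search_steps` and `T` are illustrative, not from the slides) that counts basic operations of linear search on its worst-case input, a target that is absent:

```python
def linear_search_steps(a, target):
    """Scan list a for target; return (found, number of comparisons made)."""
    steps = 0
    for x in a:
        steps += 1
        if x == target:
            return True, steps
    return False, steps

def T(n):
    """Worst-case comparison count for inputs of size n:
    when the target is absent, every element is examined, so T(n) = n."""
    _, steps = linear_search_steps(list(range(n)), -1)
    return steps
```

Here T(n) maps each positive integer n to a step count, matching the definition T: N⁺ → R⁺ above.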

Computational Complexity: General Goals

Asymptotic growth rate: characterize the growth rate of the worst-case running time as a function of problem size, up to a constant factor, e.g., T(n) = O(n²).

Why not try to be more precise?
- Average-case, e.g., is hard to define and to analyze
- Technological variations (computer, compiler, OS, ...) easily cost 10x or more
- Being more precise is a ton of work

A key question is "scale-up": if I can afford this today, how much longer will it take when my business is 2x larger? (E.g., today: cn²; next year: c(2n)² = 4cn², i.e., 4x longer.) Big-O analysis is adequate to address this.
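The scale-up arithmetic can be checked by counting operations directly. A hypothetical sketch (the function name is illustrative): a doubly nested loop does Θ(n²) work, so doubling n quadruples the count, exactly as c(2n)² = 4cn² predicts.

```python
def quadratic_work(n):
    """Count the basic operations of a simple all-pairs loop, which is Θ(n²)."""
    ops = 0
    for i in range(n):
        for j in range(n):
            ops += 1  # one basic operation per pair (i, j)
    return ops

# Doubling the input size quadruples the work:
assert quadratic_work(2000) == 4 * quadratic_work(1000)
```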

[Figure: computational complexity. Plot of time T(n) versus problem size, with T(n) bounded between the curves n log₂ n and 2n log₂ n.]

O-notation, etc.

Given two functions f and g: N → R:
- f(n) is O(g(n)) iff there is a constant c > 0 so that f(n) is eventually always ≤ c·g(n)
- f(n) is Ω(g(n)) iff there is a constant c > 0 so that f(n) is eventually always ≥ c·g(n)
- f(n) is Θ(g(n)) iff there are constants c₁, c₂ > 0 so that eventually always c₁·g(n) ≤ f(n) ≤ c₂·g(n)

Examples

- 10n² − 16n + 100 is O(n²), and also O(n³): 10n² − 16n + 100 ≤ 11n² for all n ≥ 10
- 10n² − 16n + 100 is Ω(n²), and also Ω(n): 10n² − 16n + 100 ≥ 9n² for all n ≥ 16
- Therefore 10n² − 16n + 100 is Θ(n²)
- 10n² − 16n + 100 is not O(n), and also not Ω(n³)
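The witness constants in this example can be verified numerically over a range of n (a spot check, not a proof, since asymptotic claims concern all sufficiently large n):

```python
def f(n):
    """The example polynomial from the slide."""
    return 10 * n * n - 16 * n + 100

# O(n²) witnesses from the slide: c = 11, threshold n₀ = 10
assert all(f(n) <= 11 * n * n for n in range(10, 2000))

# Ω(n²) witnesses from the slide: c = 9, threshold n₀ = 16
assert all(f(n) >= 9 * n * n for n in range(16, 2000))
```

(In fact the Ω bound holds for every n ≥ 1, since 10n² − 16n + 100 − 9n² = n² − 16n + 100 has negative discriminant; the slide's n₀ = 16 is just a convenient threshold.)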

Properties

Transitivity:
- If f = O(g) and g = O(h) then f = O(h)
- If f = Ω(g) and g = Ω(h) then f = Ω(h)
- If f = Θ(g) and g = Θ(h) then f = Θ(h)

Additivity:
- If f = O(h) and g = O(h) then f + g = O(h)
- If f = Ω(h) and g = Ω(h) then f + g = Ω(h)
- If f = Θ(h) and g = O(h) then f + g = Θ(h)

Working with O-Ω-Θ Notation

Claim: for any a, and any b > 0, (n + a)ᵇ is Θ(nᵇ).

Upper bound: (n + a)ᵇ ≤ (2n)ᵇ for n ≥ |a|, and (2n)ᵇ = 2ᵇ·nᵇ = c·nᵇ for c = 2ᵇ, so (n + a)ᵇ is O(nᵇ).

Lower bound: (n + a)ᵇ ≥ (n/2)ᵇ for n ≥ 2|a| (even if a < 0), and (n/2)ᵇ = 2⁻ᵇ·nᵇ = c′·nᵇ for c′ = 2⁻ᵇ, so (n + a)ᵇ is Ω(nᵇ).
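A quick numeric sanity check of this claim (the helper name is illustrative): once n clears the threshold 2|a| from the argument above, the ratio (n + a)ᵇ / nᵇ should stay between 2⁻ᵇ and 2ᵇ.

```python
def ratio_in_bounds(a, b, n_max=1000):
    """Check numerically that 2**-b <= (n+a)**b / n**b <= 2**b
    for all n from the threshold 2|a| up to n_max."""
    start = max(1, 2 * abs(a))
    return all(2.0 ** -b <= (n + a) ** b / n ** b <= 2.0 ** b
               for n in range(start, n_max))
```

With a = −5, b = 3, the lower bound is tight at n = 10: (10 − 5)³/10³ = 1/8 = 2⁻³ exactly.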

Working with O-Ω-Θ Notation

Claim: for any a, b > 1, log_a n is Θ(log_b n).

log_a b = x means aˣ = b, i.e., a^(log_a b) = b. Then

(a^(log_a b))^(log_b n) = b^(log_b n) = n,

so (log_a b)(log_b n) = log_a n, i.e., c·log_b n = log_a n for the constant c = log_a b. Hence log_b n = Θ(log_a n) = Θ(log n).

Asymptotic Bounds for Some Common Functions

- Polynomials: a₀ + a₁n + … + a_d n^d is Θ(n^d) if a_d > 0
- Logarithms: O(log_a n) = O(log_b n) for any constants a, b > 1
- Logarithms vs. polynomials: for all x > 0, log n = O(nˣ); log grows slower than every polynomial

Polynomial vs. Exponential

For all r > 1 (no matter how small) and all d > 0 (no matter how large), n^d = O(rⁿ). For example, 1.01ⁿ eventually overtakes n¹⁰⁰. In short, every exponential grows faster than every polynomial!
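Even the extreme pairing 1.01ⁿ vs. n¹⁰⁰ can be checked by machine. A small sketch (the function name is illustrative) that compares in log space to avoid overflow, doubling n until the exponential wins:

```python
import math

def crossover(r=1.01, d=100):
    """Smallest power of two n with r**n > n**d, found by comparing
    n*log(r) against d*log(n) so neither quantity overflows."""
    n = 2
    while n * math.log(r) <= d * math.log(n):
        n *= 2
    return n
```

For r = 1.01 and d = 100, the crossover found this way lands between 10⁵ and 2 × 10⁵: large, but after that point the exponential dominates forever.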

Domination

f(n) is o(g(n)) iff lim_{n→∞} f(n)/g(n) = 0; that is, g(n) dominates f(n).

- If a ≤ b then n^a is O(n^b)
- If a < b then n^a is o(n^b)

Note: if f(n) is Θ(g(n)) then it cannot be o(g(n)).

Working with Little-o

n² = o(n³) [use algebra]:

lim_{n→∞} n²/n³ = lim_{n→∞} 1/n = 0

n³ = o(eⁿ) [use L'Hôpital's rule three times]:

lim_{n→∞} n³/eⁿ = lim_{n→∞} 3n²/eⁿ = lim_{n→∞} 6n/eⁿ = lim_{n→∞} 6/eⁿ = 0
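The limit behavior behind n³ = o(eⁿ) is easy to observe numerically: the ratios n³/eⁿ shrink rapidly toward 0 (a spot check of the limit, not a proof; the helper name is illustrative):

```python
import math

def ratios(f, n_values):
    """Successive values of f(n) / e**n; for f = o(e**n) they shrink toward 0."""
    return [f(n) / math.exp(n) for n in n_values]

rs = ratios(lambda n: n**3, [10, 20, 30, 40])
assert all(x > y for x, y in zip(rs, rs[1:]))  # strictly decreasing
assert rs[-1] < 1e-10                          # already negligible at n = 40
```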

The Complexity Class P: Polynomial Time

P: running time O(n^d) for some constant d (d is independent of the input size n).

Nice scaling property: there is a constant c such that doubling n increases the time only by a factor of c (e.g., c ≈ 2^d).

Contrast with exponential: for any constant c, there is a d such that going from n to n + d increases the time by a factor of more than c (e.g., c = 100 and d = 7 for 2ⁿ vs. 2^(n+7)).

[Figure: polynomial vs. exponential growth. Plot of 2²ⁿ, 2^(n/10), and 1000n²; both exponentials eventually dwarf the polynomial.]

Why It Matters

Exponential running times not only get very big, they do so abruptly, which likely yields erratic performance: fine on small instances, hopeless just beyond them.

Another View of Poly vs. Exp

Next year's computer will be 2x faster. If I can solve a problem of size n₀ today, how large a problem can I solve in the same time next year?

Complexity   Increase          E.g. (T = 10¹²)
O(n)         n₀ → 2n₀          10¹² → 2 × 10¹²
O(n²)        n₀ → √2 · n₀      10⁶ → 1.4 × 10⁶
O(n³)        n₀ → ∛2 · n₀      10⁴ → 1.25 × 10⁴
2^(n/10)     n₀ → n₀ + 10      400 → 410
2ⁿ           n₀ → n₀ + 1       40 → 41
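The table's entries follow from solving time(n) = 2 · time(n₀) for n. A small sketch (function names are illustrative): for running time n^k the new size is 2^(1/k) · n₀, and for running time 2^(n/s) it is n₀ + s.

```python
def new_size_poly(n0, k):
    """Largest solvable size next year if running time is n**k:
    (new_n)**k = 2 * n0**k  =>  new_n = 2**(1/k) * n0."""
    return 2 ** (1.0 / k) * n0

def new_size_exp(n0, step):
    """Largest solvable size next year if running time is 2**(n/step):
    2**(new_n/step) = 2 * 2**(n0/step)  =>  new_n = n0 + step."""
    return n0 + step

# Reproduce the table's example column (T = 10**12 steps/unit time):
assert round(new_size_poly(10**6, 2)) == 1414214   # ≈ 1.4 × 10⁶ for O(n²)
assert new_size_exp(400, 10) == 410                # 2**(n/10) row
assert new_size_exp(40, 1) == 41                   # 2**n row
```

Note the qualitative gap: polynomial times gain a multiplicative factor in solvable size, exponential times gain only an additive constant.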

Why Polynomial?

The point is not that n²⁰⁰⁰ is a nice time bound, or that the differences among n, 2n, and n² are negligible. Rather, simple theoretical tools may not easily capture such differences, whereas exponentials are qualitatively different from polynomials, and so more amenable to theoretical analysis.

- "My problem is in P" is a starting point for a more detailed analysis.
- "My problem is not in P" may suggest that you need to shift to a more tractable variant, or otherwise readjust expectations.

Complexity Summary

The typical initial goal of algorithm analysis is to find an asymptotic upper bound on the worst-case running time as a function of problem size. This is rarely the last word, but it often helps separate good algorithms from blatantly poor ones: concentrate on the good ones!