More on Asymptotic Running Time Analysis CMPSC 122


I. Asymptotic Running Times and Dominance

In your homework, you looked at some graphs. What did you observe?

When analyzing the running time of algorithms, we really care about what happens in the long run, or what happens when n is large. Thus:

- We are not terribly concerned with what happens when ____________ (input size is small)
- We are concerned with how quickly ____________ (running time functions grow)
- Details like ____________ don't matter when n is large (coefficients, lower-order terms)

We call this kind of analysis asymptotic analysis or, sometimes, asymptotics.

We say that a function f(x) dominates a function g(x) when ____________

Examples:
1. Does 23n^2 dominate n + 7?
2. How about n^3/5 vs. 2n^3 + 7?
3. How about 450 * 3^n vs. 3^n?
4. How about 150^n vs. 20^n?

(Please note that we are being a little informal with some of this analysis in this course.)
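If you want to see dominance numerically rather than from a graph, here is a small Python sketch (not part of the handout; the helper ratio_table is made up for illustration). It tabulates the ratio f(n)/g(n) for growing n: when f dominates g, the ratio grows without bound, and when neither dominates, it levels off near a constant.

    # Hypothetical helper, for illustration only: compare two running-time
    # functions by tabulating their ratio at increasingly large n.
    def ratio_table(f, g, sizes):
        for n in sizes:
            print(f"n = {n:>7}: f(n)/g(n) = {f(n) / g(n):.3f}")

    sizes = [10, 100, 1000, 10000, 100000]

    # Example 1: 23n^2 vs. n + 7 -- the ratio keeps growing, so 23n^2 dominates.
    ratio_table(lambda n: 23 * n**2, lambda n: n + 7, sizes)

    # Example 2: n^3/5 vs. 2n^3 + 7 -- the ratio levels off near 0.1,
    # so neither function dominates the other.
    ratio_table(lambda n: n**3 / 5, lambda n: 2 * n**3 + 7, sizes)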

II. Key Families of Functions

It turns out that there are a few families of functions that tend to be important. Ranked in order of dominance, they are:

1. n^n
2. n!
3. c^n for all constants c s.t. c > 1, where if a > b, a^n dominates b^n
4. n^c for all constants c s.t. c ≥ 2, where if a > b, n^a dominates n^b
5. n lg n
6. n
7. lg n
8. c

Two notes:
- Since all constants grow at the same rate, anything that you might think to call Θ(c) gets simplified to Θ(1).
- In Θ-notation, the functions within each of families 1-2 and 5-8 all reduce to the same thing. Different constant choices in families 3 and 4 yield different Θ-notation expressions.

This list leaves some matters vague, but it covers most of the common functions that arise. You should convince yourself of the correctness of this ranking by plotting several of these kinds of functions on the same axes.

Problem: Simplify each function of n to Θ-notation and then rank the functions by dominance.

    3n!        n!        12 + n^3        17 lg n + n        3 + n lg n
    100^n + n        200000 + 50n        n lg n + n        n^2 + 7n

Problem: Simplify each function of n to Θ-notation and then rank the functions by dominance.

    n^35 + lg n        lg 3n + 3^n        37        n(n + 1)        2n(1 + 3 lg n)

III. O-Notation

Another kind of asymptotic notation is O-notation. While O-notation was invented before Θ-notation, it is less precise. So, let's first give a more formal definition for Θ-notation:

We say a function f(x) is Θ(g(x)) when ____________ (for all sufficiently large n).

Now, let's give a formal definition for O-notation:

We say a function f(x) is O(g(x)) when ____________
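Once the two blanks above are filled in with the usual constants, a quick numeric check can make the Θ definition concrete. The sketch below is not from the handout; the function f and the candidate constants c1, c2, and n0 are assumptions chosen just for illustration.

    # Illustrative check (with assumed constants) that f(n) = 2n^2 + 7n + 6 is Θ(n^2):
    # verify that c1 * g(n) <= f(n) <= c2 * g(n) for every n we sample past n0.
    def f(n):
        return 2 * n**2 + 7 * n + 6

    def g(n):
        return n**2

    c1, c2 = 2, 3    # candidate constants (assumptions to verify, not derived here)
    n0 = 10          # the "sufficiently large" cutoff we start checking from

    ok = all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 100000))
    print(ok)        # True: the sandwich holds at every n we sampled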

This asymptotic notation is said to put bounds on the functions. In this sense, Θ gives an asymptotically tight bound, while O gives an asymptotic upper bound. Or, analogously, ____________ (= vs. <=).

Example: 2n^2 is ____________ and also 2n^2 is ____________. But we could also say 2n^2 is ____________.

Example: n! is O(______).

Example: 5n lg n is O(______), but not O(______).

Example: n^2 + 7n + 6 is O(______), but not O(______).

In some sense, we could think of a Θ-notation bound as being the best O-notation bound possible, and sometimes you'll see the expression "best big-O bound" as a result.

IV. Example Code Fragments

Having gone a bit further, let's look at some code examples. Once again, we will, in each case, count the exact number of elementary operations that are done, but we will then also simplify to Θ-notation. (Intentionally, the first few are review, and the numbering picks up where we left off before.)

Example 1:
    for i = 1 to n
        a = b + c - i

Example 8:
    for i = 1 to n
        a = b + c - i
    for i = 1 to n
        a = b + c - i

Example 9:
    for i = 1 to n
        for j = 1 to n
            a = b + c + max(i, j)
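To connect the exact operation counts to the Θ simplification, here is a small Python sketch (not from the handout) that mimics Example 9 and counts how many times the innermost assignment runs; the count comes out to exactly n^2 for each n, which simplifies to Θ(n^2).

    def count_example_9(n):
        """Count how many times the inner assignment of Example 9 executes."""
        count = 0
        for i in range(1, n + 1):          # for i = 1 to n
            for j in range(1, n + 1):      # for j = 1 to n
                count += 1                 # stands in for: a = b + c + max(i, j)
        return count

    for n in (10, 100, 1000):
        print(n, count_example_9(n), n**2)   # the two counts agree: n^2 executions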

Example 10:
    for i = 1 to 2n
        for j = 1 to n
            a = b + c + max(i, j)

Example 11:
    for i = 0 to n/2
        for j = n/2 downto 1
            for k = 1 to n
                a = ib + jc + kd

Example 12:
    for i = 0 to n/2
        for j = n/2 downto 1
            for k = 1 to n
                a = ib + jc + kd
                z = z + 2^a

Example 13:
    for i = 0 to n/2
        for j = 1 to n
            a = j + 3a
        for j = n downto 1
            b = 50 + j - a

Example 14:
    for i = 1 to n
        for j = 1 to n/3        // 3 elementary operations
        for j = n/3 to 2n/3     // n elementary operations
        for j = 2n/3 to n       // 1 elementary operation
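The same counting trick works for the triple loops in Examples 11 and 12. The sketch below (again, not part of the handout) tallies the innermost executions of an Example 11-style fragment; the totals track n^3/4, so the fragment simplifies to Θ(n^3).

    def count_example_11(n):
        """Count innermost executions of an Example 11-style triple loop."""
        count = 0
        for i in range(0, n // 2 + 1):        # for i = 0 to n/2
            for j in range(n // 2, 0, -1):    # for j = n/2 downto 1
                for k in range(1, n + 1):     # for k = 1 to n
                    count += 1                # stands in for: a = ib + jc + kd
        return count

    for n in (10, 50, 100):
        print(n, count_example_11(n), round(n**3 / 4))   # counts track n^3/4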

Problem: Rank the running time functions for all of the above examples from fastest growing to slowest growing.

V. Back to Binary Search

When we started discussing analysis, we looked at binary search under the assumption that n was a power of 2. What if it's not?

Backing up, what did we conclude was the worst-case running time for a binary search when n is a power of 2?

Let's look at a few not-as-nice input sizes, again, in the worst case:

    33        34        37        63        150        700

How does this fit with asymptotic notation?
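One way to answer the question above is to measure the worst case directly. The sketch below is not from the handout: it runs an ordinary iterative binary search on a sorted array of each listed size, tries every successful and unsuccessful target, and reports the maximum number of loop iterations alongside floor(lg n) + 1 for comparison.

    import math

    def binary_search_iterations(a, target):
        """Return how many loop iterations a standard binary search performs."""
        lo, hi, iterations = 0, len(a) - 1, 0
        while lo <= hi:
            iterations += 1
            mid = (lo + hi) // 2
            if a[mid] == target:
                break
            elif a[mid] < target:
                lo = mid + 1
            else:
                hi = mid - 1
        return iterations

    def worst_case_iterations(n):
        a = list(range(n))
        # Try every element plus every "gap" value to cover all search outcomes.
        targets = list(a) + [x - 0.5 for x in range(n + 1)]
        return max(binary_search_iterations(a, t) for t in targets)

    for n in (33, 34, 37, 63, 150, 700):
        print(n, worst_case_iterations(n), math.floor(math.log2(n)) + 1)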

VI. How This Fits in the Course and Curriculum

Problem: Presumably, you've written code to traverse or walk an array, i.e., print out all of the elements of that array. Suppose an array has size n. What is the worst-case running time to walk the array, and why?

In the remainder of the course, we'll look at more advanced data structures, algorithms on those structures, and sorting algorithms. In all cases, we'll make a point of analyzing their asymptotic running times. Your final project will carry the analysis of running times of sorts further and make connections between theory and practice.

The code fragments we have handled here are ones where the math is "nice." In 360, we'll look at sequences and counting tools that will help you analyze the running time of "harder" problems. Finally, as 465 is the main analysis course in the curriculum, we'll look at more formal definitions of O- and Θ-notation there, as well as a few other kinds of asymptotic notation. We'll also encounter many more advanced analysis techniques, and a running theme will be to analyze, formally, the running time of many new algorithms.

Prepared by D. Hogan for PSU CMPSC 122