CS 61B More Asymptotic Analysis Spring 2018 Discussion 8: March 6, 2018


Here is a review of some formulas that you will find useful when doing asymptotic analysis.

- Σ_{i=1}^{N} i = 1 + 2 + 3 + 4 + ... + N = N(N+1)/2 = (N² + N)/2
- Σ_{i=0}^{N−1} 2^i = 1 + 2 + 4 + 8 + ... + 2^(N−1) = 2 · 2^(N−1) − 1 = 2^N − 1

1 Intuition

For the following recursive functions, give the worst case and best case running time in the appropriate O(·), Ω(·), or Θ(·) notation. The meta-strategy on this problem is to explore a rigorous framework for analyzing the running time of recursive procedures. Specifically, one can derive the running time by drawing the recursion tree and accounting for three pieces of information:

- The height of the tree.
- The branching factor of each node.
- The amount of work each node contributes relative to its input size.
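Putting these three pieces together (this summary equation is ours; the handout implies it but does not write it out), the total running time is obtained by summing over the layers of the recursion tree:

    total work = Σ_{i=0}^{height} (number of nodes in layer i) × (work per node in layer i)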

1.1 Give the running time in terms of N.

    public void andslam(int N) {
        if (N > 0) {
            for (int i = 0; i < N; i += 1) {
                System.out.println("datboi.jpg");
            }
            andslam(N / 2);
        }
    }

andslam(N) runs in time Θ(N), worst and best case. One potentially tricky portion is that Σ_{i=0}^{log N} (1/2)^i is at most 2, because the geometric sum is bounded by 2 as it goes to infinity! (That is "2" followed by an exclamation mark, not 2 factorial.)
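As a quick empirical check (this counting helper is ours, not part of the worksheet), we can tally the number of prints andslam performs and confirm that the total never exceeds 2N:

    public class AndslamCount {
        // Number of println calls andslam(N) performs:
        // N at this level, plus however many andslam(N / 2) performs.
        static long count(int N) {
            if (N <= 0) return 0;
            return N + count(N / 2);
        }

        public static void main(String[] args) {
            for (int N = 1; N <= 1_000_000; N *= 10) {
                long c = count(N);
                // N + N/2 + N/4 + ... <= 2N, so the ratio stays below 2.
                System.out.println("N = " + N + ": " + c + " prints (ratio " + (double) c / N + ")");
            }
        }
    }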

1.2 Give the running time for andwelcome(arr, 0, N) where N is the length of the input array arr.

    public static void andwelcome(int[] arr, int low, int high) {
        System.out.print("[ ");
        for (int i = low; i < high; i += 1) {
            System.out.print("loyal ");
        }
        System.out.println("]");
        if (high - low > 0) {
            double coin = Math.random();
            if (coin > 0.5) {
                andwelcome(arr, low, low + (high - low) / 2);
            } else {
                andwelcome(arr, low, low + (high - low) / 2);
                andwelcome(arr, low + (high - low) / 2, high);
            }
        }
    }

andwelcome(arr, 0, N) runs in time Θ(N log N) in the worst case and Θ(N) in the best case. The recurrence relation is different for each case. In the worst case you always flip the wrong side of the coin, resulting in a branching factor of 2. Because there is a branching factor of 2, there are 2^i nodes in the i-th layer. Meanwhile, the work you do per node is linear with respect to the size of its input, so in the i-th layer the work done per node is about N/2^i, for a total of about N per layer. In the best case you always flip the right side of the coin, giving a branching factor of 1. The analysis is then the same as the previous problem!
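For reference, the two recurrences implied by this analysis (the handout describes them in words rather than writing them out) are:

    worst case: T(N) = 2T(N/2) + cN, giving about log N layers of cN work each, so Θ(N log N)
    best case:  T(N) = T(N/2) + cN ≤ 2cN, so Θ(N)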

1.3 Give the running time in terms of N.

    public int tothe(int N) {
        if (N <= 1) {
            return N;
        }
        return tothe(N - 1) + tothe(N - 1);
    }

For tothe(N), the worst and best case are both Θ(2^N). Notice that at the i-th layer, there are 2^i nodes. Each node does a constant amount of work, so with the fact that Σ_{i=0}^{n} 2^i = 2^{n+1} − 1, we can derive the following.
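Filling in the arithmetic this points to: the tree has about N layers, layer i contains 2^i nodes, and each node does constant work c, so the total work is

    c · Σ_{i=0}^{N−1} 2^i = c(2^N − 1) ∈ Θ(2^N)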

1.4 Give the running time in terms of N.

    public static void spacejam(int N) {
        if (N <= 1) {
            return;
        }
        for (int i = 0; i < N; i += 1) {
            spacejam(N - 1);
        }
    }

For spacejam(N), the worst and best case are both O(N · N!). Now for the i-th layer, the number of nodes is N(N−1)⋯(N−i+1) = N!/(N−i)!, since the branching factor starts at N and decreases by 1 each layer. Actually calculating the sum is a bit tricky because there is a pesky (N−i)! term in the denominator. We can upper bound the sum by simply removing the denominator, but in the strictest sense we would then have a big-O bound instead of a big-Θ bound.
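To make that upper bound concrete (the handout leaves this step implicit): each node in layer i does about c(N−i) work for its loop, so layer i contributes

    (N!/(N−i)!) · c(N−i) ≤ cN!

using the fact that k/k! = 1/(k−1)! ≤ 1 for k ≥ 1. Summing over the roughly N layers gives at most cN · N! ∈ O(N · N!).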

2 Hey you watchu gon do

2.1 For each example below, there are two algorithms solving the same problem. Given the asymptotic runtimes for each, is one of the algorithms guaranteed to be faster? If so, which? And if neither is always faster, explain why.

(a) Algorithm 1: Θ(N), Algorithm 2: Θ(N²)

Algorithm 1. Θ(·) gives the tightest bounds, so the slowest Algorithm 1 could run is proportional to N, while the fastest Algorithm 2 could run is proportional to N².

(b) Algorithm 1: Ω(N), Algorithm 2: Ω(N²)

Neither. Ω(N) means that Algorithm 1's running time is lower bounded by N, but it provides no upper bound. Hence the bound on Algorithm 1 might not be tight: Algorithm 1 could also be in Ω(N²), i.e. lower bounded by N².

(c) Algorithm 1: O(N), Algorithm 2: O(N²)

Neither, by the same reasoning as part (b) but with upper bounds. An algorithm that is O(N²) could also be in O(1).

(d) Algorithm 1: Θ(N²), Algorithm 2: O(log N)

Algorithm 2. Algorithm 2 cannot run slower than log N (up to constants), while Algorithm 1 is constrained to run both at fastest and at slowest proportionally to N².

(e) Algorithm 1: O(N log N), Algorithm 2: Ω(N log N)

Neither. Algorithm 1 CAN be faster, but it is not guaranteed to be; it is only guaranteed to be asymptotically as fast as or faster than Algorithm 2.

Would your answers above change if we did not assume that N was very large (for example, if there was a maximum value for N, or if N was constant)?

It depends, because for fixed N, constants and lower-order terms may dominate the function we are trying to bound. For example, N² is asymptotically larger than 10000N, yet when N is less than 10000, 10000N is larger than N². This highlights the power of big-O notation: these lower-order terms don't affect the running time much as our input size grows very large! However, part of the definition of O(·), Ω(·), and Θ(·) is the limit to infinity (lim as N → ∞), so when working with asymptotic notation we must always assume large inputs.
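To see that crossover concretely, here is a small illustration (this snippet is ours, purely for demonstration):

    public class Crossover {
        public static void main(String[] args) {
            // Compare N^2 against 10000N around the crossover at N = 10000.
            for (long N : new long[]{100, 1_000, 10_000, 100_000}) {
                long quadratic = N * N;
                long linear = 10_000 * N;
                System.out.println("N = " + N + ": N^2 = " + quadratic + ", 10000N = " + linear);
            }
        }
    }

Below N = 10000 the "slower-growing" 10000N is actually the larger of the two; asymptotic notation deliberately ignores this region.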

3 Asymptotic Notation

3.1 Draw the running time graph of an algorithm that is O(√N) in the best case and Ω(N) in the worst case. Assume that the algorithm is also trivially Ω(1) in the best case and O(∞) in the worst case.

[The example solution graph is omitted in this transcription; it shades a cyan region between the best-case and worst-case bounds. Assume that the cyan region extends arbitrarily towards infinity.]

Ignoring constants, the algorithm's running time takes at most √N time (and trivially at least constant time) in the best case. The algorithm also takes at least N time (and trivially at most infinite time) in the worst case.

Extra: the following is a question from last week, now that you have properly learned about O(·), Ω(·), and Θ(·).

3.2 Are the statements in the right column true or false? If false, correct the asymptotic notation (Ω(·), Θ(·), O(·)). Be sure to give the tightest bound. Ω(·) is the opposite of O(·), i.e. f(n) ∈ Ω(g(n)) ⟺ g(n) ∈ O(f(n)).

    f(n) = 20501               g(n) = 1                   f(n) ∈ O(g(n))
    f(n) = n² + n              g(n) = 0.000001n³          f(n) ∈ Ω(g(n))
    f(n) = 2^(2n) + 1000       g(n) = 4^n + n^100         f(n) ∈ O(g(n))
    f(n) = log(n^100)          g(n) = n log n             f(n) ∈ Θ(g(n))
    f(n) = n log n + 3^n + n   g(n) = n² + n + log n      f(n) ∈ Ω(g(n))
    f(n) = n log n + n²        g(n) = log n + n²          f(n) ∈ Θ(g(n))
    f(n) = n log n             g(n) = (log n)²            f(n) ∈ O(g(n))

- True, although Θ(·) is a better bound.
- False, O(·). Even though n³ is strictly worse than n², n² is still in O(n³) because n² is always as good as or better than n³ and can never be worse.
- True, although Θ(·) is a better bound.
- False, O(·).
- True.
- True.
- False, Ω(·).

4 Extra (Fall 2015)

4.1 If you have time, try to answer this challenge question. For each part, answer true or false. If true, explain why; if false, provide a counterexample.

(a) If f(n) ∈ O(n²) and g(n) ∈ O(n) are positive-valued functions (that is, f(n), g(n) > 0 for all n), then f(n)/g(n) ∈ O(n).

Nope, this does not hold in general! Consider f(n) = n² and g(n) = 1/n. Readily we have f(n) ∈ O(n²) and g(n) ∈ O(n), but when divided they give us:

    f(n)/g(n) = n² / n^(−1) = n³ ∉ O(n)

(b) If f(n) ∈ Θ(n²) and g(n) ∈ Θ(n) are positive-valued functions, then f(n)/g(n) ∈ Θ(n).

This does hold in general! We can think about this in two cases:

- First we ask: when can the ratio f(n)/g(n) be larger than n? As f(n) is tightly bounded by n² (by Θ), this is only possible when g(n) is asymptotically smaller than n, because we are dividing n² by g(n) (this is what happened in part (a)). However, since g(n) is tightly bounded, and thus lower bounded, by n, this cannot happen.

- Next we ask: when can the ratio be smaller than n? Again, as f(n) is tightly bounded by n², this can only happen when g(n) is asymptotically bigger than n, since again we are dividing. But since g(n) is tightly bounded, and thus upper bounded, by n, this too cannot happen.

So what we note here is that f(n)/g(n) is both upper and lower bounded by n; hence it is in Θ(n). We can also give a rigorous proof of part (b) from the definitions provided in class.

Theorem 1. If f(n) ∈ Θ(n²) and g(n) ∈ Θ(n) are positive-valued functions, then f(n)/g(n) ∈ Θ(n).

Proof. Given that f ∈ Θ(n²) is positive, by definition there exist constants k₀, k₀′ > 0 and a threshold N such that for all n > N the following holds:

    k₀n² ≤ f(n) ≤ k₀′n²

Similarly, g ∈ Θ(n) implies there exist k₁, k₁′ > 0 such that

    k₁n ≤ g(n) ≤ k₁′n

Now consider f(n)/g(n):

    f(n)/g(n) ≤ (k₀′n²)/(k₁n) = (k₀′/k₁) · n ∈ O(n)
    f(n)/g(n) ≥ (k₀n²)/(k₁′n) = (k₀/k₁′) · n ∈ Ω(n)

As f(n)/g(n) is in both O(n) and Ω(n), it is in Θ(n). ∎
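As a concrete sanity check of part (b) (this example is ours, not from the handout): take f(n) = 3n² + n ∈ Θ(n²) and g(n) = 2n ∈ Θ(n). Then f(n)/g(n) = (3n² + n)/(2n) = (3/2)n + 1/2, which is indeed in Θ(n).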