CS 61B Asymptotic Analysis Fall 2017


1 More Running Time

Give the worst case and best case running time in Θ(·) notation in terms of M and N.

(a) Assume that slam() ∈ Θ(1) and returns a boolean.

    public void comeon() {
        int j = 0;
        for (int i = 0; i < N; i += 1) {
            for (; j < M; j += 1) {
                if (slam(i, j))
                    break;
            }
        }
        for (int k = 0; k < 1000 * N; k += 1) {
            System.out.println("space jam");
        }
    }

(a) For comeon() the worst case is Θ(M + N) and the best case is Θ(N). To see this, note that j is declared outside of the for loops, so it is never reset between iterations of the outer loop. In the best case, slam(i, j) always returns true and the inner loop breaks immediately, leaving only the Θ(N) outer loop and the Θ(N) print loop. In the worst case, where slam(i, j) always returns false, j can only be incremented to at most M, so the inner loop contributes Θ(M) work in total across the whole run rather than Θ(NM).

CS 61B, Fall 2017, Asymptotic Analysis 1
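To see why the inner loop contributes M total steps rather than N * M, here is a hedged instrumented sketch (the class, method, and step counting are mine, not the worksheet's) that assumes the worst case where slam() always returns false and counts one unit of work per loop step:

```java
public class ComeOnCount {
    // Counts loop steps in comeon() under the worst-case assumption that
    // slam() always returns false, so the inner loop never breaks early.
    static long steps(int N, int M) {
        long steps = 0;
        int j = 0;                          // shared across outer iterations, never reset
        for (int i = 0; i < N; i += 1) {
            steps += 1;                     // constant work per outer iteration
            for (; j < M; j += 1) {
                steps += 1;                 // at most M of these across ALL outer iterations
            }
        }
        for (int k = 0; k < 1000 * N; k += 1) {
            steps += 1;                     // 1000 * N prints: a constant factor, still Theta(N)
        }
        return steps;                       // N + M + 1000 * N total
    }

    public static void main(String[] args) {
        System.out.println(steps(10, 50));  // 10 + 50 + 10000 = 10060, linear in M + N
    }
}
```

Even with M much larger than N, the total stays N + M + 1000N, confirming that the two loops add rather than multiply.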

2 Recursive Running Time

For the following recursive functions, give the worst case and best case running time in Θ(·) notation. This exercise targets understanding of how branching and input size affect the running time of recursive functions. I would explain each problem by writing out the recurrence relation, drawing the call tree, and asking: (1) what is the branching factor, (2) how much work does each node contribute, and (3) what is the height of the tree?

(a) Give the running time in terms of N.

    public void andslam(int N) {
        if (N > 0) {
            for (int i = 0; i < N; i += 1) {
                System.out.println("bigballer.jpg");
            }
            andslam(N / 2);
        }
    }

(b) Give the running time for andwelcome(arr, 0, N) where N is the length of the input array arr.

    public static void andwelcome(int[] arr, int low, int high) {
        System.out.print("[ ");
        for (int i = low; i < high; i += 1) {
            System.out.print("loyal ");
        }
        System.out.println("]");
        if (high - low > 0) {
            double coin = Math.random();
            if (coin > 0.5) {
                andwelcome(arr, low, low + (high - low) / 2);
            } else {
                andwelcome(arr, low, low + (high - low) / 2);
                andwelcome(arr, low + (high - low) / 2, high);
            }
        }
    }

(c) Give the running time in terms of N.

    public int tothe(int N) {
        if (N <= 1) {
            return N;
        }
        return tothe(N - 1) + tothe(N - 1) + tothe(N - 1);
    }
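The three questions above can also be checked numerically. As a hedged sketch for part (a) (the class and method names are mine), the work andslam(N) does is N prints at the top level plus a recursive call on N / 2, so the total is the geometric sum N + N/2 + N/4 + ... :

```java
public class AndSlamWork {
    // Total print count of andslam(N): N at this level, then recurse on N / 2.
    // This is the recurrence W(N) = W(N / 2) + N, which sums to less than 2N.
    static long work(int N) {
        if (N <= 0) return 0;
        return N + work(N / 2);
    }

    public static void main(String[] args) {
        System.out.println(work(64));  // 64 + 32 + 16 + 8 + 4 + 2 + 1 = 127 < 2 * 64
    }
}
```

The sum staying below 2N for every power of two is the numeric face of the "work halves every level" argument, which is why the answer is linear rather than N log N.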

(d) Give the running time in terms of N.

    public static void spacejam(int N) {
        if (N == 1) {
            return;
        }
        for (int i = 0; i < N; i += 1) {
            spacejam(N - 1);
        }
    }

(a) andslam(N) runs in Θ(N) time in both the worst and best case. The recurrence relation is T(N) = T(N/2) + O(N); the work per level forms a geometric series N + N/2 + N/4 + ..., which sums to at most 2N.

(b) andwelcome(arr, 0, N) runs in Θ(N log N) time in the worst case and Θ(N) in the best case. The recurrence relation is different for each case. In the worst case you always flip the wrong side of the coin, resulting in a branching factor of 2; the recurrence relation is T(N) = 2T(N/2) + O(N). In the best case you always flip the right side of the coin, giving a branching factor of 1; the recurrence relation is T(N) = T(N/2) + O(N).

(c) For tothe(N) the worst and best case are both Θ(3^N). The recurrence relation is T(N) = 3T(N − 1) + O(1). Draw the call tree out and notice that each node contributes constant work. The running time therefore scales with the number of nodes, and a complete ternary tree of height N has on the order of 3^N nodes.

(d) For spacejam(N) the worst and best case are both Θ(N!). The recurrence relation is T(N) = N · T(N − 1) + O(1). To explain, I would again draw the tree out and make an argument based on the number of nodes.
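The node-counting argument for part (d) can be made concrete. A hedged sketch (the counting helper is mine, not the worksheet's): count every invocation spacejam(N) makes, which satisfies C(1) = 1 and C(N) = 1 + N · C(N − 1), the call-count version of T(N) = N · T(N − 1) + O(1):

```java
public class SpaceJamCalls {
    // Total invocations made by spacejam(N), counting the initial call.
    // Each call of size N makes N recursive calls of size N - 1, so the
    // count grows at least as fast as N!.
    static long calls(int N) {
        if (N == 1) return 1;
        long total = 1;                // this invocation
        for (int i = 0; i < N; i += 1) {
            total += calls(N - 1);     // N subtrees of size N - 1
        }
        return total;
    }

    public static void main(String[] args) {
        // C(4) = 1 + 4 * (1 + 3 * (1 + 2 * 1)) = 1 + 4 * 10 = 41
        System.out.println(calls(4));
    }
}
```

Note that 41 already exceeds 4! = 24; the call count is N! times a convergent series, which is why the bound is Θ(N!).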

3 Hey you watchu gon do?

For each example below, there are two algorithms solving the same problem. Given the asymptotic runtimes for each, is one of the algorithms guaranteed to be faster? If so, which? If neither is always faster, explain why. Assume the algorithms have very large input (so N is very large).

(a) Algorithm 1: Θ(N), Algorithm 2: Θ(N^2)
(b) Algorithm 1: Ω(N), Algorithm 2: Ω(N^2)
(c) Algorithm 1: O(N), Algorithm 2: O(N^2)
(d) Algorithm 1: Θ(N^2), Algorithm 2: O(log N)
(e) Algorithm 1: O(N log N), Algorithm 2: Ω(N log N)

(a) Algorithm 1: Θ(N). This one is straightforward, since Θ gives tight bounds on both algorithms.
(b) Neither: a function in Ω(N) could also be in Ω(N^2), since Ω only gives a lower bound.
(c) Neither: a function in O(N^2) could also be in O(1), since O only gives an upper bound.
(d) Algorithm 2: it cannot run asymptotically slower than log N, while Algorithm 1 is constrained to Θ(N^2) in both the best and worst case.
(e) Neither: Algorithm 1 CAN be faster, but is not guaranteed to be. It is only guaranteed to be "as fast as or faster than" N log N, while Algorithm 2 is "as slow as or slower than" N log N.

Why did we need to assume that N was large? Asymptotic bounds only take effect as N grows, because constant factors can make a function with a smaller order of growth exceed one with a larger order of growth on small inputs. For example, take the functions 1000n and n^2. n^2 is asymptotically larger than 1000n, but for small n, 1000n is larger than n^2.

4 Big Ballin Bounds

1. Prove the following bounds by finding constants M > 0 and N > 0, with M ∈ ℝ and N ∈ ℕ, such that f and g satisfy the relationship for all n ≥ N.

(a) f ∈ O(g) for f = 2n, g = n^2
(b) f ∈ Ω(g) for f = 0.1n, g = 40
(c) f ∈ Θ(g) for f = log(n), g = log(n^a), for a > 0

(a) Let M = 2, N = 1. Then we see that for all n ≥ N, f(n) ≤ M · g(n), since 2n ≤ 2n^2 whenever n ≥ 1.
(b) Let M = 1/400, N = 1. Then we see that for all n ≥ N, f(n) ≥ M · g(n), since 0.1n ≥ 40/400 = 0.1 whenever n ≥ 1. Alternatively, M = 1 and N = 400 also satisfy the bound.

CS 61B, Fall 2017, Asymptotic Analysis
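The 1000n versus n^2 example has a concrete crossover point, which a short sketch can locate (the class and method names are illustrative, not from the worksheet):

```java
public class Crossover {
    // Smallest n for which n^2 exceeds 1000n. Below this point the
    // "asymptotically smaller" function 1000n is actually the larger one.
    static long crossover() {
        long n = 1;
        while (n * n <= 1000 * n) {
            n += 1;
        }
        return n;
    }

    public static void main(String[] args) {
        System.out.println(crossover());  // 1001: n^2 only wins once n > 1000
    }
}
```

So for every input up to n = 1000 the quadratic function is the cheaper one, and the asymptotic ordering only describes behavior past the crossover.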

(c) For M = 1/a, N = 2, we have log(n) = (1/a) · log(n^a). It should be noted that this is an equality, not just a one-sided bound, which proves that these functions are asymptotically the same.

2. Answer the following claims with true or false. If false, provide a counterexample.

(a) If f(n) ∈ O(g(n)), then 500 · f(n) ∈ O(g(n)).
(b) If f(n) ∈ Θ(g(n)), then 2^f(n) ∈ Θ(2^g(n)).

(a) True: the factor of 500 can be absorbed into the constant M in the definition of O.
(b) False. Consider f(n) = n and g(n) = 2n. Then 2^f(n) = 2^n, but 2^g(n) = 2^(2n) = (2^2)^n = 4^n, and 2^n is not within any constant factor of 4^n.
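The counterexample in 2(b) comes down to the ratio 2^(g(n)) / 2^(f(n)) = 4^n / 2^n = 2^n being unbounded; a small sketch (names are mine) makes the gap visible for the first few n:

```java
public class ExpGap {
    // Ratio 2^(g(n)) / 2^(f(n)) for f(n) = n and g(n) = 2n.
    // Equals 2^n, so no constant M can satisfy 2^(g(n)) <= M * 2^(f(n)).
    static long ratio(int n) {
        long f = 1L << n;        // 2^n
        long g = 1L << (2 * n);  // 2^(2n) = 4^n
        return g / f;
    }

    public static void main(String[] args) {
        for (int n = 1; n <= 5; n += 1) {
            System.out.println(ratio(n));  // 2, 4, 8, 16, 32: doubles every step
        }
    }
}
```

Since the ratio itself grows without bound, 2^n ∈ Θ(4^n) fails, even though n ∈ Θ(2n) holds: Θ survives multiplicative constants but not constants in the exponent.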