
Homework 2 (Asymptotic analysis)
Due: 1:10pm, Friday, September 7

The first page of your homework submission must be a cover sheet answering the following questions. Do not leave it until the last minute; it's fine to fill out the cover sheet before you have completely finished the assignment. Assignments submitted without a cover sheet, or with a cover sheet obviously dashed off without much thought at the last minute, will not be graded.

- How many hours would you estimate that you spent on this assignment?
- Explain (in one or two sentences) one thing you learned through doing this assignment.
- What is one thing you think you need to review or study more? What do you plan to do about it?

Question 1 (K&T 2.2). Suppose you have algorithms with the seven running times listed below. (Assume these are the exact number of operations performed as a function of the input size n.) Suppose you have a computer that can perform 10^10 operations per second, and you need to compute a result in at most an hour of computation. For each of the algorithms, what is the largest input size n for which you would be able to get the result within an hour?

1. n^2
2. n^3
3. 100n^2
4. n log_2 n
5. 2^n
6. 2^(2n)
7. n!

In addition to O, Θ, and Ω, we will also sometimes use the notation o(g(n)) (little-o) to mean that a function is O(g(n)) but not Θ(g(n)). So, if lim_{n→∞} T(n)/g(n) = 0, then T(n) is o(g(n)). Intuitively, if big-O is like "less than or equal to," little-o is like "less than": if f(n) is o(g(n)), then f grows strictly more slowly than g.

Question 2 (Some asymptotic properties).

(a) Show that n^j is o(n^k) whenever j < k.

(b) Prove that log_a n is Θ(log_b n) for any positive integer bases a, b > 1. Conclude that we are justified in writing simply Θ(log n) without caring about the base of the logarithm. (Hint: recall, or look up, the change-of-base formula for logarithms.)

(c) Prove that n^k is o(b^n) for any positive integer k and any real number b > 1. For example, n^295 is o(1.0001^n). In the long run, any polynomial function grows more slowly than any exponential function! (Hint: use L'Hôpital's rule k times. Recall that the derivative of b^n with respect to n is b^n ln b.)
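The limit definition of little-o above is easy to sanity-check numerically before writing a proof. A minimal sketch (our own illustration, not part of the assignment; T and g are names chosen here) showing that the ratio n^2 / n^3 shrinks toward 0:

```python
# Numeric check of the little-o limit definition: T(n) is o(g(n))
# when T(n)/g(n) -> 0 as n grows. Here T(n) = n**2 and g(n) = n**3,
# so each ratio works out to exactly 1/n.
T = lambda n: n ** 2
g = lambda n: n ** 3

ratios = [T(10 ** k) / g(10 ** k) for k in range(1, 5)]
print(ratios)  # [0.1, 0.01, 0.001, 0.0001] -- heading to 0
```

A shrinking table of ratios is evidence, not a proof; the exercises below ask for the actual limit arguments.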

Question 3 (K&T 2.3). Take the following list of functions and arrange them in ascending order of asymptotic growth rate. That is, if function g(n) immediately follows function f(n) in your list, then it should be the case that f(n) is O(g(n)). Please prove/justify your claims.

1. f_1(n) = n^2.5
2. f_2(n) = √(2n)
3. f_3(n) = n + 10
4. f_4(n) = 10^n
5. f_5(n) = 100^n
6. f_6(n) = n^2 log n

Question 4. Characterize the asymptotic behavior of each of the following in terms of n, using Θ. Give a brief justification for each answer. For items asking for the time needed to perform some operation or solve some problem, you should describe the worst-case running time of the best possible algorithm. Please come ask for help if you are unsure about any of these!

(a) 1 + 2 + 3 + 4 + ... + n
(b) Average number of array lookups needed to find a given element in a sorted array of length n.
(c) Number of two-element subsets of a set of size n.
(d) 1 + 2 + 4 + 8 + ... + 2^n
(e) Total number of nodes in a balanced binary tree with n leaves.
(f) Number of edges in a graph with n nodes, where every node is connected to every other node.
(g) Number of different ways to arrange n people in a line.
(h) Average number of steps needed to find a given element in a linked list of length n.
(i) Biggest integer that can be represented with n bits.
(j) Worst-case number of swaps needed to sort an array of length n if you are only allowed to swap adjacent elements.
(k) Worst-case number of steps needed to check whether two lists, each containing n integers (not necessarily sorted), have any element in common.

(l) Number of array lookups performed by this Python code:

    for i in range(0, n):
        for j in range(0, i):
            sum += array[i][j]

(m) Number of addition operations performed by this Java code:

    for (int i = 0; i < n; i += 3) {
        sum = sum + i;
    }

(n) Total number of calls to println performed by this Java code:

    for (int i = 1; i < n; i *= 2) {
        for (int j = 0; j < 20; j++) {
            System.out.println(i + j);
        }
    }

(o) Total number of calls to print performed by this Python code:

    def foo(n):
        print(n)
        if n > 0:
            foo(n-1)
            foo(n-1)

Question 5 (K&T 2.8). You're doing some stress-testing on various models of glass jars to determine the height from which they can be dropped and still not break. The setup for this experiment, on a particular type of jar, is as follows. You have a ladder with n rungs, and you want to find the highest rung from which you can drop a copy of the jar and not have it break. We call this the highest safe rung.

It might be natural to try binary search: drop a jar from the middle rung, see if it breaks, and then recursively try from rung n/4 or 3n/4 depending on the outcome. But this has the drawback that you could break a lot of jars in finding the answer.

If your primary goal were to conserve jars, on the other hand, you could try the following strategy. Start by dropping a jar from the first rung, then the second rung, and so forth, climbing one higher each time until the jar breaks. In this way, you only need a single jar; at the moment it breaks, you have the correct answer, but you may have to drop it n times (rather than log n times as in the binary search solution).

So here is the trade-off: it seems you can perform fewer drops if you're willing to break more jars. To understand better how this trade-off works at

a quantitative level, let's consider how to run this experiment given a fixed budget of k ≥ 1 jars. In other words, you have to determine the correct answer (the highest safe rung) and can use at most k jars in doing so.

(a) Suppose you are given a budget of k = 2 jars. Describe a strategy for finding the highest safe rung that requires you to drop a jar at most f(n) times, for some function f(n) that grows slower than linearly. (In other words, f(n) should be o(n); that is, lim_{n→∞} f(n)/n = 0.)

(b) Now suppose you have a budget of k > 2 jars, for some given k. Describe a strategy for finding the highest safe rung using at most k jars. If f_k(n) denotes the number of times you need to drop a jar according to your strategy, then the functions f_1, f_2, f_3, ... should have the property that each grows asymptotically slower than the previous one: lim_{n→∞} f_k(n)/f_{k-1}(n) = 0 for each k.
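If you want to experiment with drop counts empirically before analyzing your strategy for Question 5, a small simulation harness can help. This is our own sketch (the function names and the budget-tracking scheme are assumptions, not part of the assignment); it plays the one-jar linear strategy from the problem statement against a hidden highest safe rung and counts drops and broken jars:

```python
# Hypothetical harness for jar-dropping strategies: `drop` reports
# whether a jar survives a fall from a given rung, while tracking
# total drops and broken jars against the jar budget.
def make_dropper(highest_safe, budget):
    state = {"drops": 0, "broken": 0}

    def drop(rung):
        if state["broken"] >= budget:
            raise RuntimeError("out of jars")
        state["drops"] += 1
        survived = rung <= highest_safe
        if not survived:
            state["broken"] += 1
        return survived

    return drop, state

def linear_strategy(n, drop):
    # The one-jar strategy from the problem: climb one rung at a time
    # until a jar breaks; the last surviving rung is the answer.
    rung = 0
    while rung < n and drop(rung + 1):
        rung += 1
    return rung

drop, state = make_dropper(highest_safe=7, budget=1)
print(linear_strategy(20, drop), state)  # 7 {'drops': 8, 'broken': 1}
```

Plugging your own strategy functions for parts (a) and (b) into the same harness lets you plot drop counts against n and check that they grow like the f_k(n) you claim.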