
Fundamental Algorithms
Homework #1
Set on June 25, 2009; due on July 2, 2009

Problem 1. [15 pts] Analyze the worst-case time complexity of the following algorithms, and give tight bounds using the Theta notation. You need to show the key steps involved in the analyses.

a)
                                              cost    times
    procedure matrixvector(n: integer)
    var i, j: integer;
    for i ← 1 to n do                         c1      n
        B[i] ← 0;                             1       n
        C[i] ← 0;                             1       n
        for j ← 1 to i do                     c2      ∑_{i=1}^{n} i
            B[i] ← B[i] + A[i, j];            c3      ∑_{i=1}^{n} i
        for j ← n downto i + 1 do             c4      ∑_{i=1}^{n} (n − i)
            C[i] ← C[i] + A[i, j];            c5      ∑_{i=1}^{n} (n − i)

The first for loop runs n times. The second for loop runs ∑_{i=1}^{n} i = n(n + 1)/2 times in total, while the third for loop runs ∑_{i=1}^{n} (n − i) = n(n − 1)/2 times. The total running time is therefore

    c1·n + n + n + (c2 + c3)·n(n + 1)/2 + (c4 + c5)·n(n − 1)/2.   [3 pts]

The whole expression can be rewritten as a1·n + a2·n², which is Θ(n²). [2 pts]

b)
    Algorithm MysteryPrint(A[1 ... n])
    if n = 1 then
        print A[1];
    else
        print A[n];
        call MysteryPrint(A[1 ... n − 1]);
        call MysteryPrint(A[2 ... n]);

Let T(n) denote the time complexity of the above algorithm. The recursion calls itself twice on inputs of n − 1 elements (MysteryPrint(A[1 ... n − 1]) and MysteryPrint(A[2 ... n]):

the number of elements in the array is reduced by 1 each time such a call is made). The if test and print A[n] are each executed once per call. We can express this relationship as:

    T(n) = 2T(n − 1) + c,  if n ≥ 2
    T(1) = c                           [1 pt]

Solving the recurrence, we can expand it as:

    T(n) = 2T(n − 1) + c
         = 2(2T(n − 2) + c) + c = 2²·T(n − 2) + 2¹·c + c
         = 2²·(2T(n − 3) + c) + 2¹·c + c = 2³·T(n − 3) + 2²·c + 2¹·c + c
         = ...
         = 2^{n−1}·T(1) + 2^{n−2}·c + 2^{n−3}·c + ... + 2¹·c + c
         = c · ∑_{i=0}^{n−1} 2^i = c·(2^n − 1)         [2 pts]

Hence, the time complexity of this algorithm is Θ(2^n). [2 pts]

c)
                                          cost    times
    procedure whileloop(n: integer)
    var i, count: integer;
    count ← 0;                            c1      1
    i ← 1;                                c2      1
    while i < n do                        c3      log₃ n
        i ← 3i;                           c4      log₃ n
        count ← count + 1;                c5      log₃ n

The important thing to understand in this algorithm is that the while loop runs only k times, where 3^k < n; this gives k < log₃ n. Hence, the running time of this algorithm is

    c1 + c2 + (c3 + c4 + c5) · log₃ n.   [3 pts]

Hence, the time complexity is Θ(log₃ n), or simply Θ(log n), since the base of the logarithm is irrelevant in asymptotic analysis. [2 pts]

Problem 2. [10 pts] [Levitin, p. 60] Question 7 (a, c).

a) If t(n) ∈ O(g(n)), then g(n) ∈ Ω(t(n)). [5 pts]

Proof: Since t(n) ∈ O(g(n)), there exist some positive constant c1 and some nonnegative integer n1 such that

    t(n) ≤ c1 · g(n)  for all n ≥ n1.

Then we have

    g(n) ≥ (1/c1) · t(n)  for all n ≥ n1.

Hence, g(n) ∈ Ω(t(n)) with constant c = 1/c1 and n0 = n1. Q.E.D.

c) Θ(g(n)) = O(g(n)) ∩ Ω(g(n)). [5 pts]

Proof: For any t(n),

    t(n) ∈ O(g(n)) ∩ Ω(g(n))
    iff  t(n) ∈ O(g(n)) and t(n) ∈ Ω(g(n))
    iff  there exist positive constants c1, c2 and nonnegative integers n1, n2 such that
         t(n) ≤ c1 · g(n) for all n ≥ n1, and t(n) ≥ c2 · g(n) for all n ≥ n2
    iff  c2 · g(n) ≤ t(n) ≤ c1 · g(n) for all n ≥ n0, where n0 = max{n1, n2}
    iff  t(n) ∈ Θ(g(n)).

Q.E.D.

Problem 3. [10 pts] Insertion sort can be expressed as a recursive procedure as follows. Given A[1 ... n], we recursively sort A[1 ... n − 1] and then insert element A[n] into the sorted array A[1 ... n − 1]. Write a recurrence relation for the running time of this recursive version of insertion sort. Feel free to use the big-O notation and assume the worst case. Can you also solve the recurrence relation? Is your result the same as the one obtained by the summation analysis in the textbook (pp. 160-162)?

Answer:

    procedure InsertionSort(A[1 ... n])
    if n = 1 then
        return A[n];
    else
        InsertionSort(A[1 ... n − 1]);
        Insert(A[n], A[1 ... n − 1]);
    ;
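The recursive formulation can also be checked with a short runnable sketch. This is our own Python translation (names such as insertion_sort are ours, not the assignment's); the while loop plays the role of Insert and performs the worst-case Θ(n) shifting step.

```python
def insertion_sort(A):
    """Recursively sort A[1..n-1], then insert A[n] into the sorted prefix."""
    if len(A) <= 1:
        return A
    prefix = insertion_sort(A[:-1])   # sort the first n-1 elements
    x = A[-1]
    i = len(prefix)
    # Walk left past every element larger than x -- Theta(n) in the worst case.
    while i > 0 and prefix[i - 1] > x:
        i -= 1
    return prefix[:i] + [x] + prefix[i:]

print(insertion_sort([5, 2, 4, 6, 1, 3]))   # [1, 2, 3, 4, 5, 6]
```

A worst case is a reverse-sorted input, where inserting A[n] shifts all n − 1 earlier elements, which is exactly what the recurrence below accounts for.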

InsertionSort calls itself once with a parameter of size n − 1, and then the Insert function is called to insert A[n] into the sorted array. Assuming that an array data structure is used, the worst-case running time of each Insert is Θ(n), because we need to shift elements to make room for A[n] to be inserted. So we can express the running time with the following recurrence relation:

    T(n) = T(n − 1) + Θ(n),  for n ≥ 2
    T(1) = c                           [4 pts]

Solving the recurrence relation:

    T(n) = T(n − 1) + Θ(n)
         = T(n − 2) + Θ(n − 1) + Θ(n)
         = ...
         = T(1) + Θ(2) + ... + Θ(n − 1) + Θ(n)
         = Θ(1) + Θ(2) + ... + Θ(n − 1) + Θ(n)
         = ∑_{i=1}^{n} Θ(i) = Θ(∑_{i=1}^{n} i)   [2 pts]
         = Θ(n(n + 1)/2) = Θ(n²)

So the time complexity is Θ(n²), the same as the one obtained by the summation analysis in the textbook. [4 pts]

Problem 4. [15 pts] Give a tight asymptotic bound for T(n) in each of the following recurrences. Assume that T(n) is a constant for n ≤ 2. Justify your answers.

    a. T(n) = 16T(n/4) + n²
    b. T(n) = 7T(n/3) + n²
    c. T(n) = T(n − 1) + n²

You may solve these recurrences using either the Master Theorem or the backward substitution method.

Answer:

a. a = 16, b = 4, d = 2, so a = 16 = b^d. By applying case 2 of the Master Theorem ([Levitin, p. 483]), T(n) ∈ Θ(n² log n). [5 pts]

b. a = 7, b = 3, d = 2, so a = 7 < 9 = b^d. By applying case 1 of the Master Theorem, T(n) ∈ Θ(n²). [5 pts]

c. By the backward substitution method,

    T(n) = T(n − 1) + n²
         = T(n − 2) + (n − 1)² + n²
         = T(n − 3) + (n − 2)² + (n − 1)² + n²
         = ...
         = T(1) + 2² + 3² + ... + (n − 1)² + n²
         = c1 + ∑_{i=2}^{n} i² = c1 + n(n + 1)(2n + 1)/6 − 1

So T(n) ∈ Θ(n³). [5 pts]

Problem 5. [15 pts] Describe a Θ(n log n) time algorithm that, given a set S of n integers and another integer x, determines whether or not there exist two elements in S whose sum is exactly x. Analyze the running time of your algorithm to make sure that it is really Θ(n log n) (it is sufficient to show that the running time is O(n log n)). Before you present the pseudo-code, use a few words to describe the basic idea(s) in your algorithm to help the reader understand it. Hint: merge/heap sort and binary search.

Answer:

Basic idea: [2 pts]

Consider a small example:

    S = {5, 13, 8, 20, 2, 6, 10, 1},  x = 22

First, we sort S and get

    S = {1, 2, 5, 6, 8, 10, 13, 20}

Then we scan the elements in S from left to right, and for each element S[i] we search for x − S[i] in S[i + 1 ... n]. In the example:

    S[1] = 1, so we search for x − S[1] = 21 in the rest of the list, S[2 ... 8] (not found);
    S[2] = 2, so we search for x − S[2] = 20 in the rest of the list, S[3 ... 8] (found).

Then we are done. The main point is that we perform a binary search, rather than a linear scan of the list, each time.

Pseudo-code: [8 pts]

    procedure sumquery(S[1 ... n], x)
    var i: integer;
    Sort(S[1 ... n]);            // any O(n log n) sorting algorithm
    i ← 1;
    while i < n do
        BinarySearch(x − S[i], S[i + 1 ... n]);
        if found, return "yes";
        else i ← i + 1;
    return "no";
    ;

Running time: [5 pts]

    Sorting takes O(n log n).
    Each BinarySearch takes O(log n).
    BinarySearch is performed n − 1 times, which makes the whole while loop O(n log n).

Combining both Sort and BinarySearch, the total running time of this algorithm is O(n log n) + O(n log n) = O(n log n).
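For reference, here is a minimal runnable Python sketch of the same algorithm (our own translation; the standard-library bisect.bisect_left, restricted to the suffix via its lo argument, plays the role of BinarySearch):

```python
import bisect

def sum_query(S, x):
    """Return True iff two elements of S sum to exactly x."""
    S = sorted(S)                     # any O(n log n) sort
    for i in range(len(S) - 1):
        target = x - S[i]
        # O(log n) binary search restricted to S[i+1 .. n]
        j = bisect.bisect_left(S, target, i + 1)
        if j < len(S) and S[j] == target:
            return True
    return False

print(sum_query([5, 13, 8, 20, 2, 6, 10, 1], 22))   # True: 2 + 20 = 22
```

Searching only in the suffix S[i + 1 ... n] guarantees that the two elements found are at distinct positions, matching the pseudo-code above.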