Analysis of Algorithms - Midterm (Solutions)


K. Subramani
LCSEE, West Virginia University, Morgantown, WV
{ksmani@csee.wvu.edu}

1 Problems

1. Recurrences: Solve the following recurrences exactly or asymptotically. You may assume any convenient form for n.

    (a) T(1) = 1, T(n) = T(n^(1/3)) + 1, n > 1.
    (b) T(1) = 0, T(n) = 4·T(n/2) + n^2·log n, n > 1.

Solution:

(a) Put n = 3^k. Accordingly, the recurrence can be restated as:

    T(3^0) = 1
    T(3^k) = T(3^(k/3)) + 1, k > 0.

Let G(k) denote T(3^k). Accordingly, the above recurrence can be represented as:

    G(0) = 1
    G(k) = G(k/3) + 1, k > 0.

Using one of the many techniques discussed in class (expansion, induction, the Master Theorem), it is easily seen that G(k) ∈ Θ(log_3 k); since k = log_3 n, it follows that T(n) ∈ Θ(log_3 log_3 n).

(b) We use the Master Theorem to solve this recurrence. As per the pattern discussed in class, a = 4, b = 2 and f(n) = n^2·log n. It is clear that f(n) ∈ Θ(n^(log_2 4)·log^1 n), from which it follows (by the extended second case of the Master Theorem) that T(n) ∈ Θ(n^(log_2 4)·log^2 n) = Θ(n^2·log^2 n).
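As a quick numerical sanity check (not part of the original solutions), the Python sketch below evaluates both recurrences on the convenient input forms assumed above: part (a) via G(k) with k a power of 3, and part (b) with n a power of 2. The function names G and T are ad hoc.

import math
from functools import lru_cache

def G(k):
    # Part (a): G(k) = G(k/3) + 1; here k is taken to be a power of 3,
    # with the recursion bottoming out at k = 1 in place of G(0) = 1.
    return 1 if k <= 1 else G(k // 3) + 1

@lru_cache(maxsize=None)
def T(n):
    # Part (b): T(1) = 0, T(n) = 4*T(n/2) + n^2 * log n, with n a power of 2.
    return 0 if n == 1 else 4 * T(n // 2) + n * n * math.log2(n)

for j in range(1, 6):
    k = 3 ** j                      # so that n = 3^k and the T(n) of part (a) equals G(k)
    print(f"part (a): k = 3^{j}, G(k) = {G(k)}, log_3(k) + 1 = {math.log(k, 3) + 1:.1f}")

for e in (8, 12, 16, 20):
    n = 2 ** e
    ratio = T(n) / (n * n * math.log2(n) ** 2)
    print(f"part (b): n = 2^{e}, T(n) / (n^2 log^2 n) = {ratio:.3f}")

The ratios in part (b) settle near a constant, consistent with T(n) ∈ Θ(n^2·log^2 n).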

2. Binary Trees: Let T denote a proper binary tree with n internal nodes. We define E(T) to be the sum of the depths of all the external nodes of T; likewise, I(T) is defined to be the sum of the depths of all the internal nodes of T. Prove that E(T) = I(T) + 2n.

Solution: We use induction on the number of internal nodes in T.

Base case: n = 1. In this case, T consists of a root node with a left child and a right child. The root node is the only internal node and hence I(T) is 0; its two children are the only external nodes and hence E(T) is 1 + 1 = 2. Since E(T) = I(T) + 2·1, the conjecture is proven in the base case.

Assume that if T is a proper binary tree with i internal nodes, where i ≤ k, then E(T) = I(T) + 2i. Now consider a proper binary tree T having exactly k + 1 internal nodes, and let h denote the height of this tree. Since T is proper, there are at least two external nodes at depth h that are children of the same internal node. Splice out these two external nodes to get a new proper binary tree T' having k internal nodes (since a node that was internal in T has now become external). As per the inductive hypothesis, we must have E(T') = I(T') + 2k.

Observe that in T', two external nodes at depth h in T have been removed and one node, which was internal in T at depth h - 1, has been added as an external node; hence, E(T') = E(T) - 2h + (h - 1). Likewise, a node which was internal in T at depth h - 1 is now external in T', and hence I(T') = I(T) - (h - 1). We thus have,

    E(T') - I(T') = E(T) - I(T) - 2h + (h - 1) + (h - 1)
                  = E(T) - I(T) - 2,

and therefore

    E(T) - I(T) = E(T') - I(T') + 2 = 2k + 2,

i.e., E(T) = I(T) + 2(k + 1). Thus, using the second principle of mathematical induction, we can conclude that the conjecture is true for all proper binary trees, regardless of the number of internal nodes.
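The identity is also easy to confirm empirically. The following Python sketch (an illustration only; the helper names are invented for this example and are not from the exam) grows random proper binary trees by repeatedly replacing an external node with an internal node that has two external children, and checks E(T) = I(T) + 2n.

import random

def random_proper_tree(n_internal):
    """Return a proper binary tree with n_internal internal nodes.
    An internal node is a pair (left, right); an external node is None."""
    tree = None                      # a single external node
    for _ in range(n_internal):
        tree = expand_random_leaf(tree)
    return tree

def expand_random_leaf(tree):
    # Replace a uniformly chosen external node by an internal node
    # with two external children.
    if tree is None:
        return (None, None)
    left, right = tree
    if random.random() < leaves(left) / (leaves(left) + leaves(right)):
        return (expand_random_leaf(left), right)
    return (left, expand_random_leaf(right))

def leaves(tree):
    return 1 if tree is None else leaves(tree[0]) + leaves(tree[1])

def depth_sums(tree, depth=0):
    """Return (E, I): the summed depths of external and internal nodes."""
    if tree is None:
        return depth, 0
    e_l, i_l = depth_sums(tree[0], depth + 1)
    e_r, i_r = depth_sums(tree[1], depth + 1)
    return e_l + e_r, i_l + i_r + depth

for n in (1, 2, 5, 20, 100):
    E, I = depth_sums(random_proper_tree(n))
    assert E == I + 2 * n, (n, E, I)
print("E(T) = I(T) + 2n verified on sample trees")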

3. Greedy: Assume that you are given a set S of n activities {a_1, a_2, ..., a_n}. Associated with activity a_i are its start time s_i and finish time f_i; if activity a_i is selected, then it must start at s_i and finish before f_i. Two activities a_i and a_j are compatible if s_i ≥ f_j or s_j ≥ f_i; otherwise, they are conflicting. Design an algorithm that outputs a largest set of mutually compatible activities.

Solution: Algorithm 1.1 represents a greedy approach to output the maximum number of mutually compatible activities.

Function MAX-ACTIVITY-SELECT(S)
    Let R denote a subset of mutually compatible activities; set R = ∅
    Order the activities by their finish times, so that f_1 ≤ f_2 ≤ ... ≤ f_n
    for i = 1 to n do
        if activity a_i is compatible with the activities already in R then
            R = R ∪ {a_i}
        end if
    end for
    return (R)

Algorithm 1.1: Greedy algorithm for a maximum compatible activity subset.

Assume that Algorithm 1.1 is not optimal and that there exists another algorithm, say A, which produces a solution R', such that |R'| > |R|.

Claim 1.1 If a_1 ∉ R', then a_1 can always be made part of R', without decreasing the number of activities in R'.

Proof: Insert a_1 into R'; clearly it must conflict with some activities in R', since otherwise R' ∪ {a_1} would be a feasible set, violating the optimality of R'. Suppose a_i, a_j ∈ R' both conflict with a_1. Since f_1 ≤ f_i, f_j, we must have s_i, s_j < f_1. However, this means that both a_i and a_j straddle the finish time of a_1, which forces them to conflict with each other, contradicting the mutual compatibility of R'. Thus, there can be at most one activity in R' that conflicts with a_1; replacing that activity with a_1 preserves the cardinality of R'.

Let k be the smallest index such that a_k ∈ R and a_k ∉ R'. Thrust a_k into R'; using the same argument as before, a_k can conflict with at most one activity in R'; replacing that activity with a_k does not affect the cardinality of R', but brings it one activity closer to R. Working in this fashion, we can gradually transform R' so that it includes all the activities in R, without decreasing its cardinality.

Once this transformation has been carried out, we claim that there are no additional activities in R'. Assume that there exists an activity, say a_p ∈ R', such that a_p ∉ R, and let f_q denote the finish time of the last activity a_q that was added to R. We consider two possibilities:

(a) s_p ≥ f_q - In this case, the greedy algorithm would have considered a_p and added it to R, since it does not conflict with any of the activities already in R.

(b) s_p < f_q - If f_p ≥ f_q, then a_p conflicts with a_q (since s_p < f_q and s_q < f_q ≤ f_p) and hence cannot be part of R'. If f_p < f_q, then the greedy algorithm would have considered a_p before a_q; the fact that a_p ∉ R implies that it conflicted with some of the activities already chosen in R, and hence a_p cannot be part of R' either!

We have thus established that any optimal solution can be transformed into the greedy one, i.e., the greedy approach does produce an optimal solution.
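For concreteness, here is a Python rendering of Algorithm 1.1; it is a sketch under the assumption that each activity is supplied as a (start, finish) pair, which is not specified in the exam. Since R is maintained in order of non-decreasing finish times, compatibility with all of R reduces to a single comparison against the most recently added activity.

def max_activity_select(activities):
    """Return a largest set of mutually compatible activities.
    Each activity is a (start, finish) pair; two activities are compatible
    when one starts no earlier than the other finishes."""
    R = []
    for s, f in sorted(activities, key=lambda a: a[1]):   # order by finish time
        # a_i is compatible with everything in R iff it starts no earlier
        # than the finish time of the last activity added to R.
        if not R or s >= R[-1][1]:
            R.append((s, f))
    return R

print(max_activity_select([(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (5, 9),
                           (6, 10), (8, 11), (8, 12), (2, 14), (12, 16)]))
# -> [(1, 4), (5, 7), (8, 11), (12, 16)]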

4. Sorting: Analogous to the notion of worst-case running time for an algorithm is the notion of best-case running time, which is the minimum amount of time that an algorithm needs to accomplish its task. Argue that the best-case running time of Quicksort (in terms of element-to-element comparisons) is Ω(n log n). (It is interesting to note that the best-case running time of Insertion Sort is O(n).)

Solution: We focus on the computation tree of Quicksort; recall that we used the computation tree to demonstrate that the expected running time of Quicksort is O(n log n). Indeed, the running time of the Quicksort algorithm is O(n)·h, where h is the height of the computation tree. We observe that the height of a binary tree (or any tree, for that matter) is minimized when the tree is balanced, i.e., external nodes occur only at level h and possibly level h - 1. Accordingly, for the best-case performance of Quicksort, the partition procedure must divide the array into approximately equal portions. Letting T(n) denote the best-case running time of Quicksort on an array of n elements, we get,

    T(1) = 0
    T(n) = 2·T((n - 1)/2) + (n - 1).

We argue, using induction, that T(n) ≥ G(n) = (1/10)·n·log n - n. Since T(1) ≥ G(1), the base case is proven. Assume that T(n) ≥ G(n) for all n ≤ k. Observe that

    T(k + 1) = 2·T(k/2) + k                              (as per the definition)
             ≥ 2·[(1/10)·(k/2)·log(k/2) - k/2] + k       (as per the inductive hypothesis)
             = (k/10)·log(k/2)
             = (k/10)·log k - k/10.

We then need to show that

    (k/10)·log k - k/10 ≥ (1/10)·(k + 1)·log(k + 1) - (k + 1),

which, after multiplying through by 10 and rearranging, is equivalent to

    9k + 10 ≥ (k + 1)·log(k + 1) - k·log k.

But (k + 1)·log(k + 1) - k·log k ≤ (k + 1)·[log k + 1] - k·log k = (k + 1) + log k, since k + 1 ≤ 2k for k ≥ 1. Hence 9k + 10 ≥ (k + 1)·log(k + 1) - k·log k, as long as 9k + 10 ≥ (k + 1) + log k, i.e., 8k + 9 ≥ log k, which is true for all k ≥ 1. We have thus shown that T(n) ∈ Ω(G(n)); it is not hard to show that G(n) ∈ Ω(n log n); we can thus conclude that T(n) ∈ Ω(n log n).
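As an aside (not part of the original argument), the best-case recurrence can be evaluated numerically at n = 2^m - 1, where (n - 1)/2 is an integer; the sketch below shows the comparison count staying above the bound G(n) = (1/10)·n·log n - n and approaching n·log n.

import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # Best-case comparison count: T(1) = 0, T(n) = 2*T((n-1)/2) + (n-1).
    return 0 if n <= 1 else 2 * T((n - 1) // 2) + (n - 1)

for m in (4, 8, 12, 16, 20):
    n = 2 ** m - 1
    G = n * math.log2(n) / 10 - n
    ratio = T(n) / (n * math.log2(n))
    print(f"n = 2^{m}-1: T(n) = {T(n)}, T(n)/(n log n) = {ratio:.3f}, T(n) >= G(n): {T(n) >= G}")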

5. Divide and Conquer: Design a Divide-and-Conquer algorithm to discover both the maximum and the minimum of an array A of n elements using at most (3/2)·n element-to-element comparisons. Formally prove that your algorithm makes at most (3/2)·n element-to-element comparisons.

Solution: We assume that there are at least 2 elements in the array; otherwise, the problem is ill-defined. Further, we assume that the number of elements in A is an exact power of 2, in order to simplify the exposition. Algorithm 1.2 represents a Divide-and-Conquer approach for computing both the minimum and the maximum elements of the input array.

Function MAXMIN(A, low, high)
    if high - low + 1 = 2 then
        if A[low] < A[high] then
            max = A[high]; min = A[low]
            return (max, min)
        else
            max = A[low]; min = A[high]
            return (max, min)
        end if
    else
        mid = ⌊(low + high)/2⌋
        (max_l, min_l) = MAXMIN(A, low, mid)
        (max_r, min_r) = MAXMIN(A, mid + 1, high)
    end if
    Set max to the larger of max_l and max_r; likewise, set min to the smaller of min_l and min_r
    return (max, min)

Algorithm 1.2: Divide-and-Conquer algorithm for computing the maximum and minimum of an array.

Let T(n) denote the number of element-to-element comparisons carried out by Algorithm 1.2. We have,

    T(2) = 1
    T(n) = 2·T(n/2) + 2, n > 2.

Substituting n = 2^k and using the expansion method discussed in class, it is straightforward to see that T(n) ≤ (3/2)·n:

    T(2^k) = 2·T(2^(k-1)) + 2
           = 2·[2·T(2^(k-2)) + 2] + 2
           = 2^2·T(2^(k-2)) + 2^2 + 2
           = 2^2·[2·T(2^(k-3)) + 2] + 2^2 + 2
           = 2^3·T(2^(k-3)) + 2^3 + 2^2 + 2
           ...
           = 2^(k-1)·T(2^(k-(k-1))) + 2^(k-1) + 2^(k-2) + ... + 2.

But T(2^(k-(k-1))) = T(2^1) = T(2) = 1 and hence,

    T(2^k) = 2^(k-1) + Σ_{j=1}^{k-1} 2^j.

Note that Σ_{j=1}^{k-1} 2^j = (Σ_{j=0}^{k-1} 2^j) - 1 = (2^k - 1) - 1 = 2^k - 2, by the sum of a geometric progression. It follows that

    T(n) = T(2^k) = 2^(k-1) + 2^k - 2 = (3/2)·2^k - 2 = (3/2)·n - 2 ≤ (3/2)·n.
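A Python rendering of Algorithm 1.2 with an explicit comparison counter (the counter is an addition for illustration, not part of the exam solution) confirms the bound: on arrays whose size is a power of 2, the count is exactly (3/2)·n - 2.

import random

def maxmin(A, low, high, counter):
    """Return (max, min) of A[low..high]; counter[0] tracks element comparisons."""
    if high - low + 1 == 2:
        counter[0] += 1
        return (A[high], A[low]) if A[low] < A[high] else (A[low], A[high])
    mid = (low + high) // 2
    max_l, min_l = maxmin(A, low, mid, counter)
    max_r, min_r = maxmin(A, mid + 1, high, counter)
    counter[0] += 2                       # one comparison for max, one for min
    return (max(max_l, max_r), min(min_l, min_r))

for k in range(1, 11):
    n = 2 ** k
    A = [random.randint(0, 1000) for _ in range(n)]
    counter = [0]
    hi, lo = maxmin(A, 0, n - 1, counter)
    assert (hi, lo) == (max(A), min(A))
    assert counter[0] == 3 * n // 2 - 2, (n, counter[0])
print("MAXMIN uses exactly 3n/2 - 2 comparisons on arrays of size n = 2^k")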