
NATIONAL UNIVERSITY OF SINGAPORE
CS3230 DESIGN AND ANALYSIS OF ALGORITHMS
SEMESTER II: 2017-2018
Time Allowed: 2 Hours

INSTRUCTIONS TO STUDENTS
1. This assessment consists of Eight (8) questions and comprises 18 printed pages (including this page).
2. Answer ALL questions.
3. The maximum mark for the paper is 50.
4. This is an Open Book assessment.
5. Write your student number on each page.
6. Student Number (DO NOT WRITE YOUR NAME): ____________

Marking grid (for examiners' use): rows Q1a, Q1b, Q2, Q3, Q4, Q5, Q6, Q7, Q8, Total; columns Internal Examiner, External Examiner, Average.

Question 1
For each of the following parts, state whether the statement is true or false, and give a short proof of the correctness of your answer.

(a) (4 marks) Suppose T(0) = T(1) = 1 and, for n > 1, T(n) = 3T(⌊n/2⌋) + 5n^2 + 3. Then T(n) ∈ Θ(n^2 log n).

Answer: False. Using the master theorem with a = 3, b = 2, k = 2, we have b^k = 2^2 = 4 > 3 = a. Thus T(n) ∈ Θ(n^2), and hence T(n) is not in Θ(n^2 log n).
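As an added sanity check (not part of the original paper), the recurrence can be evaluated numerically; assuming the floor in T(⌊n/2⌋), the ratio T(n)/n^2 levels off near a constant (about 20 = 5/(1 - 3/4)) instead of growing like log n. A minimal Python sketch:

    # Added check: evaluate T(n) = 3*T(floor(n/2)) + 5*n^2 + 3 with T(0) = T(1) = 1
    # and watch the ratio T(n)/n^2, which flattens out instead of growing like log n.
    from functools import lru_cache

    @lru_cache(maxsize=None)
    def T(n):
        if n <= 1:
            return 1
        return 3 * T(n // 2) + 5 * n * n + 3

    for k in range(5, 26, 5):
        n = 2 ** k
        print(n, T(n) / (n * n))   # approaches roughly 20 = 5 / (1 - 3/4)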

(b) (4 marks) n!/2^n ∈ O(n^2).

Answer: False. Suppose, by way of contradiction, that n!/2^n ≤ cn^2 for all n ≥ n_0 and some constant c > 0. But n! ≥ n(n-1)(n-2) · 2^(n-4), since each of the remaining n - 4 factors from n - 3 down to 2 is at least 2. Hence n!/2^n ≥ n(n-1)(n-2)/16 ≥ n^3/64 for n ≥ 100 (using n(n-1)(n-2) ≥ n^3/4). Thus n^3/64 ≤ cn^2 for all n ≥ 100 + n_0, i.e., n ≤ 64c for all n ≥ 100 + n_0, which is a contradiction.
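A tiny numeric illustration (added, not from the paper): the ratio (n!/2^n)/n^2 grows without bound, so no fixed constant c can work.

    # Added illustration: (n! / 2^n) / n^2 increases rapidly with n,
    # matching the lower bound n^3/64 used in the proof.
    from math import factorial

    for n in (10, 15, 20, 25, 30):
        print(n, factorial(n) / (2 ** n * n ** 2))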

Question 2 (6 marks) Suppose we are given n sorted lists. Suppose the only further operation allowed to us is to merge two sorted lists (which may be among the above lists, or lists obtained in turn by merging some of the lists), as done in class. Suppose further that merging two lists of sizes k and l requires time k + l. Give an algorithm to determine the order in which the n lists (having m_1, m_2, ..., m_n numbers respectively) should be merged so that the merging operations take the least amount of time. Note that the aim above is to minimize the merging time of the lists (and not the time taken by your algorithm). Give an argument for why your algorithm gives an optimal order of merging.

Ans: This problem is analogous to optimal Huffman coding. Just as in Huffman coding, we merge the two lists with the least weights (sizes) first to form a combined list, which is then put back into the pool of lists, and we continue in the same manner. Just as Huffman's algorithm gives an optimal code, this method gives the least total cost for merging the lists.
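A minimal Python sketch of this strategy (added; the function name is my own), using a min-heap to always merge the two smallest remaining lists, exactly as in Huffman's algorithm:

    # Added sketch of the Huffman-style merging strategy described in the answer.
    # Returns the total merging cost and the sequence of merges performed.
    import heapq

    def optimal_merge_cost(sizes):
        """sizes: list sizes m_1, ..., m_n (n >= 1)."""
        heap = list(sizes)
        heapq.heapify(heap)
        total_cost = 0
        merges = []                      # record of (size_a, size_b) merges
        while len(heap) > 1:
            a = heapq.heappop(heap)      # smallest remaining list
            b = heapq.heappop(heap)      # second smallest
            total_cost += a + b          # merging two lists costs k + l
            merges.append((a, b))
            heapq.heappush(heap, a + b)  # merged list goes back into the pool
        return total_cost, merges

    # Example: sizes 10, 20, 30 -> merge 10+20 (cost 30), then 30+30 (cost 60); total 90.
    print(optimal_merge_cost([10, 20, 30]))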

Question 3 (6 marks) Build a table to show how the dynamic programming algorithm done in class works for finding the optimal order of evaluating the following matrix product: M_1 M_2 M_3 M_4, where
M_1 is a matrix of size 8 × 3
M_2 is a matrix of size 3 × 2
M_3 is a matrix of size 2 × 5
M_4 is a matrix of size 5 × 7

Answer:
F(1,1) = F(2,2) = F(3,3) = F(4,4) = 0
F(1,2) = 8 · 3 · 2 = 48
F(2,3) = 3 · 2 · 5 = 30
F(3,4) = 2 · 5 · 7 = 70
F(1,3) = min(F(1,2) + F(3,3) + 8·2·5, F(1,1) + F(2,3) + 8·3·5) = min(48 + 0 + 80, 0 + 30 + 120) = 128
F(2,4) = min(F(2,3) + F(4,4) + 3·5·7, F(2,2) + F(3,4) + 3·2·7) = min(30 + 0 + 105, 0 + 70 + 42) = 112
F(1,4) = min(F(1,1) + F(2,4) + 8·3·7, F(1,2) + F(3,4) + 8·2·7, F(1,3) + F(4,4) + 8·5·7) = min(0 + 112 + 168, 48 + 70 + 112, 128 + 0 + 280) = min(280, 230, 408) = 230
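The same table can be reproduced by the standard matrix-chain dynamic program; a small sketch (added, not from the paper), where M_i has size dims[i-1] × dims[i]:

    # Added sketch of the matrix-chain DP; F[i][j] is the minimum number of
    # scalar multiplications needed to compute M_i ... M_j.
    def matrix_chain_cost(dims):
        n = len(dims) - 1                       # number of matrices
        F = [[0] * (n + 1) for _ in range(n + 1)]
        for length in range(2, n + 1):          # chain length
            for i in range(1, n - length + 2):
                j = i + length - 1
                F[i][j] = min(F[i][k] + F[k + 1][j] + dims[i - 1] * dims[k] * dims[j]
                              for k in range(i, j))
        return F[1][n]

    print(matrix_chain_cost([8, 3, 2, 5, 7]))   # prints 230, matching F(1,4) above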

Question 4 (6 marks) Recall the problem of providing coin change: we are given coin denominations d_1 > d_2 > d_3 > ... > d_n = 1 and an amount S. The aim is to find the minimal number of coins needed to provide change for amount S using the denominations d_1, d_2, ..., d_n. Prove or give a counterexample to the following claim:

Claim: For any given coin denominations d_1, d_2, ..., d_n: if the greedy algorithm done in class is optimal for obtaining coin change for all amounts S ≤ n · LCM(d_1, d_2, ..., d_n), then the greedy algorithm is optimal for all values of S. Here LCM(·) denotes the least common multiple of the arguments.

Answer: The claim is true. Suppose the greedy algorithm is optimal for all S ≤ n · LCM(d_1, d_2, ..., d_n). Suppose, by induction, that the greedy algorithm is optimal for all values S ≤ m, for some m ≥ n · LCM(d_1, d_2, ..., d_n); we show that it is optimal for S = m + 1. Consider an optimal solution for S = m + 1. As m + 1 > n · LCM(d_1, d_2, ..., d_n), there must exist an i with 1 ≤ i ≤ n such that coins of denomination d_i contribute more than LCM(d_1, d_2, ..., d_n) worth of value (otherwise the total would be at most n · LCM(d_1, d_2, ..., d_n) < m + 1). Thus we can replace LCM(d_1, d_2, ..., d_n) worth of these d_i coins by LCM(d_1, d_2, ..., d_n)/d_1 coins of denomination d_1 without increasing the number of coins used. Hence there is an optimal solution that uses at least one coin of denomination d_1; the greedy solution also uses a coin of denomination d_1, since m + 1 > d_1. Removing one d_1 coin from both the greedy and the optimal solution and applying the induction hypothesis to the amount m + 1 - d_1 ≤ m, we conclude that the greedy method is optimal for m + 1 as well.
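For reference, a minimal sketch (added; the denomination set in the example is illustrative, not from the paper) of the greedy change-making procedure the claim refers to: repeatedly take as many coins of the largest remaining denomination as fit.

    # Added sketch of the greedy change-making algorithm.
    def greedy_change(denominations, S):
        """denominations: d_1 > d_2 > ... > d_n = 1; returns the number of coins used."""
        coins = 0
        for d in denominations:
            coins += S // d     # take as many coins of denomination d as possible
            S %= d              # remaining amount
        return coins

    print(greedy_change([25, 10, 5, 1], 63))   # 2 of 25, 1 of 10, 3 of 1 -> 6 coins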

Question 5 (6 marks) Consider the following weighted adjacency matrix wt for an undirected graph with vertices {1, 2, 3, 4, 5}. In the adjacency matrix below, the entry wt(i, j) denotes the weight of the edge between vertices i and j (where the weight is ∞ if there is no such edge).

wt =
    [  0  13   5   1   ∞ ]
    [ 13   0   6   ∞   3 ]
    [  5   6   0   2   1 ]
    [  1   ∞   2   0   ∞ ]
    [  ∞   3   1   ∞   0 ]

Find the shortest path distance from vertex 1 to all other vertices using Dijkstra's algorithm as done in class. Please show how the distance vector Dist[·] gets updated after every iteration of the while loop in the algorithm. For this question, you need to give only the shortest path distances, not the shortest paths themselves.

Answer:

Selected | Dist(1) | Dist(2) | Dist(3) | Dist(4) | Dist(5)
   -     |    0    |    ∞    |    ∞    |    ∞    |    ∞
   1     |    0    |   13    |    5    |    1    |    ∞
   4     |    0    |   13    |    3    |    1    |    ∞
   3     |    0    |    9    |    3    |    1    |    4
   5     |    0    |    7    |    3    |    1    |    4

Thus, the shortest path distances to vertices 2, 3, 4, 5 are respectively 7, 3, 1 and 4.
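A short Python sketch (added; the adjacency lists are transcribed from the matrix above) that reproduces these distances with a binary-heap implementation of Dijkstra's algorithm:

    # Added sketch of Dijkstra's algorithm on the question's graph.
    import heapq

    INF = float('inf')
    wt = {
        1: {2: 13, 3: 5, 4: 1},
        2: {1: 13, 3: 6, 5: 3},
        3: {1: 5, 2: 6, 4: 2, 5: 1},
        4: {1: 1, 3: 2},
        5: {2: 3, 3: 1},
    }

    def dijkstra(adj, source):
        dist = {v: INF for v in adj}
        dist[source] = 0
        pq = [(0, source)]
        while pq:
            d, u = heapq.heappop(pq)
            if d > dist[u]:
                continue                    # stale queue entry
            for v, w in adj[u].items():
                if d + w < dist[v]:
                    dist[v] = d + w
                    heapq.heappush(pq, (dist[v], v))
        return dist

    print(dijkstra(wt, 1))                  # {1: 0, 2: 7, 3: 3, 4: 1, 5: 4}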

Question 6 (6 marks) Consider the following divide and conquer algorithm for finding a minimal spanning tree, proposed by Professor Martin.
(a) If the graph has only one vertex, return the spanning tree consisting of just that vertex.
(b) Otherwise, divide the set of vertices into two nearly equal halves. (The division into two halves may be arbitrary; in case of an odd number of vertices, one half will have one more vertex than the other.)
(c) Find a minimal spanning tree for each half.
(d) Find the minimal weight edge with its two endpoints in the two different halves found in (b).
(e) Use the edge found in (d) to connect the two spanning trees found in (c).

Does the above algorithm always give a minimal spanning tree? If so, give an argument for it. If not, give a counterexample.

Ans: No. Consider the following counterexample: G = (V, E), where V = {1, 2, 3, 4} and wt(1, 2) = 9, wt(3, 4) = 1, wt(1, 3) = 1, wt(2, 4) = 2, with the initial division {1, 2} and {3, 4}. The optimal MST uses the edges (1, 3), (2, 4), (3, 4), with total weight 4. However, the algorithm above returns the spanning tree (1, 2), (3, 4), (1, 3), with total weight 11.
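A quick script (added, not from the paper) that evaluates both trees on this counterexample, using Kruskal's algorithm with a tiny union-find for the true MST:

    # Added check of the counterexample: divide-and-conquer tree vs. a real MST.
    edges = {(1, 2): 9, (3, 4): 1, (1, 3): 1, (2, 4): 2}

    # Divide-and-conquer result on the split {1,2} / {3,4}:
    # MST of {1,2} is (1,2); MST of {3,4} is (3,4); cheapest crossing edge is (1,3).
    dc_tree = [(1, 2), (3, 4), (1, 3)]
    print(sum(edges[e] for e in dc_tree))          # 11

    # Kruskal's algorithm for the true minimum spanning tree.
    parent = {v: v for v in (1, 2, 3, 4)}
    def find(v):
        while parent[v] != v:
            v = parent[v]
        return v

    mst_weight = 0
    for (u, v), w in sorted(edges.items(), key=lambda kv: kv[1]):
        ru, rv = find(u), find(v)
        if ru != rv:                               # edge joins two components
            parent[ru] = rv
            mst_weight += w
    print(mst_weight)                              # 4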

Question 7 (6 marks) Suppose T is a Huffman coding tree for the frequencies f_1, f_2, f_3, ..., f_n, where f_1 and f_2 have the same parent. Let T' be the tree T with the leaves f_1 and f_2 deleted and their parent labeled with frequency f_1 + f_2. Consider the following conjecture: if T' is optimal for the frequencies f_1 + f_2, f_3, ..., f_n, then T is optimal for f_1, f_2, ..., f_n. Either prove the conjecture to be true or give a counterexample.

Ans: The conjecture is false, as witnessed by the frequencies f_1 = 5, f_2 = 6, f_3 = 1. Here T' has only the two leaves f_1 + f_2 = 11 and f_3 = 1 and is therefore trivially optimal, but T (in which f_1 and f_2 are siblings at depth 2 and f_3 is at depth 1) has cost 2·5 + 2·6 + 1·1 = 23, whereas the optimal tree pairs the two smallest frequencies 1 and 5 first and has cost 2·5 + 1·6 + 2·1 = 18.
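A one-line cost comparison (added), taking the cost of a code tree to be the sum of frequency × leaf depth:

    # Added check of the counterexample f1 = 5, f2 = 6, f3 = 1.
    cost_T       = 2 * 5 + 2 * 6 + 1 * 1      # f1, f2 at depth 2; f3 at depth 1 -> 23
    cost_optimal = 2 * 5 + 1 * 6 + 2 * 1      # pair the two smallest (1 and 5) first -> 18
    print(cost_T, cost_optimal)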

Question 8 (6 marks) Consider the following problem.
INSTANCE/INPUT: (a) A set U and (b) a collection C of subsets of U (in other words, C is a subset of the power set of U).
QUESTION: Does there exist a subset S of U such that for every set X ∈ C, both X ∩ S ≠ ∅ and X ∩ (U - S) ≠ ∅?
Show that the above problem is NP-complete.

Answer: It is easy to see that the problem is in NP: guess S and verify that, for every set X ∈ C, both X ∩ S and X ∩ (U - S) are nonempty.

For NP-hardness, we reduce from NAESAT. Consider a NAESAT instance with variables V = {x_1, x_2, ..., x_n}, clauses Cl = {c_1, c_2, ..., c_m}, and c_i = (l_i^1, l_i^2, l_i^3). Reduce NAESAT to the problem above as follows:
U = {u_i, w_i : 1 ≤ i ≤ n}
C = {{u_i, w_i} : 1 ≤ i ≤ n} ∪ {{a_i^1, a_i^2, a_i^3} : 1 ≤ i ≤ m}, where a_i^r = u_j if l_i^r = x_j, and a_i^r = w_j if l_i^r = ¬x_j.
It is straightforward to verify that the above reduction can be carried out in polynomial time. Furthermore, if the NAESAT instance has a solution, then S = {u_i : x_i is true} ∪ {w_i : x_i is false} is a solution to the problem above: each set {u_i, w_i} is split by S, and each clause set {a_i^1, a_i^2, a_i^3} is split because no clause has all three literals with the same truth value. Conversely, if the problem above has a solution S, then note that exactly one of u_i, w_i belongs to S (since {u_i, w_i} must intersect both S and U - S). Setting x_i to be true iff u_i ∈ S gives a solution to the NAESAT instance, since each clause set is split and hence no clause has all three literals equal.
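A small sketch (added; the encoding of literals as signed integers and the element names are my own) of this reduction, producing U and C from a NAESAT instance:

    # Added sketch of the reduction from NAESAT to the set-splitting problem above.
    # A literal +j stands for x_j and -j stands for the negation of x_j.
    def reduce_naesat_to_set_splitting(n, clauses):
        """n: number of variables; clauses: list of 3-tuples of literals."""
        U = {('u', i) for i in range(1, n + 1)} | {('w', i) for i in range(1, n + 1)}
        # The pair {u_i, w_i} forces exactly one of u_i, w_i into S.
        C = [{('u', i), ('w', i)} for i in range(1, n + 1)]
        # One set per clause: positive literals map to u_j, negated literals to w_j.
        for clause in clauses:
            C.append({('u', l) if l > 0 else ('w', -l) for l in clause})
        return U, C

    # Example: a single not-all-equal clause (x1, x2, not x3).
    U, C = reduce_naesat_to_set_splitting(3, [(1, 2, -3)])
    print(C)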