CSCI 1010 Theory of Computation HW11. Due: December 8, 2016


Attach a fully filled-in cover sheet to the front of your printed homework. Your name should not appear anywhere except on the cover sheet; each individual page of the homework should instead include your Banner ID. While collaboration is encouraged in this class, please remember not to take away notes from any collaboration sessions, and please list the names and logins of any of your collaborators at the beginning of each homework. Please monitor Piazza, as we will post clarifications of questions there. You should hand in your solutions by 12:55 to the CSCI 1010 bin on the second floor of the CIT. Late homeworks are not accepted.

Problem 1

Walter Chestnutt has been sent to jail. Due to his lack of knowledge about PSPACE, he was unable to win his trial by LosingSubset, and has thus lost his freedom and root vegetable status. However, the obfuscated mangelwurzel he stole from the O'Nions has yet to be found, and some of Walter's henchmen are still at large. In the meantime, Töhm Eric has finally passed his Turing test thanks, in part, to the help of unsuspecting humans. As a key player in Walter's support of turmeric, Töhm has some ideas about where the mangelwurzel might be. The only problem is transportation, as Töhm knows that Walter's genetic experiments produced some crazy mangelwurzel specimens.

Töhm wants to transport the mangelwurzels in as few root vegetable crates as possible. Each crate can hold a total volume of b. There are n mangelwurzels in Walter's secret lair, and their volumes are represented as S = {x_1, x_2, ..., x_n}, where x_i is the volume of the i-th mangelwurzel. Some mangelwurzels are massive, but none is of so large a volume that it will not fit into a root vegetable crate.
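This packing task is an instance of bin packing. As a concrete baseline, the following Python sketch packs volumes into crates using the simple one-crate-at-a-time greedy rule that Töhm adopts; the sample volumes and capacity are made up for illustration.

```python
# A direct transcription of the greedy crate-packing strategy from Problem 1:
# fill one crate at a time, opening a new crate whenever the next mangelwurzel
# does not fit. The volumes and capacity below are hypothetical.

def mangelwurzel_crates(S, b):
    """Return the number of crates used by the one-crate-at-a-time packing."""
    k = 1      # number of crates used so far
    v = 0      # volume already placed in the current crate
    for x in S:
        if v + x > b:      # x does not fit: open a new crate for it
            k += 1
            v = x
        else:              # x fits in the current crate
            v += x
    return k

volumes = [4, 8, 1, 4, 2, 1]   # hypothetical mangelwurzel volumes
capacity = 10                  # hypothetical crate capacity b
print(mangelwurzel_crates(volumes, capacity))  # prints 3
```

With these made-up numbers the greedy packing uses 3 crates, while an optimal packing uses only 2 ({8, 1, 1} and {4, 4, 2}), illustrating the gap that the 2-approximation bound controls.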
Although he wants to pack all of the mangelwurzels into the minimum number of crates, Töhm doesn't want to encounter any of Walter's other henchmen or competitors, so he uses an approximation algorithm to save time. The following algorithm adds mangelwurzels one by one to the root vegetable crates. It starts with the first crate. Once Töhm encounters a mangelwurzel that does not fit into the first crate, he places it in the second crate. He then adds mangelwurzels to the second crate until he encounters a mangelwurzel for which there is no room, and so on. After all mangelwurzels are loaded into crates, Töhm returns the number of crates he used.

Algorithm MangelwurzelCrates = "On input S = {x_1, ..., x_n} and b:
  1. Let k = 1 and v_k = 0.
  2. For each i where 1 ≤ i ≤ n:
       If x_i + v_k > b, let k := k + 1 and let v_k := x_i.
       Otherwise, let v_k := x_i + v_k.
  3. Return k."

Prove that this algorithm is a polynomial-time 2-approximation for the task of finding the smallest number of crates.

Note: Töhm's dilemma is very similar to the situation described in Problem 1 of Homework 9.

Problem 2

For the last week, we have been looking at problems that are NP-hard not only to solve exactly, but even to approximate. In the travelling salesman problem, an instance is a fully connected weighted undirected graph G = (V, E) with a starting vertex s and an ending vertex t. A solution is a simple path from s to t that visits every vertex in the graph. The cost of a solution is the sum of the weights of the edges of the path; the problem of minimizing this cost is called MIN-TSP ("TSP" stands for "the travelling salesman problem"). The decision variant of this problem, in which we are given the instance and a maximum cost and need to decide whether a solution of at most that cost exists, is NP-hard, which you could easily show by reduction from HamiltonianPath. However, the problem is also NP-hard to approximate. We can show this by again using HamiltonianPath.

a. Suppose that there is a polynomial-time c-approximation to the travelling salesman problem for some constant c. Using such an approximation

algorithm, show that one can solve any instance of HamiltonianPath in polynomial time. Conclude that the travelling salesman problem is NP-hard to approximate.

b. Using the fact that HamiltonianPath is NP-hard, conclude that if, for any constant c, we have a polynomial-time c-approximation algorithm for the travelling salesman problem, then P must equal NP.

Note: There are approximation algorithms for some variants of the TSP. For example, the Christofides algorithm provides a 1.5-approximation for Metric-TSP, the restriction of the TSP in which all edge weights satisfy the triangle inequality. Because these algorithms do not cover all cases of the TSP, they do not contradict the result of this problem.

The following questions are lab problems.

Lab Problem 1

Ever since Ginger and Rudy returned to Myami and stopped meeting with each other, Ginger 2.0, Ginger's robotic doppelgänger, has been on the loose. At one point, Ginger 2.0 decided to construct her own robotic friend. Because Ginger 2.0 was created at a time when Rudy and Ginger were blissfully in love, she naturally decided to make Rudy 2.0. To test whether Rudy 2.0 was a sufficient copy of Rudy, Ginger 2.0 developed a 3CNF to test the Rudy-ness of a root vegetable AI.

Recall that in the decision problem 3SAT, we have to determine whether there is some assignment of truth values that satisfies every clause of a 3CNF. In the optimization version MAX-3SAT, the goal is to find some assignment of truth values that satisfies the maximum possible number of clauses, even if it is not possible to satisfy every clause.

As she plans for future endeavours and competition with her nemesis Töhm Eric, Ginger 2.0 wonders if she could have constructed Rudy 2.0 so that he passed more clauses of the 3CNF. Help Ginger 2.0 design a polynomial-time algorithm that, given as input a Boolean formula in 3CNF, finds an assignment of values that satisfies at least 7/8 as many clauses as the maximum possible number of satisfiable clauses.
You should assume that each clause contains exactly 3 literals of distinct variables.
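For intuition: a uniformly random assignment satisfies each clause of 3 distinct literals with probability 7/8, and this randomized guess can be derandomized by fixing variables one at a time. A minimal Python sketch of that approach, assuming a hypothetical clause encoding in which each clause is a triple of nonzero integers whose sign encodes the literal's polarity:

```python
from fractions import Fraction

def expected_satisfied(clauses, assignment):
    """Expected number of satisfied clauses when the variables in `assignment`
    (a dict var -> bool) are fixed and all other variables are uniformly random."""
    total = Fraction(0)
    for clause in clauses:
        unset = 0
        satisfied = False
        for lit in clause:
            var, want = abs(lit), lit > 0
            if var not in assignment:
                unset += 1
            elif assignment[var] == want:
                satisfied = True
        # A not-yet-satisfied clause fails only if every unset literal goes wrong.
        total += 1 if satisfied else 1 - Fraction(1, 2 ** unset)
    return total

def seven_eighths_assignment(clauses, n):
    """Fix x_1, ..., x_n one by one, never letting the conditional expectation
    of the number of satisfied clauses decrease."""
    assignment = {}
    for var in range(1, n + 1):
        pick = max((True, False),
                   key=lambda val: expected_satisfied(clauses, {**assignment, var: val}))
        assignment[var] = pick
    return assignment

def count_satisfied(clauses, assignment):
    return sum(any(assignment[abs(lit)] == (lit > 0) for lit in clause)
               for clause in clauses)
```

Since the initial expectation over m clauses is 7m/8 and each greedy step preserves it, the final (fully fixed) assignment satisfies at least 7m/8 ≥ (7/8)·OPT clauses, provided every clause has exactly 3 literals of distinct variables.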

Hint: The algorithm should set each variable in the formula one by one, depending on the expected number of satisfied clauses under a particular assignment. That is, the algorithm should choose an assignment to variable x_i based on the expected number of satisfied clauses given that x_i = 1 and the expected number of satisfied clauses given that x_i = 0.

Lab Problem 2

We have previously dealt with LOGSPACE, in which a machine can read from the input tape but not modify it, and has another working tape of logarithmic size relative to the input. NL is the nondeterministic version of LOGSPACE, with the same size restrictions but allowing the machine to make nondeterministic choices. Reductions showing that problems are NL-hard are defined differently from those showing NP- and PSPACE-hardness. The machine performing the reduction has three tapes: an input tape that can be read but not modified, a logarithmic-size working tape, and a polynomial-size output tape that can be written to but not read. For example, a language that is NL-complete is PATH:

PATH = { ⟨G, s, t⟩ | G is a directed graph that has a path from s to t }

One fateful day in Myami, Ginger is busy avoiding Rudy. Just as she manages to stay away from him, she runs straight into another Rudy! Little does she know that it is actually Rudy 2.0, but she is even more shocked when he is joined by another Ginger. Realizing that it's her robotic replica, Ginger confronts Ginger 2.0 to find out just what's going on. Ginger 2.0 explains that she created Rudy 2.0 as a friend, but he then decided to go undercover as Walter Chestnutt's lover to dig up dirt on his illegal obfuscation empire. "Rudy never loved Walter," Ginger 2.0 says. "And Rudy 2.0 only did it to protect the O'Nions' monopoly on mangelwurzel!" Astonished by this news, Ginger sets out to find Rudy and apologize.
The four of them (Ginger, Rudy, Ginger 2.0, and Rudy 2.0 [1]) decide to tackle some of Myami's computational problems and stop the increasingly nefarious activities of Töhm Eric.

[1] For the sake of clarity, Ginger 2.0 and Rudy 2.0 change their names to Potata Lovelace and Alan Turnip. They will come to be remembered as pioneers in the theory of yamputation.

Their first step is to establish a map system so that they can travel along a directed path between any two meeting points in Myami. They realize that

this is actually a directed graph problem and construct the language:

MyamiMap = { ⟨G, v_1, v_2⟩ | G is a directed graph that has a path from v_1 to v_2 and a path from v_2 to v_1 }

Their intuition tells them that this language might be NL-complete. Prove to them that it is, by giving a logarithmic-space reduction from PATH.

Lab Problem 3

A probabilistic Turing machine is a Turing machine that, in addition to the usual tape, has a second tape filled with random bits. Such a machine may sometimes accept and sometimes reject the same input x, depending on the content of the random tape. The concept of a probabilistic Turing machine gives rise to a whole new set of complexity classes, namely the probabilistic polynomial-time complexity classes. Some of the main ones are BPP, RP, coRP, and ZPP [2], and they are defined as follows:

[2] For the curious: BPP stands for bounded-error probabilistic polynomial time, RP stands for randomized polynomial time, and ZPP stands for zero-error probabilistic polynomial time!

A language L is in BPP if there exists a probabilistic Turing machine M that runs in polynomial time such that the following two conditions hold:

Completeness: For every x ∈ L, Pr[M accepts x] ≥ 2/3.
Soundness: For every x ∉ L, Pr[M accepts x] ≤ 1/3.

This Turing machine M is called a Monte Carlo algorithm for L.

A language L is in RP if there exists a probabilistic Turing machine M that runs in polynomial time such that the following two conditions hold:

Completeness: For every x ∈ L, Pr[M accepts x] ≥ 2/3.
Perfect soundness: For every x ∉ L, Pr[M accepts x] = 0.

A language L is in coRP if its complement is in RP. Equivalently, a language L is in coRP if there exists a probabilistic Turing machine M that runs in polynomial time such that the following two conditions hold:

Perfect completeness: For every x ∈ L, Pr[M accepts x] = 1.
Soundness: For every x ∉ L, Pr[M accepts x] ≤ 1/3.

A language L is in ZPP if there exists a probabilistic Turing machine M that, in addition to the usual accepting and rejecting states, has an "unsure" state in which the machine can halt. M runs in polynomial time and at termination is in the accept, reject, or unsure state, such that the following two conditions hold:

Completeness: For every x ∈ L, Pr[M accepts x] ≥ 2/3, Pr[M rejects x] = 0, and Pr[M is unsure on x] ≤ 1/3.
Soundness: For every x ∉ L, Pr[M accepts x] = 0, Pr[M rejects x] ≥ 2/3, and Pr[M is unsure on x] ≤ 1/3.

a. Show that P ⊆ RP ⊆ BPP and P ⊆ coRP ⊆ BPP.

b. Show that ZPP = RP ∩ coRP.

c. Using either the given definition of ZPP or the one you just derived in b., demonstrate that the following definition is also equivalent: A language L is in ZPP if there exists a probabilistic Turing machine M (which does not have unsure states) and some polynomial p such that:

Runtime: For all inputs x of length n, E[runtime of M on x] ∈ O(p(n)).
Completeness: For every x ∈ L, Pr[M accepts x] = 1.
Soundness: For every x ∉ L, Pr[M accepts x] = 0.
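To illustrate the idea behind combining one-sided-error machines into a zero-error one, here is a small Python sketch. It uses artificial RP-style and coRP-style testers for a toy language (multiples of 3); the function names and the language are made up for illustration. An accept from the RP machine and a reject from the coRP machine are both always trustworthy, so looping until one of them occurs gives an always-correct (Las Vegas) answer whose running time is polynomial in expectation.

```python
import random

# Toy language: L = { n : n is a multiple of 3 }. The two testers below have
# artificial one-sided error, mimicking RP and coRP machines for L.

def rp_machine(n, rng):
    """Never accepts n not in L; accepts n in L with probability 2/3."""
    if n % 3 != 0:
        return False
    return rng.random() < 2 / 3

def corp_machine(n, rng):
    """Always accepts n in L; accepts n not in L with probability at most 1/3."""
    if n % 3 == 0:
        return True
    return rng.random() < 1 / 3

def zpp_decide(n, rng):
    """Zero-error decision: loop until a trustworthy answer appears.
    Each iteration halts with probability at least 2/3, so the expected
    number of iterations is at most 3/2."""
    while True:
        if rp_machine(n, rng):        # RP acceptance has no false positives
            return True
        if not corp_machine(n, rng):  # coRP rejection has no false negatives
            return False

rng = random.Random(0)
print([zpp_decide(n, rng) for n in range(6)])
# always [True, False, False, True, False, False], regardless of the seed
```

The loop is exactly the ZPP-from-RP∩coRP simulation: it can be wrong on neither side, and by Markov's inequality a truncated run that halts in the "unsure" state when time runs out recovers the unsure-state definition given above.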