
6.841 Advanced Complexity Theory  February 28, 2005

Lecture 8

Lecturer: Madhu Sudan    Scribe: Arnab Bhattacharyya

1 A New Theme

In the past few lectures, we have concentrated on non-uniform models of computation and focused our energies on trying to prove lower bounds for various kinds of resources. But here we are stuck: proving tighter lower bounds seems to be an extremely difficult problem. For example, we do not know how to separate randomized polynomial-time algorithms from nondeterministic exponential-time algorithms, even though intuitively they seem clearly different. So we will now shift our attention in order to explore other goals of complexity theory and to discuss other interesting aspects of computation.

2 Motivation

Consider the MinDNF problem. Informally speaking, it is the problem of deciding whether a given boolean expression can be written as a small DNF expression.¹ This problem frequently comes up in VLSI design, where we want to minimize the number of gates in a boolean circuit. The size |ψ| of a DNF expression ψ is defined as the number of literals in the expression. More formally, we can define the following language to correspond to the MinDNF problem:

MINDNF = {(φ, k) : φ is a CNF expression and ∃ a DNF expression ψ s.t. |ψ| ≤ k and ψ is equivalent to φ}

MINDNF ∈ PSPACE, but it is not known whether MINDNF is in NP or in coNP. However, it seems easier than some other PSPACE problems. Let us try to understand this more carefully.

If a language L can be decided by a deterministic TM in polynomial time, then L is in P. Adding nondeterminism appears to make our computations more powerful. In a usual nondeterministic TM, a branching computation accepts if any of the forked computations accepts; in other words, each node in the computation tree ORs the computations of its children. Let us call such a TM an ∃-TM. Clearly, if an ∃-TM accepts L in polynomial time, then L is in NP.
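To make the definition concrete, here is a tiny brute-force sketch in Python (the function names and the clause/term representation are my own, not the lecture's). It decides (φ, k) ∈ MINDNF by enumerating every candidate DNF ψ with at most k literals and checking agreement with φ on all 2^n assignments, directly mirroring the "∃ψ ∀x" structure of the problem; it is wildly exponential and meant only for intuition.

```python
from itertools import combinations, product

def eval_cnf(cnf, a):
    # cnf: list of clauses; each clause is a list of (var, polarity) literals
    return all(any(a[v] == pol for v, pol in clause) for clause in cnf)

def eval_dnf(dnf, a):
    # dnf: collection of terms; each term is a tuple of (var, polarity) literals
    return any(all(a[v] == pol for v, pol in term) for term in dnf)

def terms(n, max_len):
    # all non-contradictory conjunctions of at most max_len literals
    lits = [(v, pol) for v in range(n) for pol in (True, False)]
    for size in range(1, max_len + 1):
        for t in combinations(lits, size):
            if len({v for v, _ in t}) == size:  # no variable used twice
                yield t

def min_dnf(cnf, n, k):
    # exists a DNF psi with <= k literals s.t. forall x: psi(x) = phi(x)?
    assignments = list(product([False, True], repeat=n))
    ts = list(terms(n, k))
    for m in range(0, k + 1):                 # number of terms in psi
        for psi in combinations(ts, m):
            if sum(len(t) for t in psi) > k:  # total literal count
                continue
            if all(eval_dnf(psi, a) == eval_cnf(cnf, a) for a in assignments):
                return True
    return False
```

For example, with φ = x₁ ∧ x₂ encoded as the CNF [[(0, True)], [(1, True)]], the search finds the two-literal DNF x₁ ∧ x₂ but no one-literal equivalent.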
Similarly, a ∀-TM is a nondeterministic TM where a branching computation succeeds only if all of the forked computations accept. If L is accepted by a ∀-TM in polynomial time, then L is in coNP.

What happens if both existential and universal nondeterministic steps are present in a TM? Clearly such a machine can decide TQBF in polynomial time, because the machine can take an existential step for each ∃ quantifier and a universal step for each ∀ quantifier in the TQBF input formula. So a TM with an unbounded amount of alternation between existential and universal steps can decide any PSPACE problem in polynomial time. Such a machine seems too powerful. Let us try to restrict its computational power by restricting the amount of alternation permitted.

Suppose a TM is such that its configuration graph can be cut into two subgraphs G₁ and G₂: all the nodes in G₁ are existential, all the nodes in G₂ are universal, and no edge goes from a node in G₂ to a node in G₁. In other words, there is exactly one alternation, from existential nondeterminism to universal nondeterminism. Observe that MINDNF can be decided by such a TM, because MINDNF can be formulated as:

∃ψ ∀x ∈ {0,1}^n (φ(x) = ψ(x) ∧ |ψ| ≤ k)

The class of languages that can be decided by a TM with one alternation from ∃-steps to ∀-steps was introduced by Meyer and Stockmeyer in 1973 as Σ₂ᴾ. As we have just indicated, MINDNF ∈ Σ₂ᴾ. Meyer and Stockmeyer presented the following complete problem for Σ₂ᴾ:

{φ : φ is a CNF expression and ∃x ∀y, φ(x, y) = true}

¹A boolean expression φ is in disjunctive normal form if φ = D₁ ∨ ⋯ ∨ Dₙ where n ≥ 1 and each Dᵢ is a conjunction of boolean literals. An example is (x₁ ∧ x₂ ∧ x₃) ∨ (x₃ ∧ x₄).
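The quantifier pattern of this complete problem can be evaluated by brute force in exponential time. A minimal sketch (the predicate interface is my own choice, not the lecture's), in which any(...) plays the existential player and all(...) the universal one:

```python
from itertools import product

def sigma2(phi, nx, ny):
    """Brute-force 'exists x in {0,1}^nx, forall y in {0,1}^ny: phi(x, y)'."""
    return any(
        all(phi(x, y) for y in product([False, True], repeat=ny))
        for x in product([False, True], repeat=nx)
    )
```

For instance, sigma2(lambda x, y: x[0] or not y[0], 1, 1) is True (set x = 1), while sigma2(lambda x, y: y[0], 1, 1) is False (no choice of x makes the formula hold for all y).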

It turns out that MINDNF is also a complete problem for Σ₂ᴾ, although this was shown only in 2000, by Umans. Meyer and Stockmeyer also defined Π₂ᴾ as the class of complements of Σ₂ᴾ languages, and similarly defined Σ₃ᴾ, Π₃ᴾ, and so on.

3 The Polynomial Hierarchy

Definition 1 Σᵢᴾ is the class of languages decided in polynomial time by machines that make at most i − 1 alternations between existential and universal steps, beginning with existential steps.

Σᵢᴾ has the following complete problem:

{φ : φ is in CNF and ∃x₁ ∀x₂ ⋯ Qᵢxᵢ such that φ(x₁, …, xᵢ) = true}

where Qᵢ is ∃ if i is odd and ∀ if i is even. There is a similar definition for the complement class:

Definition 2 Πᵢᴾ is the class of languages decided in polynomial time by machines that make at most i − 1 alternations, beginning with universal steps.

Πᵢᴾ has the following complete problem:

{φ : φ is in CNF and ∀x₁ ∃x₂ ⋯ Qᵢxᵢ such that φ(x₁, …, xᵢ) = true}

Definition 3 PH = ⋃ᵢ Σᵢᴾ

Directly from the definitions, we have:

NP = Σ₁ᴾ and coNP = Π₁ᴾ

Σᵢᴾ ⊆ PSPACE and Πᵢᴾ ⊆ PSPACE

Σᵢᴾ, Πᵢᴾ ⊆ Σᵢ₊₁ᴾ ∩ Πᵢ₊₁ᴾ
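The complete problems above generalize the Σ₂ pattern to any number of alternating quantifier blocks. The following brute-force sketch (block sizes and the predicate interface are illustrative assumptions of mine) evaluates such a prefix recursively:

```python
from itertools import product

def sigma(phi, blocks):
    """Evaluate 'exists x1 forall x2 exists x3 ... phi(x1, ..., xi)', where
    blocks[j] is the bit-length of quantifier block j+1.  Even-indexed blocks
    are existential and odd-indexed ones universal, giving i - 1 alternations;
    swapping any/all below would give the Pi_i pattern instead."""
    def rec(prefix, j):
        if j == len(blocks):
            return phi(*prefix)
        branch = any if j % 2 == 0 else all
        return branch(rec(prefix + [x], j + 1)
                      for x in product([False, True], repeat=blocks[j]))
    return rec([], 0)
```

For example, sigma(lambda x1, x2: x1[0] or not x2[0], [1, 1]) recovers the Σ₂ evaluation above, while sigma(lambda x1, x2: x1[0] != x2[0], [1, 1]) is False, since no fixed x₁ differs from every x₂.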

Proof Sketch: Every string in a language in Σᵢᴾ can be viewed as a string in a language in Σᵢ₊₁ᴾ by adding a vacuous quantifier over a new variable at the end of the list of quantifiers, and as a string in a language in Πᵢ₊₁ᴾ by adding a vacuous quantifier at the beginning of the list. The argument for Πᵢᴾ is similar.

If there exists an i such that Σᵢᴾ = Πᵢᴾ, then for all j > i, Σⱼᴾ = Σᵢᴾ.

Proof Sketch: Clearly, it suffices to show that Σᵢᴾ = Πᵢᴾ implies Σᵢ₊₁ᴾ = Σᵢᴾ. Suppose L ∈ Σᵢ₊₁ᴾ. Using the complete problem for this class, φ ∈ L iff

∃x₁ ∀x₂ ∃x₃ ⋯ Qᵢ₊₁xᵢ₊₁ φ(x₁, x₂, x₃, …, xᵢ₊₁)

The predicate following ∃x₁ is a Πᵢᴾ predicate of x₁. By assumption, Πᵢᴾ = Σᵢᴾ, so it can be rewritten in Σᵢ form, turning the condition into

∃x₁ ∃x₂′ ∀x₃′ ⋯ Qᵢ′xᵢ′ φ′(x₁, x₂′, x₃′, …, xᵢ′)

where φ′ is a slightly modified boolean function. By joining the two adjacent existential quantifiers into one, we see that L ∈ Σᵢᴾ. So Σᵢ₊₁ᴾ = Σᵢᴾ.

The conclusion of the last claim, that Σⱼᴾ = Σᵢᴾ for all j > i, is known as a collapse of the polynomial hierarchy. Complexity theorists believe that PH does not collapse. Observe that this is a very strong belief, stronger than the P ≠ NP or NP ≠ coNP conjectures. However, there is not much independent evidence supporting it, and one can very well imagine it being false. The belief that PH does not collapse is important because the negation of other unproven conjectures would contradict it. For example, if NP were shown to have polynomial-sized circuits, then PH would collapse. If randomized polynomial-time algorithms were shown to coincide with deterministic polynomial-time algorithms, then PH would collapse. Implications of this type make the non-collapse of PH a central working hypothesis for complexity theorists.
4 Alternation as a Resource

To explore how adding alternation affects computational power, we define the following complexity classes:

ATIME(t(n)) = {L : L is decided by a machine with unbounded alternation using O(t(n)) time}

ASPACE(s(n)) = {L : L is decided by a machine with unbounded alternation using O(s(n)) space}

ATISP(a, t, s) = {L : L is decided by a machine using O(a) alternations, O(t) time, and O(s) space}

Even though NL = coNL might suggest that alternation would not be very helpful in reducing space complexity, unbounded alternation is very powerful. We have the following results:

Theorem 4 SPACE(s) ⊆ ATIME(s²) ⊆ SPACE(s²)

Proof Sketch: The proof of the first containment is very much like that of Savitch's theorem. We construct an alternating TM to simulate a deterministic TM M that uses space s. M has at most 2^(ks) configurations, for some constant k, so we have to determine whether M can go from its initial configuration c_i to its final configuration c_f within that many steps. An alternating procedure that tests whether configuration c₂ is reachable from c₁ within time t first branches existentially to guess a midpoint configuration c_m between c₁ and c₂, and then branches universally to test recursively that both the path from c₁ to c_m and the path from c_m to c₂ can be traversed in time t/2. So, using this alternating procedure, an

alternating TM can test whether M can go from c_i to c_f in the required number of steps. The time needed is O(s) to write down a configuration at each recursive step, times the depth of the recursion tree, log 2^(ks) = O(s). So the total time needed is O(s²).

For the second containment, we can simulate the alternating machine directly with a deterministic TM; for example, the deterministic TM can do a depth-first search of the alternation tree to determine which nodes in the tree are accepting. The depth of the recursion stack in the DFS equals the time complexity of the alternating algorithm, and at each level we need only a constant amount of space to record which nondeterministic choice was made and whether the branching was existential or universal. Thus the space complexity of the deterministic simulation is of the same order as the time complexity of the alternating machine.

And even more surprisingly,

Theorem 5 TIME(2^s) ⊆ ASPACE(s) ⊆ TIME(2^(O(s)))

Proof Sketch: We show the second containment first. Suppose we have an alternating TM using space O(s). The depth of the alternation tree is at most 2^(O(s)), so the tree can have as many as 2^(2^(O(s))) nodes. But the number of distinct configurations is only 2^(O(s)). So if we collapse identical configurations in each level of the alternation tree into single nodes, we obtain a DAG with at most 2^(O(s)) × 2^(O(s)) nodes, and 2^(O(s)) time is enough to traverse this graph.

The first containment is a bit harder to show. Consider the tableau of a deterministic TM A with time complexity O(2^s). We construct an alternating machine M that checks whether A has contents σ in cell i of the tableau after t steps of computation. To do this, M first existentially guesses contents σ₁, σ₂, σ₃ for cells i−1, i, i+1 at time t−1 and verifies that they would yield σ according to A's transition function. Next, M branches universally to check recursively that A indeed has contents σ₁, σ₂, and σ₃ in cells i−1, i, and i+1, respectively, at time t−1.
At any one point of this checking, M needs to store only a constant number of (cell, time, symbol) triples, each of which takes O(s) bits. Hence M runs in space O(s), so the language of A is in ASPACE(s).

5 Conclusion

Next lecture, we will show the following:

SAT ∈ L ⇒ SAT ∉ TIME(n^(1+ε)) (for some ε > 0)

NP ⊆ P/poly ⇒ PH collapses

We can attempt to apply some of the results we have learned here to nonmathematical issues. For example, consider a debate between two contestants in an election. Say candidate 1 claims x ∈ L and candidate 2 claims x ∉ L. Computationally, candidate 1 is like an existential step in an NTM and candidate 2 is like a universal step, and the interaction with the audience can be modeled as a polynomial-time process. So, with an unbounded number of alternations between the two candidates, the debate system has the power of an ATIME(poly) TM. By Theorem 4, therefore, such a debate system decides exactly the languages in PSPACE. If the candidates do not have anything to say in the debate (i.e., there is no alternation), then the debate system decides languages in P. And if the contestants alternate a bounded number of times, then the debate decides a language in PH. Thus, if it turns out that PH = PSPACE, then in some sense a full-fledged debate will only be as interesting as a debate with a fixed number of rounds.

The complexity class PSPACE has been used extensively to analyze the complexity of games. Whenever winning or losing a game is a function only of the current board configuration (as in Go, but not in chess²), the game is in PSPACE. This is so because, fundamentally, alternation describes a game between two players, the player with existential moves and the player with universal moves; because ATIME(poly) = PSPACE, any such game in which a polynomial number of configurations are visited from

²If a configuration repeats three times in chess, then the game ends in a draw.
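The midpoint trick behind Theorem 4 can be made concrete. In the sketch below (the graph representation is my own), the alternating machine's existential guess of a midpoint becomes any(...) and its universal check of the two half-paths becomes all(...); a deterministic simulation simply enumerates both:

```python
def reach(adj, c1, c2, t):
    """Can configuration c2 be reached from c1 in at most t steps of the
    configuration graph adj (a dict: node -> list of successor nodes)?
    Recursion depth is O(log t), mirroring the O(s) depth in the proof."""
    if t == 0:
        return c1 == c2
    if t == 1:
        return c1 == c2 or c2 in adj.get(c1, ())
    nodes = set(adj) | {w for ws in adj.values() for w in ws}
    half = t // 2
    return any(all([reach(adj, c1, cm, half),        # first half  (universal branch)
                    reach(adj, cm, c2, t - half)])   # second half (universal branch)
               for cm in nodes)                      # guessed midpoint (existential)
```

For instance, with adj = {'a': ['b'], 'b': ['c']}, reach finds the path a → b → c within 2 steps but not within 1.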

start to end is in PSPACE. Now contrast such hard games with a game like Solitaire, where a computer randomly deals out a deck of cards. Is playing Go against a human expert as intellectually challenging as playing Solitaire against a random distribution of cards? Surprisingly, it turns out that there are distributions of cards for which Solitaire is as challenging as Go! Configurations in Go can be reduced to Solitaire card configurations (although it is not clear whether such configurations would actually arise in a real Solitaire game), so a random instance of Solitaire can be shown to be hard.