Complexity Theory. Jörg Kreiker, Chair for Theoretical Computer Science (Prof. Esparza), TU München. Summer term 2010.

Lecture 6: coNP

Agenda
- coNP
- the importance of P vs. NP vs. coNP
- neither in P nor NP-complete: Ladner's theorem
- wrap-up of Lectures 1-6
- muddiest point

On coNP: Reminder
- L ⊆ {0,1}* is in coNP iff {0,1}* \ L ∈ NP
- example: the complement of SAT contains the not-well-formed formulas and the unsatisfiable formulas
- does the complement of SAT have polynomial certificates?
- not known: it is an open problem whether NP is closed under complement
- note that P is closed under complement; compare with closure under complement for DFAs vs. NFAs

On coNP: For-all certificates
- like for NP, there is a characterization of coNP in terms of certificates; for coNP it is dual: the check must succeed for all certificates
- 3SAT: to prove unsatisfiability one must check all assignments; for satisfiability a single satisfying assignment suffices

Theorem (coNP certificates). A language L ⊆ {0,1}* is in coNP iff there exist a polynomial p and a polynomial-time TM M such that for all x ∈ {0,1}*: x ∈ L ⟺ for all u ∈ {0,1}^{p(|x|)}, M(x, u) = 1.
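To make the for-all characterization concrete, here is a minimal sketch (not from the lecture; the encoding of a CNF formula as a list of clauses of signed integers is a hypothetical choice): every assignment plays the role of a certificate u, and the formula belongs to the complement of SAT exactly when the polynomial-time check accepts all of them.

```python
from itertools import product

# Hypothetical encoding: a CNF formula is a list of clauses,
# each clause a list of signed ints (+j for x_j, -j for not x_j).

def check(clauses, assignment):
    """The polynomial-time verifier M(x, u): accept iff assignment u
    does NOT satisfy the formula x."""
    return not all(any((lit > 0) == assignment[abs(lit) - 1] for lit in clause)
                   for clause in clauses)

def in_co_sat(clauses, n):
    """x is in the complement of SAT iff M(x, u) = 1 for ALL certificates u,
    i.e. the formula is unsatisfiable (exponential loop, only for tiny n)."""
    return all(check(clauses, u) for u in product([False, True], repeat=n))

# (x) and (not x) together are unsatisfiable; (x or y) is satisfiable.
print(in_co_sat([[1], [-1]], 1))   # True
print(in_co_sat([[1, 2]], 2))      # False
```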

On coNP: Completeness
- like for NP, one can define coNP-hardness and coNP-completeness
- L is coNP-complete iff L ∈ coNP and every problem in coNP is polynomial-time Karp-reducible to L
- classical example: Tautology = {ϕ | ϕ is a Boolean formula that is true under every assignment}
- example: x ∨ ¬x ∈ Tautology; proof?
- note that the complement of L is coNP-complete if L is NP-complete
- the complement of SAT is coNP-complete
- Tautology is coNP-complete (reduction from the complement of SAT by negating the formula)
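A tiny sketch of the reduction just mentioned, again not from the lecture and using Python expression strings purely for illustration: negating the formula maps the complement of SAT to Tautology, since ϕ is unsatisfiable iff ¬ϕ holds under every assignment.

```python
from itertools import product

# Hypothetical representation: a formula is a Python boolean expression over
# variables x1..xn, given as a string, e.g. "x1 and not x2".

def reduce_unsat_to_tautology(phi):
    """The Karp reduction: phi is unsatisfiable iff (not phi) is a tautology."""
    return f"not ({phi})"

def is_tautology(psi, n):
    """Brute-force tautology check (exponential; only to illustrate the claim)."""
    return all(eval(psi, {f"x{i+1}": v for i, v in enumerate(vals)})
               for vals in product([False, True], repeat=n))

phi = "x1 and not x1"                                     # unsatisfiable
print(is_tautology(reduce_unsat_to_tautology(phi), 1))    # True
print(is_tautology("x1 or not x1", 1))                    # True (x ∨ ¬x from the slide)
```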

On coNP: Regular Expression Equivalence
Remember yesterday's teaser! A regular expression over {0,1} is defined by
r ::= 0 | 1 | rr | r ∪ r | r*
The language defined by r is written L(r).
- let ϕ = C_1 ∧ ... ∧ C_m be a Boolean formula in 3CNF over variables x_1, ..., x_n
- compute from ϕ a regular expression f(ϕ) = α_1 ∪ α_2 ∪ ... ∪ α_m
- α_i = γ_i1 ... γ_in, where γ_ij = 0 if x_j ∈ C_i, γ_ij = 1 if ¬x_j ∈ C_i, and γ_ij = (0 ∪ 1) otherwise
- example: (x ∨ y ∨ ¬z) ∧ (¬y ∨ z ∨ w) is transformed to (001(0 ∪ 1)) ∪ ((0 ∪ 1)100)
- observe: ϕ is unsatisfiable iff L(f(ϕ)) = {0,1}^n
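A sketch of the construction f(ϕ), not from the lecture: 3CNF clauses are encoded as lists of signed integers (a hypothetical encoding), ∪ is written as the regex alternation |, and Python's re module is used to sanity-check the observation above on the worked example.

```python
import itertools
import re

def clause_to_alpha(clause, n):
    """Build α_i: position j is '0' if x_j occurs positively, '1' if negatively,
    '(0|1)' if x_j does not occur in the clause (we write '|' for ∪)."""
    gamma = []
    for j in range(1, n + 1):
        if j in clause:
            gamma.append("0")
        elif -j in clause:
            gamma.append("1")
        else:
            gamma.append("(0|1)")
    return "".join(gamma)

def f(clauses, n):
    """The regular expression f(ϕ) = α_1 ∪ ... ∪ α_m, as a Python regex string."""
    return "|".join(clause_to_alpha(c, n) for c in clauses)

def satisfies(assignment, clauses):
    return all(any((lit > 0) == assignment[abs(lit) - 1] for lit in clause)
               for clause in clauses)

# Worked example from the slide: (x ∨ y ∨ ¬z) ∧ (¬y ∨ z ∨ w) over x, y, z, w.
clauses, n = [[1, 2, -3], [-2, 3, 4]], 4
regex = f(clauses, n)
print(regex)  # 001(0|1)|(0|1)100

# Sanity check: a 0/1 string is matched by f(ϕ) iff the corresponding assignment
# falsifies ϕ, so ϕ is unsatisfiable iff f(ϕ) matches every word in {0,1}^n.
for bits in itertools.product([False, True], repeat=n):
    word = "".join("1" if b else "0" for b in bits)
    assert bool(re.fullmatch(regex, word)) == (not satisfies(bits, clauses))
```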

On coNP: Regular expressions and computational complexity
- the previous slide establishes: the complement of 3SAT ≤_p RegExpEq, that is, regular expression equivalence is coNP-hard
- it is coNP-complete for expressions without *, because it suffices to check, for all words of length at most n, whether they belong to both languages (each membership test is polynomial via an NFA construction)
- the problem becomes PSPACE-complete when * is added
- it becomes even harder (EXPSPACE-complete) when squaring is added as well


P vs. NP vs. coNP: Open and known problems
OPEN
- P = NP?
- NP = coNP?
KNOWN
- if an NP-complete problem is in P, then P = NP
- P ⊆ NP ∩ coNP
- if L ∈ coNP and L is NP-complete, then NP = coNP
- if P = NP, then P = NP = coNP
- if NP ≠ coNP, then P ≠ NP
- if EXP ≠ NEXP, then P ≠ NP (equalities scale up, inequalities scale down)

P vs. NP vs. coNP: What if P = NP?
- one of the most important open problems
- computational utopia: SAT has a polynomial-time algorithm, and so do thousands of other problems (due to reductions and completeness)
- finding solutions is as easy as verifying them
- guessing can be done deterministically
- decryption is as easy as encryption
- randomization can be de-randomized

P vs. NP vs. coNP: What if NP = coNP?
Then problems that don't seem to have short certificates would have them, for example:
- tautology and unsatisfiability
- unsatisfiable ILPs
- regular expression equivalence

P vs. NP vs. coNP: How to cope with NP-complete problems?
- ignore the hardness; it may still work in practice (see SAT solvers)
- modify your problem (2SAT, 2-Coloring); see the sketch after this list
- NP-completeness talks about worst cases and exact solutions, so:
- try average cases
- try approximations
- randomize
- explore special cases (e.g. TSP)
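As announced above, a minimal sketch (not from the lecture) of why restricting a problem can help: 2-Coloring, unlike 3-Coloring, is decidable in polynomial time by a breadth-first search that checks bipartiteness.

```python
from collections import deque

def two_colorable(n, edges):
    """Decide 2-Coloring (bipartiteness) of a graph with vertices 0..n-1
    in polynomial time by BFS; 3-Coloring, in contrast, is NP-complete."""
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    color = [None] * n
    for s in range(n):
        if color[s] is not None:
            continue
        color[s] = 0
        queue = deque([s])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if color[v] is None:
                    color[v] = 1 - color[u]
                    queue.append(v)
                elif color[v] == color[u]:
                    return False
    return True

print(two_colorable(4, [(0, 1), (1, 2), (2, 3), (3, 0)]))  # True: even cycle
print(two_colorable(3, [(0, 1), (1, 2), (2, 0)]))          # False: odd cycle
```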

P vs. NP vs. coNP: In praise of reductions
- reductions help when lower bounds are hard to come by
- reductions helped prove NP-completeness for thousands of natural problems
- in fact, most natural problems are either in P or NP-complete (notable exceptions: Factoring and Graph Isomorphism)
- but, unless P = NP, problems that are neither do exist (Ladner's theorem, next)


Ladner's theorem: NP-intermediate languages exist!
Theorem (Ladner). If P ≠ NP, then there exists a language L ∈ NP \ P that is not NP-complete.

Ladner's theorem: Proof, essential steps
- for a function F : ℕ → ℕ, define SAT_F to be {ϕ01^{n^{F(n)}} | ϕ ∈ SAT, n = |ϕ|}
- now define a particular function H and fix SAT_H
- H(n) is the smallest i < log log n such that for every x ∈ {0,1}* with |x| ≤ log n, M_i outputs SAT_H(x) within i·|x|^i steps, where M_i is the i-th TM (in an enumeration of TM descriptions)
- if no such i exists, then H(n) = log log n
- (the recursion is well-founded, since H(n) only refers to SAT_H on inputs of length at most log n)
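A toy sketch, not from the lecture, just to make the shape of SAT_H instances concrete; the real H is defined recursively via an enumeration of TMs, so it is stubbed out here as a constant function.

```python
def H(n):
    """Stub for the real, recursively defined H; here just a constant."""
    return 2

def pad(phi):
    """Map a SAT instance phi (a string) to the corresponding SAT_H instance
    phi 0 1^(n^H(n)), where n = |phi|."""
    n = len(phi)
    return phi + "0" + "1" * (n ** H(n))

print(len(pad("x" * 10)))  # 10 + 1 + 10^2 = 111
```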

Ladner's theorem: Proof, essential steps (continued)
Using the definition of SAT_H one can show:
1. SAT_H ∈ P iff H(n) ∈ O(1)
2. SAT_H ∉ P implies lim_{n→∞} H(n) = ∞
- If SAT_H ∈ P, then H(n) ≤ C for some constant C. This implies that SAT is also in P (strip the padding), which implies P = NP. Contradiction!
- If SAT_H is NP-complete, then there is a reduction from SAT to SAT_H running in time O(n^i) for some constant i. Since H(n) → ∞, for large n this maps SAT instances of size n to SAT_H instances of size smaller than n^{H(n)}, so the embedded SAT formula must be much smaller than n; recursing on it puts SAT in P. Contradiction!

Wrap-up: What you should know by now (1)
- deterministic TMs capture the intuitive notion of algorithms and computability
- the universal TM: a general-purpose computer, or an interpreter
- some problems are uncomputable, a.k.a. undecidable, like the halting problem; this is proved by diagonalization
- the complexity class P captures tractable problems
- P is robust under tweaks of the TM definition (number of tapes, alphabet size, obliviousness, universal simulation)
- non-deterministic TMs can be simulated by deterministic TMs in exponential time
- NP: non-deterministic polynomial time, equivalently polynomially checkable certificates

Wrap-up: What you should know by now (2)
- reductions allow us to define hardness and completeness of problems
- complete problems are the hardest within a class: if they can be solved efficiently, the whole class can
- NP-complete problems: 3SAT (by Cook-Levin); IndSet, 3-Coloring, ILP (by reduction from 3SAT)
- SAT solving is practically useful and often feasible in practice
- coNP-complete problems: Tautology, star-free regular expression equivalence
- unless P = NP, there are problems in NP that are neither in P nor NP-complete (Ladner)

Wrap-up: What's next?
- space classes
- space and time hierarchy theorems
- generalization of NP and coNP: the polynomial hierarchy
- probabilistic TMs, randomization
- complexity and proofs
- descriptive complexity
