Advanced topic: Space complexity


CSCI 3130: Formal Languages and Automata Theory
Siu On Chan, The Chinese University of Hong Kong, Fall 2016

Review: time complexity
We have looked at how long it takes to solve various problems.
[Diagram: complexity classes L ⊆ P ⊆ NP with example problems (PATH, SAT, CLIQUE, IS, VC, ...) placed in them]
What about the amount of memory? We measure memory usage (space) by the number of tape cells used.
Questions one may ask:
- If a problem can be solved quickly, can it be solved with little memory?
- If a problem can be solved with little memory, can it be solved quickly?

Space complexity
The space complexity of a Turing machine M is the function s_M(n):
  s_M(n) = maximum number of cells that M ever reads, over all inputs of length n
Example: L = {w#w | w ∈ {a, b}*}
M: On input x, until you reach #:
- read and cross off the first a or b before #
- read and cross off the first a or b after #
- if there is a mismatch, reject
If all symbols except # are crossed off, accept.
Space complexity: n + 1 (the +1 is because M may scan the blank symbol just after the input).

Sublinear space
If we assume the Turing machine has two tapes:
1. Input tape: contains the input and is read-only
2. Work tape: initially empty; only the cells used here are counted
(We will assume this model for the rest of this lecture.)
Then L = {w#w | w ∈ {a, b}*} can be solved in O(log n) space.
Idea: keep a counter storing the number of symbols matched so far.
A counter can represent a number of size m using O(log m) bits.
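As a concrete illustration of this counter idea, here is a minimal Python sketch (the names are mine, not from the slides) that decides L while keeping only a few integer indices, i.e. O(log n) bits of working memory; the input string plays the role of the read-only input tape.

```python
def in_L(x: str) -> bool:
    """Decide L = {w#w : w in {a,b}*} using only O(log n) bits of work memory.

    The string x acts as the read-only input tape; the only working storage
    is a handful of integer counters/indices, each of O(log n) bits.
    """
    if x.count("#") != 1:            # exactly one separator symbol
        return False
    mid = x.index("#")
    if mid != len(x) - mid - 1:      # both halves must have equal length
        return False
    for i in range(mid):             # compare position i of each half
        c1, c2 = x[i], x[mid + 1 + i]
        if c1 not in "ab" or c1 != c2:
            return False
    return True
```

For example, in_L("ab#ab") returns True and in_L("ab#ba") returns False.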

Logarithmic space
The smallest reasonable amount of space is logarithmic in the input length: just keeping one counter/pointer already requires about log n bits of memory!
A language L is in the class L if L can be decided by a deterministic Turing machine (with read-only input tape) in O(log n) space.

Time vs space
If a Turing machine runs in time t_M(n), how much space can it use?
- At most as much space as the number of time steps: s_M(n) ≤ t_M(n)
If a Turing machine uses space s_M(n), how long can it take?
- At most exponential time in the amount of space used: t_M(n) ≤ 2^O(s_M(n)), provided s_M(n) ≥ log n
Reason:
- constant number of possibilities (say K) for each work-tape symbol
- n possible input head locations
- s_M(n) possible work head locations
- total number of configurations ≤ n · s_M(n) · K^s_M(n) ≤ 2^O(s_M(n)) if s_M(n) ≥ log n
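A worked version of this counting argument, as a LaTeX sketch (it assumes a single work tape and a finite state set Q, which the slide leaves implicit):

```latex
% Configurations of M on an input of length n, assuming one work tape
% with alphabet size K and state set Q:
\[
  \#\text{configurations} \;\le\;
  \underbrace{|Q|}_{\text{state}} \cdot
  \underbrace{n}_{\text{input head}} \cdot
  \underbrace{s_M(n)}_{\text{work head}} \cdot
  \underbrace{K^{s_M(n)}}_{\text{work tape contents}}
  \;=\; 2^{O(s_M(n))}
  \qquad \text{when } s_M(n) \ge \log n .
\]
% A halting deterministic machine never repeats a configuration,
% so t_M(n) is at most this number of configurations.
```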

PATH
PATH = {⟨G, s, t⟩ | directed graph G has a directed path from node s to node t}
As we will see, an important problem for space complexity.
How much space is required for solving PATH?
- BFS or DFS uses about n space, where n = |V(G)| (see the sketch below)
- We don't know how to solve PATH in O(log n) space, but we can solve it in O((log n)^2) space
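For contrast, here is a minimal BFS sketch for PATH in Python (my own illustration, not the course's code): it runs in polynomial time, but its visited set may hold a constant fraction of the vertices, i.e. linear space, which is exactly what the next slides improve on.

```python
from collections import deque

def path_bfs(adj: dict, s, t) -> bool:
    """Decide PATH by breadth-first search.

    adj maps every node of G to the list of its out-neighbours.
    Runs in linear time, but the visited set may store Omega(n) nodes,
    so the space used is linear rather than polylogarithmic.
    """
    visited = {s}
    queue = deque([s])
    while queue:
        u = queue.popleft()
        if u == t:
            return True
        for v in adj.get(u, []):
            if v not in visited:
                visited.add(v)
                queue.append(v)
    return False
```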

PATH in O((log n)^2) space
Main idea: recursion!
If t is reachable from s, it must be reachable within n − 1 steps.
Solve the question "Is v reachable from u within k steps?" recursively: try all intermediate nodes w and ask
- "Is w reachable from u within ⌈k/2⌉ steps?"
- "Is v reachable from w within ⌈k/2⌉ steps?"
If the answer is YES to both sub-questions for some w, then v is reachable from u within k steps.

Savitch's algorithm
Recursively answer "Can u reach v within k steps?"

Algorithm PATH(u, v, k):
  if k = 0 then
    return whether u = v
  else if k = 1 then
    return whether u = v or (u, v) ∈ E
  end if
  for every vertex w do
    if PATH(u, w, ⌈k/2⌉) and PATH(w, v, ⌈k/2⌉) then
      return true
    end if
  end for
  return false
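The same recursion written out as a Python sketch (the graph is passed as an adjacency dictionary; the names are illustrative, not from the slides):

```python
from math import ceil

def reach(adj: dict, u, v, k: int) -> bool:
    """Savitch-style recursion: look for a u-to-v path by halving the step budget k.

    adj must have an entry (possibly an empty list) for every vertex.
    Each call stores only u, v, k and the loop variable w, and the recursion
    depth is O(log k), which is where the O((log n)^2) space bound comes from.
    """
    if k == 0:
        return u == v
    if k == 1:
        return u == v or v in adj[u]
    half = ceil(k / 2)
    for w in adj:                      # try every intermediate vertex
        if reach(adj, u, w, half) and reach(adj, w, v, half):
            return True
    return False

def path_savitch(adj: dict, s, t) -> bool:
    """Decide PATH: a simple path from s to t, if one exists, has at most n-1 edges."""
    return reach(adj, s, t, len(adj) - 1)
```

For instance, path_savitch({1: [2], 2: [3], 3: []}, 1, 3) returns True.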

PATH in O((log n)^2) space
Depth of recursion: O(log n)
Additional memory for each level: O(log n), to remember the intermediate node for that level
Unlike time, space can be reused!
Overall space used: O((log n)^2)

Aside: repeated squaring
To compute A^n, how many multiplications are required?
To compute A^n:
- if n = 0, return 1
- if n is even, recursively compute B = A^(n/2) and return B^2
- if n is odd, recursively compute B = A^((n−1)/2) and return B^2 · B
This takes O(log n) multiplications.
When A is the adjacency matrix of a graph (and not a scalar), repeated squaring is analogous to the previous algorithm for PATH; see the sketch below.
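A Python sketch of this recursion (illustrative names; scalar A for simplicity). Replacing A by a Boolean adjacency matrix and * by Boolean matrix product turns it into the reachability analogue mentioned above.

```python
def power(A, n: int):
    """Compute A**n with O(log n) multiplications by repeated squaring."""
    if n == 0:
        return 1                      # multiplicative identity (scalar case)
    B = power(A, n // 2)              # B = A^(n // 2)
    if n % 2 == 0:
        return B * B                  # even n: (A^(n/2))^2
    return B * B * A                  # odd n: (A^((n-1)/2))^2 * A
```

For instance, power(3, 13) == 3 ** 13, using O(log n) multiplications instead of 12.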

Nondeterministic log-space
Why is PATH important?
Analogous to P vs NP, we can consider the nondeterministic analogue of L and ask about L vs NL.
A language L is in NL if L can be decided by a nondeterministic Turing machine (with read-only input tape) in O(log n) space.

NL-completeness
A language B is NL-complete if
1. B is in NL; and
2. every language A in NL log-space reduces to B.
We consider log-space reductions because polynomial-time reductions are too coarse here.
[Diagram: PATH lies in NL but outside L, assuming L ≠ NL]
Theorem: PATH is NL-complete.

PATH is NL-complete
PATH is in NL: a nondeterministic Turing machine guesses a path from s to t. More precisely, the machine remembers only the current node on the path and guesses the next node.
PATH is NL-hard: for any language A in NL,
- let N be a log-space nondeterministic Turing machine for A
- given an input w, construct the directed graph G whose vertices are the configurations of N on w
- let s be the initial configuration and t be the accepting configuration

PATH is NL-hard: details
- Listing all nodes/configurations can be done with O(s_N(n)) space, since each configuration has description length O(s_N(n))
- Checking whether one configuration leads to another (whether one node has an edge to another) can be done in O(s_N(n)) space
- Since s_N(n) = O(log n), constructing ⟨G, s, t⟩ can be done in O(log n) space
- By modifying N, we may assume its accepting configuration is unique
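Schematically, with everything about the machine N abstracted into hypothetical callbacks (configs, successors, which a real reduction would compute in log space), the reduction simply outputs the PATH instance ⟨G, s, t⟩ built from N's configuration graph on input w:

```python
def configuration_graph_instance(configs, successors, start, accept):
    """Output the PATH instance <G, s, t> for a log-space NTM N on input w.

    configs    : all configurations of N on w (each has description length
                 O(log n); here they are simply given as an iterable)
    successors : successors(c) yields the configurations reachable from c
                 in one nondeterministic step of N
    start      : the initial configuration s
    accept     : the unique accepting configuration t
    """
    edges = [(c, d) for c in configs for d in successors(c)]
    return edges, start, accept
```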

Caveat and consequences
Recall: NP = the set of languages having a polynomial-time verifier.
A similar definition (with a log-space verifier) is unlikely to capture NL: intuitively, NL machines do not have enough memory to remember all of their nondeterministic choices.
Since PATH is NL-complete and can be solved in O((log n)^2) space, every problem in NL can be solved in O((log n)^2) space! (Savitch's theorem)
So even though we believe NP-complete problems take exponentially more time than problems in P, space is another story.

Hierarchy theorems

Hierarchy theorem
Given more space, can Turing machines/algorithms solve more problems? For example, are there problems solvable in n^3 space but not in n^2 space?
Given any nice function f: N → N, there is a language decidable in O(f(n)) space but not in o(f(n)) space.
For example, n^3, log n, and n log n are all nice.
(If a function does not always take integer values, such as log n, we round its output down to an integer.)

Space-constructible functions
The technical definition of "nice" is space-constructible:
A function f: N → N, where f(n) ≥ log n, is space-constructible if the function mapping an input w of length n to the binary representation of f(n) is computable by a Turing machine in space O(f(n)).
The space hierarchy theorem is therefore:
Given any space-constructible function f: N → N, there is a language decidable in O(f(n)) space but not in o(f(n)) space.

Corollary
For any a < b, there are functions computable in space O(n^b) but not in space O(n^a).
The statement is intuitive. The hardest part is proving that every Turing machine with less space fails to solve the problem.

The difficult problem
L = {⟨M, w⟩ | Turing machine M rejects ⟨M, w⟩ in space f(n), where n = |⟨M, w⟩|}
Need to show:
1. L cannot be decided in space o(f(n))
2. L can be decided in space O(f(n))
An artificial problem!
For technical reasons, we assume the Turing machines M have a constant-sized tape alphabet (say 4 symbols), independent of n.

Not solvable in space o(f(n))
L = {⟨M, w⟩ | Turing machine M rejects ⟨M, w⟩ in space f(n), where n = |⟨M, w⟩|}
Proof by contradiction: suppose L can be decided in space o(f(n)) by a Turing machine D.
What happens if M = D and w is very long?
When w is very long, n is big, and o(f(n)) will be smaller than f(n).

Not solvable in space o(f(n))
L = {⟨M, w⟩ | Turing machine M rejects ⟨M, w⟩ in space f(n), where n = |⟨M, w⟩|}
Case 1: If D accepts ⟨D, w⟩, then ⟨D, w⟩ ∈ L (because D decides L), hence D rejects ⟨D, w⟩ (by definition of L).
Case 2: If D rejects ⟨D, w⟩, then ⟨D, w⟩ ∉ L (because D decides L), hence D does not reject ⟨D, w⟩ (by definition of L). Since D decides L, D accepts ⟨D, w⟩.
Combining the two cases gives a contradiction.

Solvable in space O(f(n))
L = {⟨M, w⟩ | Turing machine M rejects ⟨M, w⟩ in space f(n), where n = |⟨M, w⟩|}
Idea: simulate M.
- Since M is supposed to use only f(n) space, the simulation can be done using O(f(n)) space.
- Keeping track of M's states takes O(log n) space.
- If M tries to use more than f(n) space, abort the simulation and reject.
Here we use the assumption that f(n) is space-constructible: the simulator needs to know how much tape space to allocate for simulating M.

Solvable in space O(f(n))
L = {⟨M, w⟩ | Turing machine M rejects ⟨M, w⟩ in space f(n), where n = |⟨M, w⟩|}
Idea: simulate M.
Challenge: M may loop forever on ⟨M, w⟩.
Solution:
- A computation in space f(n) goes through at most 2^O(f(n)) distinct configurations.
- If the same configuration appears twice, M loops forever.
- When simulating M, keep track of the number of steps; if it exceeds 2^O(f(n)), the simulator rejects.
- This counter takes up an additional O(f(n)) space.
(A rough sketch of this bounded simulation appears below.)
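Putting the pieces together, here is a rough Python sketch of the decider. All machine-specific details are abstracted into arguments such as step, init and space_used, which would come from a universal simulator; they are placeholders of my own and are not defined on the slides.

```python
def decide_L(step, init, space_used, f, x: str) -> bool:
    """Accept iff the encoded machine M rejects x = <M, w> within space f(|x|).

    step(config)        -> next configuration, or the string "accept"/"reject"
    init(x)             -> initial configuration of M on input x
    space_used(config)  -> number of work-tape cells used in this configuration
    All three are hypothetical hooks into a universal simulator.
    """
    n = len(x)
    bound = f(n)            # computable: f is assumed space-constructible
    # Crude 2^{O(f(n))} cap on the number of distinct configurations
    # (constant-size tape alphabet, say 4 symbols, as assumed earlier):
    max_steps = 16 * n * bound * 4 ** bound
    config = init(x)
    for _ in range(max_steps):
        if space_used(config) > bound:
            return False     # M exceeds the space bound: reject
        config = step(config)
        if config == "reject":
            return True      # M rejects <M, w>: we accept
        if config == "accept":
            return False
    return False             # M is looping, so it never rejects: reject
```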

Conclusion
L = {⟨M, w⟩ | Turing machine M rejects ⟨M, w⟩ in space f(n), where n = |⟨M, w⟩|}
1. L cannot be decided in space o(f(n))
2. L can be decided in space O(f(n))
Why this artificial problem?

Diagonalization
L = {⟨M, w⟩ | Turing machine M rejects ⟨M, w⟩ in space f(n), where n = |⟨M, w⟩|}
We need a problem that is not solvable by any Turing machine running in o(f(n)) space. That is why L involves Turing machines running in small space.

Time hierarchy
There is a similar theorem for time complexity:
Given any time-constructible function f: N → N, there is a language decidable in O(f(n)) time but not in o(f(n)/log n) time.
A function f: N → N, where f(n) ≥ n log n, is time-constructible if the function mapping an input w of length n to the binary representation of f(n) is computable by a Turing machine in time O(f(n)).
L = {⟨M, w⟩ | Turing machine M rejects ⟨M, w⟩ in f(n)/log n time, where n = |⟨M, w⟩|}
The proof follows a similar high-level strategy:
1. L cannot be decided in o(f(n)/log n) time
2. L can be decided in O(f(n)) time