Turing machines and linear bounded automata


Turing machines and linear bounded automata. Informatics 2A: Lecture 29. John Longley, School of Informatics, University of Edinburgh (jrl@inf.ed.ac.uk). 27 November 2015. 1 / 15

The Chomsky hierarchy: summary

Level | Language type            | Grammars                                  | Accepting machines
3     | Regular                  | X → ε, X → Y, X → aY (regular)            | NFAs (or DFAs)
2     | Context-free             | X → β (context-free)                      | NPDAs
1     | Context-sensitive        | α → β with |α| ≤ |β| (noncontracting)     | Nondet. linear bounded automata
0     | Recursively enumerable   | α → β (unrestricted)                      | Turing machines

The material in red will be introduced today. 2 / 15

The length restriction in noncontracting grammars. What's the effect of the restriction |α| ≤ |β| in noncontracting grammar rules? Idea: in a noncontracting derivation S ⇒* s of a nonempty string s, all the sentential forms are of length at most |s|. This means that if L is context-sensitive, and we're trying to decide whether s ∈ L, we only need to consider possible sentential forms of length at most |s|. So intuitively, we have the problem under control, at least in principle. By contrast, without the length restriction, there's no upper limit on the length of intermediate forms that might appear in a derivation of s. So if we're searching for a derivation for s, how do we know when to stop looking? Intuitively, the problem here is wild and out of control. (This will be made more precise next lecture.) 3 / 15
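To make the "under control" intuition concrete, here is a minimal Python sketch of a brute-force membership test for a noncontracting grammar: starting from the start symbol, it explores every sentential form of length at most |s|, a finite search space precisely because rules never shrink a form. The encoding of the grammar as (alpha, beta) string pairs and the example grammar for {aⁿbⁿcⁿ | n ≥ 1} are purely illustrative choices, not part of the lecture.

```python
from collections import deque

def member(s, rules, start="S"):
    """Decide whether s is generated by a noncontracting grammar.

    rules: list of (alpha, beta) pairs with len(alpha) <= len(beta).
    Because rules never shrink a form, every sentential form in a
    derivation of s has length <= len(s), so the search is finite.
    """
    limit = len(s)
    seen = {start}
    queue = deque([start])
    while queue:
        form = queue.popleft()
        if form == s:
            return True
        for alpha, beta in rules:
            i = form.find(alpha)
            while i != -1:
                new = form[:i] + beta + form[i + len(alpha):]
                if len(new) <= limit and new not in seen:
                    seen.add(new)
                    queue.append(new)
                i = form.find(alpha, i + 1)
    return False

# Illustrative noncontracting grammar for { a^n b^n c^n : n >= 1 }:
#   S -> aSBc | abc,   cB -> Bc,   bB -> bb
rules = [("S", "aSBc"), ("S", "abc"), ("cB", "Bc"), ("bB", "bb")]
print(member("aabbcc", rules))  # True
print(member("aabbc", rules))   # False
```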

Alan Turing (1912–1954). 4 / 15

Turing machines. Recall that NFAs are essentially memoryless, whilst NPDAs are equipped with memory in the form of a stack. To find the right kinds of machines for the top two Chomsky levels, we need to allow more general manipulation of memory. A Turing machine consists of a finite-state control unit, equipped with a memory tape, infinite in both directions. Each cell on the tape contains a symbol drawn from a finite alphabet Γ. [Diagram: a two-way infinite tape of cells holding symbols such as a, c, 3, a, b, 5, $, attached to a finite control that can read, write, and move L/R.] 5 / 15

Turing machines, continued. [Diagram as on the previous slide.] At each step, the behaviour of the machine can depend on the current state of the control unit and the tape symbol at the current read position. Depending on these things, the machine may then overwrite the current tape symbol with a new symbol, shift the tape left or right by one cell, and jump to a new control state. This happens repeatedly until (if ever) the control unit enters some identified final state. 6 / 15

Turing machines, formally. A Turing machine T consists of: a set Q of control states; an initial state i ∈ Q; a final (accepting) state f ∈ Q; a tape alphabet Γ; an input alphabet Σ ⊆ Γ; a blank symbol in Γ \ Σ; a transition function δ : Q × Γ → Q × Γ × {L, R}. A nondeterministic Turing machine replaces the transition function δ with a transition relation ⊆ (Q × Γ) × (Q × Γ × {L, R}). (Numerous variant definitions of Turing machine are possible. All lead to notions of TM of equivalent power.) 7 / 15
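As a concrete illustration of this definition (a minimal sketch, not something from the lecture), a deterministic TM can be simulated directly: the transition function δ becomes a dictionary mapping (state, symbol) to (state, symbol, move), and the machine accepts as soon as the final state f is reached. The function name run_tm and the sparse-dictionary tape are illustrative choices.

```python
def run_tm(delta, initial, final, blank, tape_input, max_steps=10_000):
    """Simulate a deterministic Turing machine.

    delta: dict mapping (state, symbol) -> (new_state, new_symbol, move),
           with move in {"L", "R"}.
    Returns True if the final (accepting) state is reached, False if no
    transition applies (halt and reject), and None if the step budget is
    exhausted (a crude guard against non-termination).
    """
    tape = dict(enumerate(tape_input))    # sparse tape, infinite in both directions
    state, head = initial, 0
    for _ in range(max_steps):
        if state == final:
            return True
        symbol = tape.get(head, blank)
        if (state, symbol) not in delta:
            return False                  # no move defined: halt and reject
        state, new_symbol, move = delta[(state, symbol)]
        tape[head] = new_symbol
        head += 1 if move == "R" else -1
    return None                           # undecided within the step budget
```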

Turing machines as acceptors. To use a Turing machine T as an acceptor for a language over Σ, assume Σ ⊆ Γ, and set up the tape with the test string s ∈ Σ* written left-to-right starting at the read position, and with blank symbols everywhere else. Then let the machine run (maybe overwriting s), and if it enters the final state, declare that the original string s is accepted. The language accepted by T (written L(T)) consists of all strings s that are accepted in this way. Theorem: A set L ⊆ Σ* is generated by some unrestricted (Type 0) grammar if and only if L = L(T) for some Turing machine T. So both Type 0 grammars and Turing machines lead to the same class of recursively enumerable languages. 8 / 15
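For example, reusing the run_tm sketch above, the machine below (an illustrative construction, not one from the slides) accepts {0ⁿ1ⁿ | n ≥ 1} by repeatedly marking a 0 with X and a matching 1 with Y, and accepting once only marked symbols remain:

```python
# delta for a TM accepting { 0^n 1^n : n >= 1 } over input alphabet {0, 1}.
# X marks a matched 0, Y marks a matched 1, "_" is the blank symbol.
delta = {
    ("q0", "0"): ("q1", "X", "R"),   # mark a 0, go looking for a matching 1
    ("q0", "Y"): ("q3", "Y", "R"),   # all 0s marked: check only Ys remain
    ("q1", "0"): ("q1", "0", "R"),
    ("q1", "Y"): ("q1", "Y", "R"),
    ("q1", "1"): ("q2", "Y", "L"),   # mark the 1, head back left
    ("q2", "0"): ("q2", "0", "L"),
    ("q2", "Y"): ("q2", "Y", "L"),
    ("q2", "X"): ("q0", "X", "R"),   # back at the last marked 0
    ("q3", "Y"): ("q3", "Y", "R"),
    ("q3", "_"): ("qf", "_", "R"),   # nothing left over: accept
}

print(run_tm(delta, "q0", "qf", "_", "0011"))   # True
print(run_tm(delta, "q0", "qf", "_", "0010"))   # False
```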

Questions. Q1. Which is the most powerful form of language acceptor (i.e., accepts the widest class of languages)? Q2. Which is the least powerful form of language acceptor (i.e., accepts the narrowest class of languages)? 1. DFAs 2. NPDAs 3. Turing machines 4. Your laptop 9 / 15

Linear bounded automata. Suppose we modify our model to allow just a finite tape, initially containing just the test string s with endmarkers on either side: [Diagram: a finite tape containing h e y m a n between endmarkers.] The machine therefore has just a finite amount of memory, determined by the length of the input string. We call this a linear bounded automaton. (LBAs are sometimes defined as having tape length bounded by a constant multiple of the length of the input string. In fact, this doesn't make any difference.) Theorem: A language L ⊆ Σ* is context-sensitive if and only if L = L(T) for some non-deterministic linear bounded automaton T. Rough idea: we can guess at a derivation for s. We can check each step, since each sentential form fits within the tape. 10 / 15
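Continuing the illustrative sketches above, an LBA can be simulated like a TM whose tape is exactly the input between two endmarkers, rejecting any attempt to move off it. This is only a rough model of the definition (for instance, it does not stop the endmarkers being overwritten); the names run_lba, "[" and "]" are assumptions for the sketch.

```python
def run_lba(delta, initial, final, tape_input, max_steps=100_000):
    """Simulate a (deterministic) linear bounded automaton.

    The tape is exactly the input surrounded by endmarkers "[" and "]";
    the head may never leave it, so the machine has only a finite,
    input-length-bounded amount of memory.
    """
    tape = ["["] + list(tape_input) + ["]"]
    state, head = initial, 1                 # head starts on the first input symbol
    for _ in range(max_steps):
        if state == final:
            return True
        key = (state, tape[head])
        if key not in delta:
            return False                     # no move defined: halt and reject
        state, new_symbol, move = delta[key]
        tape[head] = new_symbol
        head += 1 if move == "R" else -1
        if head < 0 or head >= len(tape):
            return False                     # tried to leave the bounded tape
    return None                              # undecided within the step budget
```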

Example: a non-context-sensitive language. Sentences of first-order predicate logic can be regarded as strings over a certain alphabet, with symbols like ∀, ∃, x, =, (, ), etc. Let P be the language consisting of all such sentences that are provable using the rules of FOPL. E.g. the sentence (∀x. A(x)) ∧ (∀x. B(x)) → (∀x. A(x) ∧ B(x)) is in P. But the sentence (∃x. A(x)) ∧ (∃x. B(x)) → (∃x. A(x) ∧ B(x)) is not in P, even though it's syntactically well-formed. Theorem: P is recursively enumerable, but not context-sensitive. Intuition: to show s ∈ P, we'd in effect have to construct a proof of s. But in general, a proof of s might involve formulae much, much longer than s itself, which wouldn't fit on the LBA tape. Put another way, mathematics itself is wild and out of control! 11 / 15

Determinism vs. non-determinism: a curiosity. At the bottom level of the Chomsky hierarchy, it makes no difference: every NFA can be simulated by a DFA. At the top level, the same happens: any nondeterministic Turing machine can be simulated by a deterministic one. At the context-free level, there is a difference: we need NPDAs to account for all context-free languages. (Example: Σ* \ {ss | s ∈ Σ*} is a context-free language whose complement isn't context-free, see last lecture. However, if L is accepted by a DPDA then so is its complement.) What about the context-sensitive level? Are NLBAs strictly more powerful than DLBAs? Asked in 1964, and still open!! (We can't use the context-free argument, because CSLs are closed under complementation, as shown in 1988.) 12 / 15

Detecting non-acceptance: LBAs versus TMs. Suppose T is an LBA. How might we detect that s is not in L(T)? Clearly, if there's an accepting computation for s, there's one that doesn't pass through exactly the same machine configuration twice (if it did, we could shorten it). Since the tape is finite, the total number of machine configurations is finite (though large). So in theory, if T runs for long enough without reaching the final state, it will enter the same configuration twice, and we may as well abort. Note that on this view, repeated configurations would be spotted not by T itself, but by us watching, or perhaps by some super-machine spying on T. For TMs, with unlimited tape space, this reasoning doesn't work. Is there some general way of spotting that a computation isn't going to terminate?? See next lecture... 13 / 15
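This counting argument can be phrased as a (wildly impractical) decision procedure. For a tape of n cells, a configuration is a triple (state, head position, tape contents), so there are at most |Q| · n · |Γ|ⁿ of them; any run longer than that must repeat a configuration and hence will never accept. A sketch reusing run_lba from above, with the function name and parameters again being illustrative assumptions:

```python
def lba_accepts(delta, initial, final, states, gamma, tape_input):
    """Decide acceptance for a deterministic LBA by bounding the run length.

    On a tape of n cells there are at most |Q| * n * |Gamma|**n distinct
    configurations (state, head position, tape contents), so a run longer
    than that repeats a configuration and can be aborted as non-accepting.
    """
    n = len(tape_input) + 2                      # input plus the two endmarkers
    bound = len(states) * n * len(gamma) ** n    # astronomically large, but finite
    return run_lba(delta, initial, final, tape_input, max_steps=bound) is True
```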

Wider significance of Turing machines. Turing machines are important because (it's generally believed that) any symbolic computation that can be done by any mechanical procedure or algorithm can in principle be done by a Turing machine. This is called the Church-Turing Thesis. E.g.: Any language L ⊆ Σ* that can be recognized by some mechanical procedure can be recognized by a TM. Any mathematical function f : N → N that can be computed by a mechanical procedure can be computed by a TM (e.g. representing integers in binary, and requiring the TM to write the result onto the tape). 14 / 15

Status of the Church-Turing Thesis. The CT Thesis is a somewhat informal statement, insofar as the general notion of a mechanical procedure isn't formally defined (although we have a pretty good idea of what we mean by it). Although a certain amount of philosophical hair-splitting is possible, the broad idea behind CTT is generally accepted. At any rate, anything that can be done on any present-day computer (even disregarding time/memory limitations) can in principle be done on a TM. So if we buy into CTT, theorems about what TMs can/can't do can be interpreted as fundamental statements about what can/can't be accomplished by mechanical computation in general. We'll see some examples of such theorems next time. 15 / 15