Turing machines and linear bounded automata


Turing machines and linear bounded automata
Informatics 2A: Lecture 30
John Longley
School of Informatics, University of Edinburgh
jrl@inf.ed.ac.uk
25 November 2016

The Chomsky hierarchy: summary

Level  Language type           Grammar rules                Accepting machines
3      Regular                 X → ε, X → Y, X → aY         NFAs (or DFAs)
2      Context-free            X → β                        NPDAs
1      Context-sensitive       α → β with |α| ≤ |β|         Nondeterministic linear
       (noncontracting)                                     bounded automata
0      Recursively             α → β (unrestricted)         Turing machines
       enumerable

The material in red will be introduced today.

The length restriction in noncontracting grammars

What's the effect of the restriction |α| ≤ |β| on noncontracting grammar rules?

Idea: in a noncontracting derivation S ⇒* s of a nonempty string s, all the sentential forms have length at most |s|.

This means that if L is context-sensitive and we're trying to decide whether s ∈ L, we only need to consider possible sentential forms of length at most |s|. So intuitively, we have the problem under control, at least in principle.

By contrast, without the length restriction, there's no upper limit on the length of intermediate forms that might appear in a derivation of s. So if we're searching for a derivation of s, how do we know when to stop looking? Intuitively, the problem here is wild and out of control. (This will be made more precise next lecture.)
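The length bound makes membership testing a finite search, which can be sketched directly: a breadth-first search over sentential forms, pruning anything longer than the target string. The grammar below (a standard noncontracting grammar for {a^n b^n c^n : n ≥ 1}) and the function name are my own illustration, not from the lecture.

```python
# Brute-force membership test for a noncontracting (context-sensitive)
# grammar, exploiting the length bound: in a noncontracting derivation of s,
# every sentential form has length at most |s|, so the search space is finite.
from collections import deque

# A standard noncontracting grammar generating { a^n b^n c^n : n >= 1 }.
RULES = [("S", "abc"), ("S", "aSBc"), ("cB", "Bc"), ("bB", "bb")]

def derives(s, rules=RULES, start="S"):
    """Return True iff the grammar derives the (terminal) string s."""
    seen, queue = {start}, deque([start])
    while queue:
        form = queue.popleft()
        if form == s:
            return True
        for lhs, rhs in rules:
            i = form.find(lhs)
            while i != -1:  # apply the rule at every occurrence of lhs
                new = form[:i] + rhs + form[i + len(lhs):]
                # Prune: noncontracting rules never shrink a form, so any
                # form longer than s can never lead back down to s.
                if len(new) <= len(s) and new not in seen:
                    seen.add(new)
                    queue.append(new)
                i = form.find(lhs, i + 1)
    return False
```

Without the pruning step there would be no way to know when to stop, which is exactly the "wild and out of control" situation for unrestricted grammars.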

Alan Turing (1912–1954)

Turing machines

Recall that NFAs are essentially memoryless, whilst NPDAs are equipped with memory in the form of a stack. To find the right kinds of machines for the top two Chomsky levels, we need to allow more general manipulation of memory.

A Turing machine consists of a finite-state control unit, equipped with a memory tape, infinite in both directions. Each cell on the tape contains a symbol drawn from a finite alphabet Γ.

[Diagram: a tape of cells (... a c 3 a b 5 $ ...) with a read/write head linking the finite control to the tape; the head can read, write, and move left or right.]

Turing machines, continued

At each step, the behaviour of the machine can depend on:
- the current state of the control unit,
- the tape symbol at the current read position.

Depending on these things, the machine may then:
- overwrite the current tape symbol with a new symbol,
- shift the tape left or right by one cell,
- jump to a new control state.

This happens repeatedly until (if ever) the control unit enters some identified final state.

Turing machines, formally

A Turing machine T consists of:
- A set Q of control states
- An initial state i ∈ Q
- A final (accepting) state f ∈ Q
- A finite tape alphabet Γ
- An input alphabet Σ ⊆ Γ
- A blank symbol ⊔ ∈ Γ − Σ
- A transition function δ : Q × Γ → Q × Γ × {L, R}.

A nondeterministic Turing machine replaces the transition function δ with a transition relation Δ ⊆ (Q × Γ) × (Q × Γ × {L, R}).

(Numerous variant definitions of Turing machine are possible. All lead to notions of TM of equivalent power.)

Turing machines as acceptors

To use a Turing machine T as an acceptor for a language over Σ, assume Σ ⊆ Γ, and set up the tape with the test string s ∈ Σ* written left-to-right starting at the read position, and with blank symbols everywhere else. Then let the machine run (maybe overwriting s), and if it enters the final state, declare that the original string s is accepted. The language accepted by T (written L(T)) consists of all strings s that are accepted in this way.

Theorem: A set L ⊆ Σ* is generated by some unrestricted (Type 0) grammar if and only if L = L(T) for some Turing machine T.

So both Type 0 grammars and Turing machines lead to the same class of recursively enumerable languages.
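The acceptor model can be sketched as a short simulator. The dictionary-based tape and the step budget are implementation choices of mine, not part of the formal definition; the example machine (accepting strings of a's of even length) is likewise illustrative.

```python
# Minimal deterministic Turing machine simulator, following the definition
# above: states, an initial and final state, a tape alphabet with a blank,
# and a transition function delta : (state, symbol) -> (state, symbol, move).
BLANK = "_"

def run_tm(delta, start, final, s, max_steps=10_000):
    """Run a deterministic TM on input s; True iff it reaches `final`."""
    # The tape is a dict from integer positions to symbols; absent positions
    # read as blank, so the tape is unbounded in both directions.
    tape = {pos: ch for pos, ch in enumerate(s)}
    state, head = start, 0
    for _ in range(max_steps):          # step budget: guard against looping
        if state == final:
            return True                 # machine entered the final state
        key = (state, tape.get(head, BLANK))
        if key not in delta:
            return False                # no applicable transition: halt, reject
        state, symbol, move = delta[key]
        tape[head] = symbol             # overwrite the current cell
        head += 1 if move == "R" else -1
    return False                        # budget exhausted without accepting

# Example machine: accept strings of a's of even length.
delta = {
    ("i", "a"):   ("q1", "a", "R"),
    ("q1", "a"):  ("i", "a", "R"),
    ("i", BLANK): ("f", BLANK, "R"),
}
```

For instance, `run_tm(delta, "i", "f", "aa")` accepts, while `"aaa"` halts in state q1 on a blank with no transition and is rejected.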

Questions

Q1. Which is the most powerful form of language acceptor (i.e., accepts the widest class of languages)?

Q2. Which is the least powerful form of language acceptor (i.e., accepts the narrowest class of languages)?

1. DFAs
2. NPDAs
3. Turing machines
4. Your laptop

Linear bounded automata

Suppose we modify our model to allow just a finite tape, initially containing the test string s with endmarkers on either side:

[ h e y m a n ]

The machine therefore has just a finite amount of memory, determined by the length of the input string. We call this a linear bounded automaton.

(LBAs are sometimes defined as having tape length bounded by a constant multiple of the length of the input. This makes no essential difference.)

Example: an LBA for {a^n b^n c^n | n ≥ 1}

The tape alphabet of an LBA, though finite, might be considerably bigger than the input alphabet. So we can store more information on the tape than just an input string or related sentential form.

E.g. let Γ = {a, b, c, a′, b′, c′} and Σ = {a, b, c}. Occurrences of the marked letters (shown in green on the original slides) can serve as markers for positions of interest.

To test whether an input string has the form a^n b^n c^n:
1. Scan the string to ensure it has the form a^k b^m c^n for k, m, n ≥ 1. Along the way, mark the leftmost a, b and c: e.g. aaabbbccc becomes a′aab′bbc′cc.
2. Scan the string again to see whether it's the rightmost a, b and c that are marked. If yes for all three, ACCEPT. If yes for some but not all of a, b, c, REJECT.
3. Otherwise, scan the string again, moving each marker one position to the right: e.g. a′aab′bbc′cc becomes aa′abb′bcc′c. Then go to 2.

All this can be done by a (deterministic) LBA.
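The three-step marker routine can be mirrored directly in code, treating the string itself as the working tape. Here upper-case letters stand in for the marked letters, and the function name and shape-check are my own illustration of the algorithm, not an LBA per se.

```python
# A direct sketch of the marker-sweeping LBA algorithm for a^n b^n c^n.
# Upper-case letters play the role of the marked ("green") letters, so the
# tape alphabet is {a, b, c, A, B, C} while the input alphabet is {a, b, c}.
import re

def accepts_anbncn(s):
    """Decide s in { a^n b^n c^n : n >= 1 } by repeatedly moving markers."""
    # Step 1: check the overall shape a^k b^m c^n with k, m, n >= 1 ...
    if not re.fullmatch(r"a+b+c+", s):
        return False
    tape = list(s)
    # ... and mark the leftmost a, b and c (aaabbbccc -> AaaBbbCcc).
    for ch in "abc":
        tape[tape.index(ch)] = ch.upper()
    while True:
        # Step 2: is it the rightmost a, b and c that carry the marks?
        rightmost = [tape[max(i for i, t in enumerate(tape) if t.lower() == ch)]
                     for ch in "abc"]
        marked = [t.isupper() for t in rightmost]
        if all(marked):
            return True      # all three groups exhausted together: equal counts
        if any(marked):
            return False     # one group ran out early: unequal counts
        # Step 3: move each marker one position right, then repeat step 2.
        for ch in "ABC":
            i = tape.index(ch)
            tape[i] = ch.lower()
            tape[i + 1] = tape[i + 1].upper()
```

The string never grows, so the whole computation stays within the space of the input, which is the defining constraint on an LBA.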

Recommended exercise

In a similar spirit, outline how the language {ss | s ∈ {a, b}*} could be recognized by a deterministic LBA. Solution to be posted on the discussion forum next week.

LBAs and context-sensitive languages

Theorem: A language L ⊆ Σ* is context-sensitive if and only if L = L(T) for some nondeterministic linear bounded automaton T.

Rough idea: we can guess at a derivation for s, and we can check each step, since each sentential form fits onto the tape. To implement this by an LBA, expand the tape alphabet so that the tape can simultaneously store:
- the input string,
- two successive sentential forms in a derivation,
- a few markers used to check the validity of the derivation step.

The context-sensitive grammar rules themselves will be hardwired into the design of the LBA.

A non-context-sensitive language

Sentences of first-order predicate logic can be regarded as strings over a certain alphabet, with symbols like ∧, ¬, ∀, ∃, x, =, (, ), etc. Let P be the language consisting of all such sentences that are provable using the rules of FOPL.

E.g. the sentence (∀x. A(x)) ∧ (∀x. B(x)) → (∀x. A(x) ∧ B(x)) is in P. But the sentence (∃x. A(x)) ∧ (∃x. B(x)) → (∃x. A(x) ∧ B(x)) is not in P, even though it's syntactically well-formed.

Theorem: P is recursively enumerable, but not context-sensitive.

Intuition: to show s ∈ P, we'd in effect have to construct a proof of s. But in general, a proof of s might involve formulae much, much longer than s itself, which wouldn't fit on the LBA tape. Put another way, mathematics itself is wild and out of control!

Determinism vs. nondeterminism: a curiosity

At the bottom level of the Chomsky hierarchy, it makes no difference: every NFA can be simulated by a DFA. At the top level, the same happens: any nondeterministic Turing machine can be simulated by a deterministic one.

At the context-free level, there is a difference: we need NPDAs to account for all context-free languages. (Reason: context-free languages aren't closed under complementation, see last lecture. However, if L is accepted by a DPDA then so is its complement.)

What about the context-sensitive level? Are NLBAs strictly more powerful than DLBAs? Asked in 1964, and still open! We can't use the above argument here, because CSLs are closed under complementation (shown in 1988).

Detecting non-acceptance: LBAs versus TMs

Suppose T is an LBA. How might we detect that s is not in L(T)?

Clearly, if there's an accepting computation for s, there's one that doesn't pass through exactly the same machine configuration twice (if it did, we could shorten it). Since the tape is finite, the total number of machine configurations is finite (though large). So in theory, if T runs for long enough without reaching the final state, it will enter the same configuration twice, and we may as well abort.

Note that on this view, repeated configurations would be spotted not by T itself, but by us watching, or perhaps by some super-machine spying on T.

For Turing machines, with unlimited tape space, this reasoning doesn't work. Is there some general way of spotting that a computation isn't going to terminate? See next lecture...
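The "finite (though large)" count can be made concrete. A configuration is determined by the control state, the head position and the tape contents, giving at most |Q| · n · |Γ|^n configurations on a tape of n cells (ignoring the endmarkers). The helper below just evaluates that bound; the particular numbers are illustrative.

```python
# The pigeonhole bound behind the abort argument: an LBA configuration is
# (state, head position, tape contents), so with tape length n there are at
# most |Q| * n * |Gamma|**n distinct configurations. Any run longer than
# that must repeat a configuration and can therefore be cut off.

def max_configs(num_states, tape_alphabet_size, n):
    """Upper bound on distinct LBA configurations for tape length n."""
    return num_states * n * tape_alphabet_size ** n

# Even a tiny machine has astronomically many configurations:
bound = max_configs(num_states=10, tape_alphabet_size=4, n=20)
```

This is why the abort strategy works "in theory" but is hopeless in practice, and why for TMs, whose tape is unbounded, no such finite bound exists at all.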

Wider significance of Turing machines

Turing machines are important because (it's generally believed that) any symbolic computation that can be done by any mechanical procedure or algorithm can in principle be done by a Turing machine. This is called the Church-Turing Thesis.

E.g.:
- Any language L ⊆ Σ* that can be recognized by some mechanical procedure can be recognized by a TM.
- Any mathematical function f : ℕ → ℕ that can be computed by a mechanical procedure can be computed by a TM (e.g. representing integers in binary, and requiring the TM to write the result onto the tape).

Status of the Church-Turing Thesis

The Church-Turing Thesis is a somewhat informal statement, insofar as the general notion of a mechanical procedure isn't formally defined (although we have a pretty good idea of what we mean by it). Although a certain amount of philosophical hair-splitting is possible, the broad idea behind CTT is generally accepted. At any rate, anything that can be done on any present-day computer (even disregarding time/memory limitations) can in principle be done on a TM.

So if we buy into CTT, theorems about what TMs can and can't do can be interpreted as fundamental statements about what can and can't be accomplished by mechanical computation in general. We'll see some examples of such theorems next time.