Turing machines and linear bounded automata

Informatics 2A: Lecture 29
John Longley
School of Informatics, University of Edinburgh
jrl@inf.ed.ac.uk
25 November, 2011

1 The Chomsky hierarchy: summary
2 Turing machines
3 Linear bounded automata
4 The Church-Turing thesis

The Chomsky hierarchy: summary

Level 3: Regular languages. Grammar rules: X → ε, X → Y, X → aY. Accepting machines: NFAs (or DFAs).
Level 2: Context-free languages. Grammar rules: X → β. Accepting machines: nondeterministic pushdown automata.
Level 1: Context-sensitive languages. Grammar rules: α → β with |α| ≤ |β|. Accepting machines: nondeterministic linear bounded automata.
Level 0: Recursively enumerable languages. Grammar rules: α → β (unrestricted). Accepting machines: Turing machines.

The length restriction in CSGs: some intuition

What's the motivation for the restriction |α| ≤ |β| in context-sensitive rules?

Idea: in a context-sensitive derivation S ⇒* s, all the sentential forms are of length at most |s|. This means that if L is context-sensitive and we're trying to decide whether s ∈ L, we only need to consider possible sentential forms of length ≤ |s|. So intuitively, we have the problem under control, at least in principle.

By contrast, without the length restriction, there's no upper limit on the length of intermediate forms that might appear in a derivation of s. So if we're searching for a derivation of s, how do we know when to stop looking? Intuitively, the problem here is wild and out of control.

We'll see this intuition made more precise as we proceed.
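To make the "under control" intuition concrete, here is a minimal sketch (my own illustration, not part of the lecture) of a brute-force membership test for a context-sensitive grammar. Because every rule α → β satisfies |α| ≤ |β|, any sentential form longer than the test string can be pruned, so the search space is finite and the procedure always terminates. The grammar encoding and the example grammar for {aⁿbⁿcⁿ : n ≥ 1} are my own choices.

```python
def csg_member(rules, start, s):
    """Decide s ∈ L(G) for a context-sensitive grammar G.

    rules: list of (alpha, beta) string pairs with len(alpha) <= len(beta).
    start: the start symbol (a one-character string).
    s: the (non-empty) string to test.
    """
    limit = len(s)          # no sentential form in a derivation of s exceeds |s|
    seen = {start}
    frontier = [start]
    while frontier:
        form = frontier.pop()
        if form == s:
            return True
        # Apply every rule at every position where its left-hand side occurs.
        for alpha, beta in rules:
            i = form.find(alpha)
            while i != -1:
                new = form[:i] + beta + form[i + len(alpha):]
                if len(new) <= limit and new not in seen:
                    seen.add(new)
                    frontier.append(new)
                i = form.find(alpha, i + 1)
    return False             # finitely many forms of length <= |s|, all explored

# Example: a standard monotone grammar for {a^n b^n c^n : n >= 1}.
rules = [("S", "abc"), ("S", "aSBc"), ("cB", "Bc"), ("bB", "bb")]
print(csg_member(rules, "S", "aabbcc"))   # True
print(csg_member(rules, "S", "aabbc"))    # False
```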

Turing machines

Recall that NFAs are essentially memoryless, whilst NPDAs are equipped with memory in the form of a stack. To find the right kinds of machines for the top two Chomsky levels, we need to allow more general manipulation of memory.

A Turing machine essentially consists of a finite-state control unit, equipped with a memory tape, infinite in both directions. Each cell on the tape contains a symbol drawn from a finite alphabet Γ.

[Diagram: a tape of cells (... a c 3 a b 5 $ ...) scanned by a finite control that can read, write, and move L/R.]

Turing machines, continued

At each step, the behaviour of the machine can depend on:
- the current state of the control unit,
- the tape symbol at the current read position.

Depending on these things, the machine may then:
- overwrite the current tape symbol with a new symbol,
- shift the tape left or right by one cell,
- jump to a new control state.

This happens repeatedly until (let's say) the control unit enters some final state.

Turing machines as acceptors

To use a Turing machine T as an acceptor for a language over Σ, assume Σ ⊆ Γ, and set up the tape with the test string s ∈ Σ* written left-to-right starting at the read position, and with blank symbols everywhere else. Then let the machine run (maybe overwriting s), and if it enters the final state, declare that the original string s is accepted.

The language accepted by T (written L(T)) consists of all strings s that are accepted in this way.

Theorem: A set L ⊆ Σ* is generated by some unrestricted (Type 0) grammar if and only if L = L(T) for some Turing machine T.

So both Type 0 grammars and Turing machines lead to the same class of recursively enumerable languages.

Turing machines, a bit more formally

A Turing machine T consists of:
- a set Q of control states
- an initial state i ∈ Q
- a final (accepting) state f ∈ Q
- a tape alphabet Γ
- an input alphabet Σ ⊆ Γ
- a blank symbol in Γ \ Σ
- a transition function δ : Q × Γ → Q × Γ × {L, R}.

Numerous minor variations are possible; the differences are unimportant.

We can then define formally what's meant by an accepting computation for a string s ∈ Σ*...
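To illustrate the definition (a minimal sketch under my own encoding choices, not anything from the lecture), here is a small simulator for this kind of deterministic Turing machine, used as an acceptor in the sense of the previous slide. The transition function δ is a Python dictionary, the tape is a dictionary indexed by integers so it is unbounded in both directions, and the example machine accepts strings over {a, b} with an even number of a's. The state names and the step limit are my own.

```python
BLANK = "_"   # the blank symbol (in the tape alphabet but not the input alphabet)

def run_tm(delta, init, final, s, max_steps=10_000):
    """Run a deterministic TM on input s; return True iff it reaches `final`.

    If no transition is defined for the current (state, symbol), halt and
    reject; give up (reject) after max_steps to keep the demo terminating."""
    tape = {i: c for i, c in enumerate(s)}   # missing cells read as BLANK
    state, pos = init, 0
    for _ in range(max_steps):
        if state == final:
            return True
        key = (state, tape.get(pos, BLANK))
        if key not in delta:
            return False                      # stuck: no applicable transition
        state, symbol, move = delta[key]
        tape[pos] = symbol                    # overwrite the scanned cell
        pos += 1 if move == "R" else -1       # move the head one cell
    return False

# Example: a two-state machine accepting strings over {a, b}
# containing an even number of a's.
delta = {
    ("even", "a"): ("odd",  "a", "R"),
    ("even", "b"): ("even", "b", "R"),
    ("odd",  "a"): ("even", "a", "R"),
    ("odd",  "b"): ("odd",  "b", "R"),
    ("even", BLANK): ("accept", BLANK, "R"),  # end of input with an even count
}
print(run_tm(delta, "even", "accept", "abab"))  # True  (two a's)
print(run_tm(delta, "even", "accept", "ab"))    # False (one a)
```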

Suppose we modify our model to allow just a finite tape, initially containing just the test string s with endmarkers on either side:

[Diagram: a bounded tape holding the string "h e y m a n" between endmarkers.]

The machine therefore has just a finite amount of memory, determined by the length of the input string. We call this a linear bounded automaton. (LBAs are sometimes defined as having tape length bounded by a constant multiple of the length of the input string; this doesn't make any difference in principle.)

Theorem: A language L ⊆ Σ* is context-sensitive if and only if L = L(T) for some non-deterministic linear bounded automaton T.

Rough idea: we can guess at a derivation for s. We can check each step, since each sentential form fits within the tape.

Determinism vs. non-determinism: a curiosity

At the bottom level of the Chomsky hierarchy, it makes no difference: every NFA can be simulated by a DFA.

At the top level, the same happens. We've defined a deterministic version of Turing machines, but any non-deterministic Turing machine can be simulated by a deterministic one.

At the context-free level, there is a difference: we need NPDAs to account for all context-free languages. (Example: Σ* \ {ss | s ∈ Σ*} is a context-free language whose complement isn't context-free; see last lecture. However, if L is accepted by a DPDA then so is its complement: we can just swap accepting and non-accepting states.)

What about the context-sensitive level? Are NLBAs strictly more powerful than DLBAs? This is still an open question! (We can't use the same argument, because it turns out that CSLs are closed under complementation; this was only shown in 1988.)

Detecting non-acceptance: LBAs versus TMs

Suppose T is an LBA. How might we detect that s is not in L(T)?

Clearly, if there's an accepting computation for s, there's one that doesn't pass through exactly the same machine configuration twice (if it did, we could shorten it). Since the tape is finite, the total number of machine configurations is finite (though large). So in theory, if T runs for long enough without reaching the final state, it will enter the same configuration twice, and we may as well abort.

Note that on this view, repeated configurations would be spotted not by T itself, but by us watching, or perhaps by some super-machine spying on T.

For Turing machines, with unlimited tape space, this reasoning doesn't work. Is there some general way of spotting that a computation isn't going to terminate? See next lecture...
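To put a number on "finite (though large)", here is a quick back-of-envelope calculation (mine, not from the slides). An LBA configuration is determined by the control state, the head position and the entire tape contents, so with |Q| states, tape alphabet Γ and tape length n there are at most |Q| · n · |Γ|^n distinct configurations; a watching super-machine can therefore abort any run that exceeds this many steps.

```python
def config_bound(num_states, tape_alphabet_size, n):
    """Upper bound on distinct LBA configurations:
    (control state) x (head position) x (tape contents)."""
    return num_states * n * tape_alphabet_size ** n

print(config_bound(10, 4, 8))    # 5,242,880
print(config_bound(10, 4, 20))   # about 2.2e14: finite, but already huge
```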

Wider significance of Turing machines

Turing machines are important because (it's generally believed that) anything that can be done by any mechanical procedure or algorithm can in principle be done by a Turing machine (the Church-Turing thesis). E.g.:

- Any language L ⊆ Σ* that can be recognized by some mechanical procedure can be recognized by a TM.
- Any mathematical function f : N → N that can be computed by a mechanical procedure can be computed by a TM (e.g. representing integers in binary, and requiring the TM to write the result onto the tape).
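As a concrete illustration of the second point (my own sketch, not part of the lecture), the same machine model can compute a function by leaving its result on the tape: the machine below adds 1 to a binary number written most-significant-bit first. The state names and the small helper that runs the machine and reads the tape back are assumptions of the example.

```python
BLANK = "_"

# Transition table for binary increment: scan right to the end of the input,
# then move left turning 1s into 0s until a 0 (or the blank to the left of
# the number) absorbs the carry.
increment = {
    ("right", "0"): ("right", "0", "R"),
    ("right", "1"): ("right", "1", "R"),
    ("right", BLANK): ("carry", BLANK, "L"),   # found the right end; start carrying
    ("carry", "1"): ("carry", "0", "L"),       # 1 + carry -> 0, keep carrying
    ("carry", "0"): ("done",  "1", "R"),       # 0 + carry -> 1, stop
    ("carry", BLANK): ("done", "1", "R"),      # ran off the left end: new leading 1
}

def run_tm_function(delta, init, final, s):
    """Run a deterministic TM until it reaches `final` (assumed to happen),
    then return the non-blank tape contents left to right."""
    tape = {i: c for i, c in enumerate(s)}
    state, pos = init, 0
    while state != final:
        state, symbol, move = delta[(state, tape.get(pos, BLANK))]
        tape[pos] = symbol
        pos += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape) if tape[i] != BLANK)

print(run_tm_function(increment, "right", "done", "1011"))  # 1100 (11 + 1 = 12)
print(run_tm_function(increment, "right", "done", "111"))   # 1000 (7 + 1 = 8)
```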

Status of the Church-Turing Thesis

The CT Thesis is a somewhat informal statement, insofar as the general notion of a "mechanical procedure" isn't formally defined (although we have a pretty good idea of what we mean by it). Although a certain amount of philosophical hair-splitting is possible, the broad idea behind CTT is generally accepted. At any rate, anything that can be done on any present-day computer (even disregarding time/memory limitations) can in principle be done on a TM.

So if we buy into CTT, theorems about what TMs can/can't do can be interpreted as fundamental statements about what can/can't be accomplished by mechanical computation in general. We'll see some examples of such theorems next time.