
UG3 Computability and Intractability (2009-2010): Note 4

4. Bells and whistles. In defining a formal model of computation we inevitably make a number of essentially arbitrary design decisions. These decisions should not affect the class of languages that can be recognized within the model. One way of gaining confidence in a particular model is to demonstrate that extending the model in various ways does not add to the class of languages that can be recognized. We consider here four simple extensions to the Turing machine of Note 3, and show that none of these extensions adds extra power to the model in this sense. In carrying out this programme, we encounter the important technique of simulation: to show that two models have equivalent power, we demonstrate that the ostensibly more restricted model may simulate any computation within the more general model.

4.1. Doubly infinite tape. A Turing machine M with doubly infinite tape is formally identical in its specification to the singly infinite version defined in Note 3. However, a configuration of M is now of the form αqβ, where α is a sequence of tape symbols which is infinite to the left, and β is a sequence of tape symbols which is infinite to the right. (Formally, the elements of α are indexed by the negative integers ..., −3, −2, −1.) The one-step relation can be defined by analogy with the corresponding relation in Note 3. Observe that it is now impossible for the head to drop off the left-hand end of the tape, as the tape no longer has a left-hand end. All but a finite number of symbols in α and β are blanks, so, just as before, we can make the configurations of M finite by stripping away leading and trailing blanks. The initial configuration of M and the criterion for acceptance are unchanged from Note 3.

The generalization to a doubly infinite tape adds no extra power to our model of computation.

Lemma 4.1 A language L is accepted by a Turing machine with doubly infinite tape if and only if L is accepted by a Turing machine with singly infinite tape.

Proof. The "if" part is easy. Suppose L is accepted by a Turing machine M with singly infinite tape. Then L is also accepted by the doubly infinite machine M′ that first writes a special symbol onto the tape just to the left of the input word and then behaves exactly as M. If the head of M′ ever re-encounters the special symbol, then M′ immediately halts and rejects.

Now we tackle the "only if" part. Suppose L is accepted by the Turing machine M = (Q, Γ, Σ, b, q_I, q_F, δ) with doubly infinite tape. We show how to construct a machine M′ = (Q′, Γ′, Σ′, b′, q′_I, q′_F, δ′) with singly infinite tape which accepts L. Imagine that the squares of the doubly infinite tape of M are indexed by the integers, with the input word starting at tape square 0. Fold the tape about the middle of square 0 so that squares i and −i, for all i, are brought into alignment.
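To make the fold concrete, here is a minimal sketch (not part of the notes) of the index correspondence it induces; the names Track and foldIndex are illustrative only.

```haskell
-- Sketch of the fold: square i of M's doubly infinite tape is sent to a
-- track and a non-negative position on the folded, singly infinite tape.
data Track = Lower | Upper deriving (Show, Eq)

foldIndex :: Integer -> (Track, Integer)
foldIndex i
  | i >= 0    = (Lower, i)         -- squares 0, 1, 2, ... end up on the lower track
  | otherwise = (Upper, negate i)  -- squares -1, -2, ... end up on the upper track
```

Under this map a right move of M at a negative index becomes a left move on the folded tape; this direction reversal reappears as rule (5) in the formal construction below.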

[Figure 3: Simulation of a Turing machine with doubly infinite tape.]

In essence the machine M′ works by simulating the operation of M using the (singly infinite) folded tape. Refining this idea, we can think of the tape of M′ as having two tracks, the upper track representing the tape squares of M with negative index, and the lower track representing those with non-negative index. The first tape square on the upper track contains a special symbol $ which is not in the tape alphabet Γ of M. Figure 3 is intended to suggest the correspondence between the tapes of M and M′.

It should seem plausible that M′ can be constructed so as to simulate the computation of M, move for move. Informally, the technique is the following. When the tape head of M is over a square with non-negative index i, the head of M′ is over square i of its tape, reading from the lower track. When the tape head of M is over a square with negative index −i, the head of M′ is also over square i, but this time reading from the upper track; in this mode, M′ always moves its head in the opposite direction to the head of M.

On this occasion only, we shall flesh out the informal argument by providing a formal description of the simulating machine M′. By doing this we shall conclusively demonstrate that a Turing machine with doubly infinite tape can be converted, in an entirely mechanical way, into an equivalent machine with only a singly infinite tape. The states, tape alphabet and input alphabet of M′ are as follows:

Q′ = {q′_I, q′_F} ∪ (Q × {0, 1}),
Γ′ = Γ × (Γ ∪ {$}),
Σ′ = Σ × {b} ⊆ Γ′.

Aside from the initial and final states (which are special), the states of M′ are pairs of the form ⟨q, 0⟩ or ⟨q, 1⟩, where q is a state of the simulated machine M. The interpretation of state ⟨q, 0⟩ (respectively, state ⟨q, 1⟩) is that the simulated machine is in state q and is scanning a tape square with non-negative (respectively, negative) index; i.e., the simulating machine M′ is reading from the lower (respectively, upper) track of its tape.

The tape symbols of M′ are pairs of the form ⟨s₀, s₁⟩, where s₀ ∈ Γ is to be interpreted as the symbol on the lower track of M′'s tape, and s₁ ∈ Γ ∪ {$} as the symbol on the upper track. The input symbols are the elements of Γ′ in which the second component of the pair is the blank symbol; note that there is an obvious correspondence between elements of Σ and Σ′. [Footnote 14: Strictly speaking, the machine M′ does not accept the same language as M but rather the language {⟨x, b⟩ : x ∈ L}, in which each input symbol is paired with a blank. However, we could extend the alphabet of M′ by throwing in the symbols of Σ and adding a pre-processing phase to M′ that checks that the input string is from Σ* and converts it to the paired form required; after that the machine behaves as described in the proof.] Naturally enough, the blank symbol of M′ is the pair b′ = ⟨b, b⟩.

Finally, the transition function δ′ of M′ is defined by rules (1)-(5) below. The ordering of the rules is significant: earlier rules take precedence over later ones. (There is a parallel here with function definition by patterns in the programming language Haskell.)

(1) Transitions from the initial state:

δ′(q′_I, ⟨s, b⟩) = (⟨q, 0⟩, ⟨s′, $⟩, R), if δ(q_I, s) = (q, s′, R);
δ′(q′_I, ⟨s, b⟩) = (⟨q, 1⟩, ⟨s′, $⟩, R), if δ(q_I, s) = (q, s′, L).

[The special symbol $ is written to mark the end of the tape while, simultaneously, the first move of M is simulated.]

(2) Transitions to the final state:

δ′(⟨q_F, i⟩, ⟨s₀, s₁⟩) = (q′_F, ⟨s₀, s₁⟩, R), for i ∈ {0, 1}.

[If M enters its accepting state, then M′ enters its accepting state on the following move.]

(3) Transitions when M is scanning square 0:

δ′(⟨q, i⟩, ⟨s, $⟩) = (⟨q′, 0⟩, ⟨s′, $⟩, R), if δ(q, s) = (q′, s′, R);
δ′(⟨q, i⟩, ⟨s, $⟩) = (⟨q′, 1⟩, ⟨s′, $⟩, R), if δ(q, s) = (q′, s′, L).

[Here i may be 0 or 1. If the head of M moves right the simulation is continued on the lower track, otherwise on the upper track; in either case the head of M′ moves right.]

(4) Transitions when M is scanning a square with positive index:

δ′(⟨q, 0⟩, ⟨s₀, s₁⟩) = (⟨q′, 0⟩, ⟨s′₀, s₁⟩, L), if δ(q, s₀) = (q′, s′₀, L);
δ′(⟨q, 0⟩, ⟨s₀, s₁⟩) = (⟨q′, 0⟩, ⟨s′₀, s₁⟩, R), if δ(q, s₀) = (q′, s′₀, R).

(5) Transitions when M is scanning a square with negative index:

δ′(⟨q, 1⟩, ⟨s₀, s₁⟩) = (⟨q′, 1⟩, ⟨s₀, s′₁⟩, R), if δ(q, s₁) = (q′, s′₁, L);
δ′(⟨q, 1⟩, ⟨s₀, s₁⟩) = (⟨q′, 1⟩, ⟨s₀, s′₁⟩, L), if δ(q, s₁) = (q′, s′₁, R).

[Note that M′ must move its head in the opposite direction to that of M.]

This completes the formal description of M′. Note that the construction of M′ from M is a purely mechanical process.

Most people will agree that specifying simulations at this level of detail is quite tedious. Although it is instructive to carry out one or two proofs with this formality, little is gained from repeating the exercise many times. Indeed, the formalism merely serves to obscure the idea of the proof. In future, then, we shall present proofs more informally; our growing intuition about Turing machines will assure us that the details could, in principle, be filled in on request.
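Since this is the one simulation spelled out in full, it is also the one that can profitably be read as code. The following Haskell sketch renders rules (1)-(5) as a definition by patterns, echoing the remark above; the concrete types and names (Track, UpperSym, delta' and so on) are illustrative assumptions rather than part of the notes, and the transition function δ of M is assumed to be given as a total function.

```haskell
-- A sketch (with illustrative names) of the states and tape symbols of M',
-- and of delta' assembled mechanically from delta following rules (1)-(5).
data Dir = L | R deriving (Show, Eq)

type Q   = String            -- states of M (illustrative choice)
type Sym = Char              -- tape symbols of M (illustrative choice)

data Track    = Lower | Upper deriving (Show, Eq)          -- the 0/1 tag
data Q'       = InitQ' | FinalQ' | Pair Q Track deriving (Show, Eq)
data UpperSym = Dollar | U Sym deriving (Show, Eq)         -- $ or an ordinary symbol
type Sym'     = (Sym, UpperSym)                            -- (lower track, upper track)

delta' :: (Q -> Sym -> (Q, Sym, Dir))   -- delta of M, assumed total
       -> Q -> Q                        -- q_I and q_F of M
       -> Q' -> Sym' -> (Q', Sym', Dir)
delta' delta qI qF = go
  where
    -- (1) first move: mark the fold with $ while simulating M's first step
    go InitQ' (s, _) =
      case delta qI s of
        (q, s', R) -> (Pair q Lower, (s', Dollar), R)
        (q, s', L) -> (Pair q Upper, (s', Dollar), R)
    -- (2) M has entered its accepting state: accept on the following move
    go (Pair q _) (s0, s1) | q == qF = (FinalQ', (s0, s1), R)
    -- (3) scanning square 0, recognised by the $ on the upper track
    go (Pair q _) (s, Dollar) =
      case delta q s of
        (q', s', R) -> (Pair q' Lower, (s', Dollar), R)
        (q', s', L) -> (Pair q' Upper, (s', Dollar), R)
    -- (4) positive index: work on the lower track, same direction as M
    go (Pair q Lower) (s0, s1) =
      case delta q s0 of
        (q', s0', d) -> (Pair q' Lower, (s0', s1), d)
    -- (5) negative index: work on the upper track, direction reversed
    go (Pair q Upper) (s0, U s1) =
      case delta q s1 of
        (q', s1', d) -> (Pair q' Upper, (s0, U s1'), opposite d)
    -- remaining cases (e.g. the final state) are left unspecified in the notes
    go st sym = (st, sym, R)
    opposite L = R
    opposite R = L
```

As in the notes, earlier clauses take precedence over later ones, which is exactly Haskell's top-to-bottom pattern matching; applying delta' to the transition function of any doubly infinite machine yields the transition table of the simulating singly infinite machine, which is the sense in which the construction is mechanical.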

4.2. Several tapes. A k-tape Turing machine has a finite control and k heads, each scanning a separate (singly infinite) tape. In a single transition, the machine performs the following sequence of actions, depending on the current state of the finite control and on the k tape symbols scanned by the tape heads:

1. the finite control moves to a new state;
2. each tape head prints a new symbol on the tape square it currently scans;
3. each tape head moves (independently) one square left or right.

Initially, the input appears left-justified on the first tape, and the other tapes are all blank. All tape heads are positioned at the leftmost square of their respective tapes. The most important detail in the formal definition is that the transition function δ maps Q × Γ^k to Q × (Γ × {L, R})^k, which says that each transition is determined by the state and the k-tuple of currently scanned symbols, with independent actions (symbol written and direction of movement) for each of the tapes.

Lemma 4.2 If a language L is accepted by a k-tape Turing machine for some k, then it is accepted by a single-tape Turing machine.

Proof. Let M be a k-tape Turing machine which accepts the language L. We shall construct a single-tape Turing machine M′ which simulates the operation of M. We shall imagine that M′'s tape is divided into 2k tracks, two tracks for each tape of M. One track is used to record the contents of the corresponding tape of M. The other track is blank except that one square is marked by a special head-marker symbol; this head marker points out the square which is being scanned by the corresponding tape head of M. The first action of M′ is to print a composite symbol, containing k of these head markers, onto the first square of its tape. [Questions: formally, what is the tape alphabet of M′? the input alphabet? the blank symbol?]

[Figure 4: Simulation of a 2-tape machine.]

Figure 4 is intended to convey the arrangement of the single tape of M′ and its correspondence with the k tapes of M. In simulating a single move of M, the machine M′ has to perform a sequence of moves. The current state of M is always recorded in M′'s finite control. Each move of M is simulated by performing a sweep from the leftmost head marker to the rightmost, followed by a sweep from the rightmost head marker back to the leftmost. In the left-to-right sweep, M′ makes a note, in its finite control, of all k symbols scanned by the simulated machine M. By keeping a note of the number of head markers encountered during the sweep (again in the finite control), the sweep can be terminated as soon as the rightmost head marker has been reached. The machine M′ now has all the information necessary to simulate a single step of M; this it does during the right-to-left sweep. As each head marker is encountered, M′ alters the tape symbol corresponding to that marker, and moves the marker left or right as appropriate. Again, by counting the head markers as they are encountered, the sweep can be terminated as soon as the leftmost marker has been reached. Finally, M′ moves to a new state to reflect the new state of M. If the new state of M is an accepting state then M′ accepts.

Lemma 4.2 will prove useful to us later. It is often easier to construct a Turing machine to perform a certain task if we allow ourselves a number of work tapes, each with a specified function. Lemma 4.2 assures us that any machine we construct in this way could, in principle, be transformed into a one-tape machine.
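The bracketed questions in the proof can be answered in several equivalent ways; one possible rendering (a sketch with illustrative names, shown for k = 2) is the following.

```haskell
-- Hypothetical rendering of one square of the 2k-track tape of M', for k = 2.
type Sym = Char                                -- tape symbols of M (illustrative)

data Mark = Head | NoHead deriving (Show, Eq)  -- entry on a head-marker track

-- Each square of M' records, for every tape of M, the symbol on that tape
-- together with a flag saying whether that tape's head is currently here;
-- so the tape alphabet of M' can be taken to be (Gamma x {Head, NoHead})^k.
type Square = ((Sym, Mark), (Sym, Mark))

-- The blank symbol of M' pairs M's blank with "no head here" on every track.
blankSquare :: Square
blankSquare = ((' ', NoHead), (' ', NoHead))
```

For the input alphabet one can proceed as in the footnote to Lemma 4.1: either take the composite squares whose first track carries an input symbol and whose other tracks are blank, or add a pre-processing phase that converts a plain input string over Σ into this composite form before the simulation proper begins.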

4.3. Other variants. A two-dimensional Turing machine is one that has a two-dimensional array, or page, of squares in place of the one-dimensional array of squares we have been calling a tape. The page has a definite top-left corner, but is unbounded below and to the right. The transitions of the two-dimensional machine are similar to those of the conventional one-dimensional machine, except that the tape head can now move one square up, down, left, or right at each step, rather like a crippled king in chess. [Footnote 15: In chess, a king can also move along the diagonals.] The two-dimensional machine corresponds more closely to hand computation using a sheet of paper; however, it has no greater power than a one-dimensional machine:

Lemma 4.3 If a language L is accepted by a two-dimensional Turing machine, then L is accepted by a one-dimensional Turing machine.

Again the proof, which is omitted, involves simulating the computation of an arbitrary two-dimensional machine on a one-dimensional machine.

Finally, we might attempt to increase the power of the basic model by adding extra tape heads. This mirrors the ability of human beings to shuffle the pages they are working on, and to observe different symbols in widely separated parts of a manuscript simultaneously, or at least in quick succession. A k-head Turing machine is similar to the basic model except that it is equipped with k heads, each moving independently of the others. Using a simulation similar to that employed in the proof of Lemma 4.2, it is not difficult to show the following:

Lemma 4.4 If a language L is accepted by a k-head Turing machine, then L is accepted by a one-head Turing machine.

It is reassuring that the model we have chosen to work with is robust, i.e., not affected by minor modifications to the definition. Indeed, we would have to discard the model if this turned out not to be the case.
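Although the proof of Lemma 4.3 is omitted, one possible ingredient of such a simulation (an assumption of this sketch, not a construction given in the notes) is a computable bijection from page coordinates to positions on a one-dimensional tape, for instance the diagonal pairing function:

```haskell
-- Hypothetical sketch: Cantor's diagonal pairing sends the page square in
-- row r and column c (both counted from 0) to a single tape position.
pairIndex :: Integer -> Integer -> Integer
pairIndex r c = (r + c) * (r + c + 1) `div` 2 + c
```

A move of the two-dimensional head changes r or c by one, so the simulating one-dimensional machine need only compute the new position and carry its head there; filling in the details is routine but tedious, in the spirit of the earlier simulations.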