More About Turing Machines: Programming Tricks, Restrictions, Extensions, Closure Properties

Overview At first, the TM doesn't look very powerful. Can it really do anything a computer can? We'll discuss programming tricks to convince you that it can simulate a real computer. 2

Overview (2) We need to study restrictions on the basic TM model (e.g., tapes infinite in only one direction). Assuming a restricted form makes it easier to talk about simulating arbitrary TM's. That's essential to exhibit a language that is not recursively enumerable. 3

Overview (3) We also need to study generalizations of the basic model. Needed to argue there is no more powerful model of what it means to compute. Example: A nondeterministic TM with 50 six-dimensional tapes is no more powerful than the basic model. 4

Programming Trick: Multiple Tracks Think of tape symbols as vectors with k components. Each component chosen from a finite alphabet. Makes the tape appear to have k tracks. Let input symbols be blank in all but one track. 5

Picture of Multiple Tracks The head, in state q, scans a 3-track tape: the column [0, B, B] represents the input symbol 0, the column [B, B, B] represents the blank, and a column such as [X, Y, Z] is one tape symbol. 6
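
To make the multiple-tracks trick concrete, here is a minimal Python sketch (my own illustration; which track carries the input is an arbitrary choice): a cell of a 3-track tape is just a 3-tuple, and the input is written on one track with blanks on the others.

    # Illustrative sketch: cells of a 3-track tape as 3-tuples; 'B' is the blank.
    B = 'B'

    def embed_input(w):
        """Put the input on track 1 and blanks on tracks 2 and 3."""
        return [(a, B, B) for a in w]

    tape = embed_input("010")
    print(tape[0])        # ('0', 'B', 'B') -- the tape symbol [0, B, B]
    print((B, B, B))      # the blank of the 3-track machine, [B, B, B]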

Programming Trick: Marking A common use for an extra track is to mark certain positions. Almost all cells hold B (blank) in this track, but several hold special symbols (marks) that allow the TM to find particular places on the tape. 7

Marking The head, in state q, scans a two-track tape holding the cells [B, W], [X, Y], [B, Z]: Y is marked (it has X above it on the mark track), while W and Z are unmarked. 8

Programming Trick: Caching in the State The state can also be a vector. First component is the control state. Other components hold data from a finite alphabet. 9

Example: Using These Tricks This TM doesn't do anything terribly useful; it copies its input w infinitely. Control states: q: Mark your position and remember the input symbol seen. p: Run right, remembering the symbol and looking for a blank. Deposit symbol. r: Run left, looking for the mark. 10

Example (2) States have the form [x, Y], where x is q, p, or r and Y is 0, 1, or B. Only p uses 0 and 1. Tape symbols have the form [U, V]. U is either X (the mark) or B. V is 0, 1 (the input symbols) or B. [B, B] is the TM blank; [B, 0] and [B, 1] are the inputs. 11

The Transition Function Convention: a and b each stand for either 0 or 1. δ([q,B], [B,a]) = ([p,a], [X,a], R). In state q, copy the input symbol under the head (i.e., a) into the state. Mark the position read. Go to state p and move right. 12

Transition Function (2) δ([p,a], [B,b]) = ([p,a], [B,b], R). In state p, search right, looking for a blank symbol (not just B in the mark track). δ([p,a], [B,B]) = ([r,B], [B,a], L). When you find a B, replace it by the symbol (a) carried in the cache. Go to state r and move left. 13

Transition Function (3) δ([r,B], [B,a]) = ([r,B], [B,a], L). In state r, move left, looking for the mark. δ([r,B], [X,a]) = ([q,B], [B,a], R). When the mark is found, go to state q and move right. But remove the mark from where it was. q will place a new mark and the cycle repeats. 14
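
The five rules above are easy to check by running them. Below is a minimal Python simulation (a sketch under my own encoding: composite states and tape symbols are Python pairs, the tape is a dictionary, and the step budget is arbitrary); on input 01 the data track keeps growing 010, 0101, 01010, ...

    # Sketch of the copying TM: states are pairs (x, Y), tape symbols pairs (U, V).
    B, X = 'B', 'X'

    def delta(state, symbol):
        x, y = state                  # control component, cached symbol
        u, v = symbol                 # mark track, data track
        if x == 'q' and u == B and v in '01':
            return ('p', v), (X, v), +1      # cache v, mark the cell, go right
        if x == 'p' and u == B and v in '01':
            return ('p', y), (B, v), +1      # keep searching right for a blank
        if x == 'p' and (u, v) == (B, B):
            return ('r', B), (B, y), -1      # deposit the cached symbol, go left
        if x == 'r' and u == B and v in '01':
            return ('r', B), (B, v), -1      # keep searching left for the mark
        if x == 'r' and u == X and v in '01':
            return ('q', B), (B, v), +1      # remove the mark, start a new cycle
        return None

    def run(w, steps=40):
        tape = {i: (B, a) for i, a in enumerate(w)}    # semi-infinite tape
        state, head = ('q', B), 0
        for _ in range(steps):
            move = delta(state, tape.get(head, (B, B)))
            if move is None:
                break
            state, written, d = move
            tape[head] = written
            head += d
        print(''.join(tape[i][1] for i in range(max(tape) + 1)))

    run("01")       # prints the data track: 0101010101 after 40 steps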

Simulation of the TM (slides 15-21) A sequence of snapshots of the two-track tape on input 01. The machine starts in state [q, B] with data track 0 1 B B ...; it marks cell 0 and caches the 0 (state [p, 0]), runs right to the first blank and deposits the 0 (data track 0 1 0 B ...), runs left in state [r, B] back to the mark, removes it, and re-enters state [q, B] one cell to the right, where the next cycle marks the 1 and caches it (state [p, 1]).

Semi-infinite Tape We can assume the TM never moves left from the initial position of the head. Let this position be 0; positions to the right are 1, 2, ... and positions to the left are -1, -2, .... New TM has two tracks. Top holds positions 0, 1, 2, .... Bottom holds a marker and positions -1, -2, .... 22

Simulating Infinite Tape by Semi-infinite Tape The state carries a U/L flag that remembers whether it is simulating the upper or lower track; directions are reversed for the lower track. The upper track holds positions 0, 1, 2, 3, ...; the lower track holds positions -1, -2, -3, ..., with a * put in its leftmost cell at the first move. Nothing else needs to be done, because the lower-track cells are initially B. 23
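
A tiny Python sketch of the address folding (my own illustration; 'upper' and 'lower' are arbitrary labels for the two tracks):

    # Cell i of the two-way infinite tape lives on the upper track if i >= 0,
    # and on the lower track at distance -i from the left end if i < 0.
    def fold(i):
        return ('upper', i) if i >= 0 else ('lower', -i)

    for i in (-3, -1, 0, 2):
        print(i, '->', fold(i))
    # While the flag in the state says 'lower', every L move of the simulated
    # TM becomes an R move of the simulator, and vice versa.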

More Restrictions Read in Text Two stacks can simulate one tape. One holds positions to the left of the head; the other holds positions to the right. In fact, by a clever construction, the two stacks can be restricted to counters = only two stack symbols, one of which can appear only at the bottom. Factoid: Invented by Pat Fischer, whose main claim to fame is that he was a victim of the Unabomber. 24
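
The two-stack view of a tape is easy to sketch in Python (an illustration of the first point only, not of the counter construction): the left stack holds everything to the left of the head, and the right stack holds the scanned cell and everything to its right.

    B = 'B'

    class TwoStackTape:
        def __init__(self, w):
            self.left = []                     # cells left of the head
            self.right = list(reversed(w))     # scanned cell on top
        def read(self):
            return self.right[-1] if self.right else B
        def write(self, a):
            if self.right:
                self.right[-1] = a
            else:
                self.right.append(a)
        def move_right(self):                  # push the scanned cell onto the left stack
            self.left.append(self.right.pop() if self.right else B)
        def move_left(self):                   # pop it back
            self.right.append(self.left.pop() if self.left else B)

    t = TwoStackTape("abc")
    t.move_right()
    t.write('X')
    print(t.read())     # 'X' -- the cell that used to hold 'b'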

Extensions More general than the standard TM. But still only able to define the RE languages. 1. Multitape TM. 2. Nondeterministic TM. 3. Store for key-value pairs. 25

Multitape Turing Machines Allow a TM to have k tapes for any fixed k. Move of the TM depends on the state and the symbols under the head for each tape. In one move, the TM can change state, write symbols under each head, and move each head independently. 26

Simulating k Tapes by One Use 2k tracks. Each tape of the k-tape machine is represented by a track. The head position for each track is represented by a mark on an additional track. 27

Picture of Multitape Simulation The one-tape machine, in state q, uses four tracks: one track holds tape 1's contents (... A B C A C B ...) with an X on the track above it marking the head of tape 1, and another holds tape 2's contents (... U V U U W V ...) with an X marking the head of tape 2. 28
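
A sketch of the 2k-track encoding in Python (my own layout, matching the picture: for each simulated tape, one track marks its head with X and the next holds its contents):

    B, X = 'B', 'X'

    def encode(tapes, heads):
        """tapes: k strings; heads: k head positions.  Returns a list of 2k-tuples."""
        width = max(len(t) for t in tapes)
        cells = []
        for i in range(width):
            cell = []
            for t, h in zip(tapes, heads):
                cell.append(X if i == h else B)           # head-marker track
                cell.append(t[i] if i < len(t) else B)    # contents track
            cells.append(tuple(cell))
        return cells

    one_tape = encode(["ABCACB", "UVUUWV"], [1, 4])
    print(one_tape[1])    # ('X', 'B', 'B', 'V') -- tape 1's head is in this cell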

Nondeterministic TM's Allow the TM to have a choice of move at each step. Each choice is a state-symbol-direction triple, as for the deterministic TM. The TM accepts its input if any sequence of choices leads to an accepting state. 29

Simulating an NTM by a DTM The DTM maintains on its tape a queue of ID's of the NTM. A second track is used to mark certain positions: 1. A mark for the ID at the head of the queue. 2. A mark to help copy the ID at the head and make a one-move change. 30

Picture of the DTM Tape The tape holds the queue of ID's, separated by # markers: ID0 # ID1 # ... # IDk # IDk+1 # ... # IDn #, followed by the new ID being built at the rear of the queue. One mark (X) identifies IDk, the current front of the queue; a second mark (Y) shows where IDk is being copied with a one-move change. 31

Operation of the Simulating DTM The DTM finds the ID at the current front of the queue. It looks for the state in that ID so it can determine the moves permitted from that ID. If there are m possible moves, it creates m new ID's, one for each move, at the rear of the queue. 32

Operation of the DTM (2) The m new ID's are created one at a time. After all are created, the marker for the front of the queue is moved one ID toward the rear of the queue. However, if a created ID has an accepting state, the DTM instead accepts and halts. 33

Why the NTM -> DTM Construction Works There is an upper bound, say k, on the number of choices of move of the NTM for any state/symbol combination. Thus, any ID reachable from the initial ID by n moves of the NTM will be constructed by the DTM after constructing at most (k^(n+1) - k)/(k - 1) ID's, which is the sum k + k^2 + ... + k^n. 34

Why? (2) If the NTM accepts, it does so in some sequence of n choices of move. Thus the ID with an accepting state will be constructed by the DTM in some large number of its own moves. If the NTM does not accept, there is no way for the DTM to accept. 35
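
A Python sketch of the same breadth-first idea (an illustration under simplifying assumptions: configurations sit in a Python deque rather than on a tape, duplicate ID's are pruned, and acceptance is tested when an ID reaches the front of the queue):

    from collections import deque

    B = 'B'

    def ntm_accepts(delta, start, accept, w, max_ids=10_000):
        """Breadth-first search over the NTM's ID's (state, tape, head position)."""
        start_id = (start, tuple(w) or (B,), 0)
        queue, seen = deque([start_id]), {start_id}
        while queue and len(seen) < max_ids:
            state, tape, head = queue.popleft()          # ID at the front of the queue
            if state == accept:
                return True
            for nstate, write, d in delta.get((state, tape[head]), ()):
                ntape = list(tape)
                ntape[head] = write
                nhead = head + d
                if nhead < 0:                            # fell off the left end
                    continue
                if nhead == len(ntape):
                    ntape.append(B)                      # extend the tape with a blank
                new_id = (nstate, tuple(ntape), nhead)
                if new_id not in seen:                   # enqueue at the rear
                    seen.add(new_id)
                    queue.append(new_id)
        return False

    # Toy NTM: scanning right, on a 1 it may either keep scanning or accept.
    delta = {('q', '0'): [('q', '0', +1)],
             ('q', '1'): [('q', '1', +1), ('acc', '1', +1)]}
    print(ntm_accepts(delta, 'q', 'acc', '0010'))   # True
    print(ntm_accepts(delta, 'q', 'acc', '0000'))   # False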

Taking Advantage of Extensions We now have a really good situation. When we discuss construction of particular TM's that take other TM's as input, we can assume the input TM is as simple as possible. E.g., one, semi-infinite tape, deterministic. But the simulating TM can have many tapes, be nondeterministic, etc. 36

Real Computers Recall that, since a real computer has finite memory, it is in a sense weaker than a TM. Imagine a computer with an infinite store for name-value pairs. Generalizes an address space. 37

Simulating a Name-Value Store by a TM The TM uses one of several tapes to hold an arbitrarily large sequence of name-value pairs in the format #name*value#name*value#.... Mark, using a second track, the left end of the sequence. A second tape can hold a name whose value we want to look up. 38

Lookup Starting at the left end of the store, compare the lookup name with each name in the store. When we find a match, take what follows between the * and the next # as the value. 39
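
A Python sketch of lookup over this store format (the layout #name*value#...# is taken from the slides; treating the tape as a Python string, and assuming names and values contain no # or *, are my simplifications):

    def lookup(store, name):
        """Scan left to right; return the value after '*' for a matching name, else None."""
        i = 0
        while i < len(store) - 1:
            # store[i] is the '#' that starts a pair
            j = store.index('*', i + 1)      # end of the name
            k = store.index('#', j + 1)      # end of the value
            if store[i + 1:j] == name:
                return store[j + 1:k]
            i = k                            # the next pair starts at this '#'
        return None

    store = "#x*15#y*7#width*300#"
    print(lookup(store, "y"))         # 7
    print(lookup(store, "height"))    # None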

Insertion Suppose we want to insert name-value pair (n, v), or replace the current value associated with name n by v. Perform lookup for name n. If not found, add n*v# at the end of the store. 40

Insertion (2) If we find #n*v'#, we need to replace v' by v. If v is shorter than v', you can leave blanks to fill out the replacement. But if v is longer than v', you need to make room. 41

Insertion (3) Use a third tape to copy everything from the first tape at or to the right of v'. Mark the position of the * to the left of v' before you do. Copy from the third tape to the first, leaving enough room for v. Write v where v' was. 42
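
Continuing the same string-based sketch, insertion either appends a new pair or rebuilds the store around the new value; the slice store[k:] plays the role of the material the TM parks on the third tape while making room.

    def insert(store, name, value):
        """Replace name's value, or append #name*value at the end of the store."""
        i = 0
        while i < len(store) - 1:
            j = store.index('*', i + 1)
            k = store.index('#', j + 1)
            if store[i + 1:j] == name:
                # store[k:] is the tail the TM would copy to the third tape
                return store[:j + 1] + value + store[k:]
            i = k
        return store + name + '*' + value + '#'     # not found: add at the end

    s = "#x*15#y*7#"
    s = insert(s, "y", "1000")     # longer value: the tail is shifted right
    s = insert(s, "z", "3")        # new pair appended
    print(s)                       # #x*15#y*1000#z*3#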

Closure Properties of Recursive and RE Languages Both closed under union, concatenation, star, reversal, intersection, inverse homomorphism. Recursive closed under difference, complementation. RE closed under homomorphism. 43

Union Let L1 = L(M1) and L2 = L(M2). Assume M1 and M2 are single-semi-infinite-tape TM's. Construct a 2-tape TM M to copy its input onto the second tape and simulate the two TM's M1 and M2, each on one of the two tapes, in parallel. 44

Union (2) Recursive languages: If M1 and M2 are both algorithms, then M will always halt in both simulations. Accept if either accepts. RE languages: accept if either accepts, but you may find both TM's run forever without halting or accepting. 45
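
A sketch of the parallel simulation in Python (an analogy rather than a real 2-tape TM: each recognizer is modeled as a generator that yields None while still running and finally yields True for accept or False for halt-without-accept, and the step budget stands in for "may run forever"):

    def union_accepts(run1, run2, w, max_steps=10_000):
        g1, g2 = run1(w), run2(w)
        done1 = done2 = None
        for _ in range(max_steps):
            if done1 is None:
                done1 = next(g1)
                if done1:                          # M1 accepted
                    return True
            if done2 is None:
                done2 = next(g2)
                if done2:                          # M2 accepted
                    return True
            if done1 is False and done2 is False:  # both halted without accepting
                return False
        return False                               # undecided within the budget

    # Toy stand-ins: accepts even-length strings / loops on every input.
    def even_length(w):
        for _ in w:
            yield None
        yield len(w) % 2 == 0

    def loops_forever(w):
        while True:
            yield None

    print(union_accepts(even_length, loops_forever, "abab"))   # True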

Picture of Union/Recursive M feeds its input w to both M1 and M2. M accepts if either M1 or M2 accepts (OR of the Accept outputs) and rejects only if both reject (AND of the Reject outputs). Remember: reject = halt without accepting. 46

Picture of Union/RE M runs M1 and M2 on input w in parallel and accepts if either accepts (OR of the two Accept signals); there is no Reject output. 47

Intersection/Recursive Same Idea M runs M1 and M2 on input w; it accepts if both accept (AND of the Accept outputs) and rejects if either rejects (OR of the Reject outputs). 48

Intersection/RE M runs M1 and M2 on input w and accepts only if both accept (AND of the Accept signals). 49

Difference, Complement Recursive languages: both TM's will eventually halt. Accept if M1 accepts and M2 does not. Corollary: Recursive languages are closed under complementation. RE languages: can't do it; M2 may never halt, so you can't be sure the input is in the difference. 50

Concatenation/RE Let L1 = L(M1) and L2 = L(M2). Assume M1 and M2 are single-semi-infinite-tape TM's. Construct a 2-tape nondeterministic TM M: 1. Guess a break in input w = xy. 2. Move y to the second tape. 3. Simulate M1 on x, M2 on y. 4. Accept if both accept. 51

Concatenation/Recursive Can't use an NTM. Systematically try each break w = xy. M1 and M2 will eventually halt for each break. Accept if both accept for any one break. Reject if all breaks have been tried and none leads to acceptance. 52
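
For the recursive case, the "systematically try each break" loop is essentially one line of Python (deciders modeled as plain predicates that always return; the example languages are toy assumptions):

    def concat_decider(in_L1, in_L2, w):
        """Accept w iff some break w = x + y has x in L1 and y in L2."""
        return any(in_L1(w[:i]) and in_L2(w[i:]) for i in range(len(w) + 1))

    in_L1 = lambda x: all(c == 'a' for c in x)    # L1 = a*
    in_L2 = lambda y: all(c == 'b' for c in y)    # L2 = b*
    print(concat_decider(in_L1, in_L2, "aabbb"))  # True  ('aa' + 'bbb')
    print(concat_decider(in_L1, in_L2, "aba"))    # False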

Star Same ideas work for each case. RE: guess many breaks, accept if M1 accepts each piece. Recursive: systematically try all ways to break the input into some number of pieces. 53
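
The recursive case for star, sketched the same way (a decider for L modeled as a Python predicate; memoization keeps the systematic search over break points from repeating work):

    from functools import lru_cache

    def star_decider(in_L, w):
        """Accept w iff w is empty or can be cut into non-empty pieces, each in L."""
        @lru_cache(maxsize=None)
        def ok(i):                      # can w[i:] be split into pieces of L?
            if i == len(w):
                return True
            return any(in_L(w[i:j]) and ok(j) for j in range(i + 1, len(w) + 1))
        return ok(0)

    in_L = lambda x: x in ("ab", "ba")     # a toy decider for a finite L
    print(star_decider(in_L, "abba"))      # True  ('ab' + 'ba')
    print(star_decider(in_L, "aab"))       # False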

Reversal Start by reversing the input. Then simulate the TM for L, so as to accept w if and only if w^R is in L. Works for either Recursive or RE languages. 54

Inverse Homomorphism Apply h to input w. Simulate TM for L on h(w). Accept w iff h(w) is in L. Works for Recursive or RE. 55
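
In Python the whole inverse-homomorphism construction is just "apply h, then ask the machine for L" (the homomorphism table and the decider for L below are toy assumptions):

    def h(w, table):
        """Apply a homomorphism given symbol by symbol."""
        return ''.join(table[c] for c in w)

    table = {'a': '01', 'b': ''}          # a toy homomorphism
    in_L = lambda x: x == '0101'          # a toy decider: L = {0101}

    w = "abab"
    print(in_L(h(w, table)))              # True, so w is in h^{-1}(L)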

Homomorphism/RE Let L = L(M1). Design an NTM M to take input w and guess an x such that h(x) = w. M accepts whenever M1 accepts x. Note: this won't work for Recursive languages. 56