CS 125 Section #10 (Un)decidability and Probability November 1, 2016

1 Countability

Recall that a set $S$ is countable (either finite or countably infinite) if and only if there exists a surjective mapping $\mathbb{N} \to S$, i.e. if and only if we can enumerate the elements of $S$. This means that the following are countable:

- Any subset of a countable set.
- A countable union of countable sets.
- The direct product of two countable sets.
- The image of a countable set under a set function.

Exercise. Are the following sets countable or not?

1. The set of all infinite binary sequences, $\{0,1\}^{\mathbb{N}}$.
2. The set of real numbers $\mathbb{R}$.
3. The set of rational numbers $\mathbb{Q}$.
4. The set of all English words (that is, all finite strings that can be written using the English alphabet).
5. The set of all English sentences.

Solution.

1. Uncountable, by a diagonalization argument.
2. Uncountable; this is (almost) the same set as the previous problem.
3. Countable; $\mathbb{Q}$ is the image of the countable set $\mathbb{Z} \times (\mathbb{N} \setminus \{0\})$ under the map $(n, m) \mapsto n/m$.
4. Countable; this is the countable union of the sets $S_i$, where $S_i$ is the set of words of length $i$ and hence is finite for each $i$.
5. Countable; this is similar to the previous problem, but we take the union over sentence length rather than word length.

Exercise. You are in a spaceship $A$ standing still at the origin of $\mathbb{R}^3$. There is an enemy spaceship $B$ somewhere in space to which you wish to attach a spy probe. This works by launching the probe in such a way that it ends up hitting $B$; there is also some constant amount of time $T$ you must wait between launching two probes. Suppose your intelligence is intelligent enough to know that $B$ is moving with constant velocity $v$ whose components are rational numbers and, moreover, that $B$ is at some point $p$ with rational coordinates at some point in time, but not intelligent enough to know $v$ or $p$. Fortunately, you have a countably infinite supply of spy probes. Show that you have an algorithm that guarantees success in finite time.
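
Not part of the handout, but a minimal sketch of the enumeration idea behind answer 3, written in Python for concreteness: dovetail through pairs of integers and emit each rational once. The function name is invented for illustration. The same dovetailing over six-tuples of rationals enumerates all candidate pairs $(v, p) \in \mathbb{Q}^6$, which is exactly what the solution that follows needs.

from fractions import Fraction
from itertools import count

def enumerate_rationals():
    """A surjection from N onto Q, realized as a generator: stage n walks all
    pairs (a, b) with |a| <= n and 1 <= b <= n, skipping values already produced."""
    seen = set()
    for n in count(1):
        for a in range(-n, n + 1):
            for b in range(1, n + 1):
                q = Fraction(a, b)
                if q not in seen:
                    seen.add(q)
                    yield q

# The first few rationals in this enumeration: -1, 0, 1, -2, -1/2, 1/2, 2, ...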

Solution. For each possible combination of $v$ and $p$ and for each launch time $t$, there is a velocity $v_s$ such that a spy probe launched at time $t$ with velocity $v_s$ hits $B$. It remains to observe that $(v, p) \in \mathbb{Q}^6$, which is countable: enumerate the candidate pairs and launch one probe per candidate (waiting time $T$ between launches), aiming as if that candidate were correct. The probe launched for the correct candidate hits $B$, so we succeed in finite time.

This is useful because the class of decidable languages is countable, as is the class of recognizable languages; consequently, because the number of languages is uncountable, we know that there exist undecidable and unrecognizable languages. In fact, most languages are both undecidable and unrecognizable.

2 Decidability and Undecidability

Recall that:

- A Turing Machine $M$ decides a language $L$ if it halts on every input and $L$ is the set of words for which $M$ outputs 1. A language is decidable if there is a TM for which this is the case.
- A Turing Machine $M$ recognizes a language $L$ if $L$ is the set of words on which $M$ halts with output 1 ($M$ need not halt on words outside $L$). A language is recognizable if there is a TM for which this is the case.

2.1 Diagonalization

Theorem 1. The languages $A_{\mathrm{TM}}$ and $A_{\mathrm{WR}}$, both of the form $\{\langle M, w\rangle : M \text{ accepts the input } w\}$ where $M$ is a TM or a word-RAM respectively, are not decidable.

Proof. (Diagonalization Argument) Assume $\{\langle M, w\rangle : M \text{ accepts the input } w\}$ is decidable. Then the language $D = \{\langle M\rangle : M \text{ accepts } \langle M\rangle\}$ is decidable, hence its complement $\overline{D} = \{\langle M\rangle : M \text{ does not accept } \langle M\rangle\}$ is decidable. Suppose $\overline{D}$ is decided by some TM $M_1$. Then $\langle M_1\rangle \in \overline{D}$ iff $M_1$ accepts $\langle M_1\rangle$ (since $M_1$ decides $\overline{D}$) iff $\langle M_1\rangle \in D$ (by definition of $D$) iff $\langle M_1\rangle \notin \overline{D}$, which is a contradiction.

We similarly showed via reduction that $\mathrm{HALT}_{\mathrm{TM}}$, the problem of deciding whether a Turing Machine $M$ halts on a string $w$, is undecidable, even if $M$ or $w$ (but not both!) is fixed.

Exercise. Show that a nonempty language $L \subseteq \Sigma^*$ is r.e. iff there is a computable function $f : \mathbb{N} \to \Sigma^*$ whose image is exactly $L$. That is, we can enumerate the elements of $L$ as $\{f(0), f(1), f(2), \ldots\}$ for some recursive (= computable) function $f$.

Solution.

($\Rightarrow$) If $L$ is r.e., we know there is a Turing machine $M$ that accepts exactly the strings in $L$. If $L$ is a finite language, it is straightforward to make an enumerator $M'$ for $L$. Otherwise, we can enumerate all elements of $L$ using the following enumerator $M'$: choose some computable ordering on all strings in $\Sigma^*$, and for $i$ going from 1 to infinity, let $M'$ run $M$ on each of the first $i$ strings for $i$ steps. Then the $n$-th string that $M'$ finds to be accepted in the course of this algorithm is the value of $f(n)$ (repetitions are harmless, since we only need the image of $f$ to be exactly $L$).

($\Leftarrow$) Suppose we have an enumerator $M'$ for $L$. Then a recognizer $M$ for $L$ works as follows: on input $x$, compute $f(0), f(1), \ldots$ in order, and accept if $f(n) = x$ for some $n$.
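
A minimal Python sketch (not from the handout) of both directions of this equivalence. Turing machines are modelled as Python callables; accepts_within(w, steps), standing in for "simulate $M$ on $w$ for the given number of steps and report whether it has accepted", is an assumed helper.

from itertools import count, islice, product

def all_strings(alphabet="01"):
    """A computable ordering of Sigma*: length-lexicographic enumeration."""
    yield ""
    for n in count(1):
        for tup in product(alphabet, repeat=n):
            yield "".join(tup)

def enumerator(accepts_within, alphabet="01"):
    """(=>) Dovetailing enumerator M': at stage i, simulate the recognizer on the
    first i strings for i steps each, emitting each string found to be accepted.
    Repetitions across stages are suppressed here, though they would be harmless."""
    emitted = set()
    for i in count(1):
        for w in islice(all_strings(alphabet), i):
            if w not in emitted and accepts_within(w, i):
                emitted.add(w)
                yield w   # the n-th emitted string is f(n)

def recognizer(f):
    """(<=) Recognizer built from an enumeration f: on input x, compute f(0), f(1), ...
    and accept if some f(n) equals x; on inputs outside L this loops forever, which
    is exactly why the result is recognizable rather than decidable."""
    def M(x):
        for n in count():
            if f(n) == x:
                return True
    return M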

2.2 Reductions

Theorem 2. If $L_1 \leq_m L_2$ and $L_1$ is undecidable, then so is $L_2$.

Exercise. Reductions can be tricky to get the hang of, and as we have seen with NP-completeness, you want to avoid going the wrong way with them. In which of these scenarios does $L_1 \leq_m L_2$ provide useful information (and in those cases, what may we conclude)?

(a) $L_1$'s decidability is unknown and $L_2$ is undecidable.
(b) $L_1$'s decidability is unknown and $L_2$ is decidable.
(c) $L_1$ is undecidable and $L_2$'s decidability is unknown.
(d) $L_1$ is decidable and $L_2$'s decidability is unknown.

Solution.

(a) Nothing.
(b) $L_1$ is decidable. This is in Sipser.
(c) $L_2$ is undecidable. This is exactly Theorem 2 (equivalently, the contrapositive of (b)).
(d) Nothing.

Exercise. Let $S = \{\langle M\rangle : M \text{ is a TM such that } L(M) = \{\varepsilon\}\}$. Using mapping reductions, prove that $S$ is neither r.e. nor co-r.e.

Solution. To show $S$ is not r.e., we reduce from a non-r.e. language. One such is $\overline{A_{\mathrm{TM}}}$. The reduction sends a given Turing machine $M$ and string $x$ to a Turing machine $M_x$ which accepts $\varepsilon$ and, on any other input, runs $M$ on $x$ (accepting if $M$ accepts). Then observe that if $M$ does not accept $x$, $L(M_x) = \{\varepsilon\}$, and conversely, if $L(M_x) = \{\varepsilon\}$, then $M$ does not accept $x$.

To show that $S$ is not co-r.e., it suffices to reduce from a non-r.e. language to the complement of $S$. We reduce from $\overline{A_{\mathrm{TM}}}$ again: given $\langle M, x\rangle$, we build a Turing machine $M_x$ that rejects everything different from $\varepsilon$, and on $\varepsilon$ runs $M$ on $x$. Then if $M$ does not accept $x$, $L(M_x) = \emptyset$, and conversely, if $L(M_x) = \emptyset$, then $M$ does not accept $x$.

2.3 Rice's Theorem

Theorem 3 (Rice's theorem). Let $\mathcal{P}$ be any subset of the class of r.e. languages such that $\mathcal{P}$ and its complement are both nonempty. Then the language $L_{\mathcal{P}} = \{\langle M\rangle : L(M) \in \mathcal{P}\}$ is undecidable.

Intuitively, Rice's theorem states that Turing machines cannot test whether another Turing machine satisfies a (nontrivial) property of its language. For example, let $\mathcal{P}$ be the set of recursively enumerable languages that contain the string $a$. Then Rice's theorem says that there is no Turing machine which can decide whether a given Turing machine accepts $a$.
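
Rice's theorem already shows that $S$ itself is undecidable; the stronger claim in the exercise (neither r.e. nor co-r.e.) needs the explicit reductions above. Here is a minimal sketch, not from the handout, of the two machines built there, modelling TMs as Python callables purely for illustration; a real mapping reduction outputs TM descriptions, and these callables may of course fail to halt. The function names are invented.

def machine_for_not_re_part(M, x):
    """First reduction: M_x accepts epsilon outright and, on any other input,
    runs M on x.  So L(M_x) = {epsilon} exactly when M does not accept x."""
    def M_x(w):
        if w == "":
            return True
        return M(x)          # may loop forever if M loops on x
    return M_x

def machine_for_not_core_part(M, x):
    """Second reduction: this machine rejects every nonempty input and, on epsilon,
    runs M on x.  So its language is empty exactly when M does not accept x."""
    def M_x(w):
        if w != "":
            return False
        return M(x)          # may loop forever if M loops on x
    return M_x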

Cautionary note: Rice's theorem only tells you about the language accepted by the Turing machine, not about how the Turing machine performs its computation. For example, Rice's theorem is not applicable to the first exercise in the next section.

2.4 Exercises: are these languages decidable?

Exercise. $L = \{\langle M, x\rangle : \text{at some point in its computation on } x, M \text{ re-enters its start state}\}$.

Solution. Undecidable. Assume for the sake of contradiction that this language is decidable. We will show that a decider for this language can decide the halting problem. Given an input $\langle M, w\rangle$, modify $M$ so that the start state is split into two states, where the new start state has no incoming transitions except for those we will specifically add in this construction. Also, change the accept and reject states so that they are no longer accept/reject states, but just normal states that (effectively) $\varepsilon$-transition to the new start state. Finally, make new accept/reject states that are inaccessible. Now the modified machine re-enters its start state on $w$ iff the original machine would have halted on $w$. This construction is a finite transformation that a TM can carry out, so by combining it with a decider for the language in question, we could decide the halting problem. Contradiction.

Exercise. $L = \{\langle M\rangle : M \text{ on input } i \text{ outputs the } i\text{th digit of } \sqrt{2} \text{ in binary}\}$.

Solution. Undecidable. First, let $L'$ be the language consisting of the integers $i$ for which the $i$th binary digit of $\sqrt{2}$ is a 1. Then we have that $L = \{\langle M\rangle : L(M) = L'\}$. Note that $L'$ is r.e., since we can compute $\sqrt{2}$ to an arbitrary number of digits of precision using standard arithmetic. Consequently, by Rice's theorem, $L$ is not decidable.

Exercise. $L = \{\langle x, y\rangle : f(x) = y\}$, where $f$ is a fixed computable function.

Solution. Decidable. Given $\langle x, y\rangle$, compute $f(x)$ and compare it to $y$.

3 Probability Review

A discrete random variable $X$ taking values in some set $S$ can be described by the probabilities that it is equal to any particular $s \in S$; we write this as $P(X = s)$. Two random variables $X, Y$ are independent if their outcomes are unrelated, i.e. if for all possible pairs of outcomes $x, y$ we have $P((X, Y) = (x, y)) = P(X = x)P(Y = y)$. The expected value $E(X)$ of $X$ is the average value it takes on, weighted by probability:
$$E(X) = \sum_{s \in S} s \cdot P(X = s).$$
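
As a quick illustration (not from the handout) of this definition, here is the expectation computed directly from a finite distribution given as a value-to-probability map; the fair die matches the example that follows.

from fractions import Fraction

def expectation(dist):
    """E[X] = sum over s of s * P(X = s), for a finite distribution {value: probability}."""
    return sum(s * p for s, p in dist.items())

die = {s: Fraction(1, 6) for s in range(1, 7)}   # a fair six-sided die
print(expectation(die))                          # 7/2, i.e. 3.5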

For example, the expected value of a standard die is $\frac{1}{6}(1 + 2 + 3 + 4 + 5 + 6) = 3.5$.

As it turns out, it is true that $E(aX + bY) = aE(X) + bE(Y)$ even if $X$ and $Y$ are not independent (for example, you can easily verify that this is true when $X = Y$). This is known as linearity of expectation.

Two common types of random variables are:

- Geometric random variable: the number of tries until some event, which happens with probability $p$ on each try, occurs. When computing with geometric random variables, the following facts often come in handy:
$$\sum_{i=0}^{\infty} p^i = \frac{1}{1-p} \quad (\text{for } 0 < p < 1) \qquad \text{and} \qquad \sum_{i=0}^{n} p^i = \frac{1 - p^{n+1}}{1-p} \quad (\text{for all } p \neq 1).$$
- Binomial random variable: the number of heads given by $n$ random coin flips.

We can prove a few useful facts using the definitions above.

Exercise. Prove that for random variables $X$ which only take on nonnegative integer values,
$$E(X) = \sum_{j=0}^{\infty} P(X > j).$$

Solution. We manipulate the definition of the expectation:
$$E(X) = \sum_{k=0}^{\infty} k\, P(X = k) = \sum_{j=1}^{\infty} \sum_{k=j}^{\infty} P(X = k) = \sum_{j=1}^{\infty} P(X \geq j) = \sum_{j=0}^{\infty} P(X > j),$$
as wanted.

Exercise. Show that if some event happens with probability $p$ on each try, the expected number of tries we need in order to get that event to occur is $1/p$.

Solution. By the previous exercise, and noting that the probability that the event still has not happened after $i$ tries is $(1-p)^i$, the expectation is $\sum_{i=0}^{\infty} (1-p)^i = \frac{1}{p}$. Alternatively, the probability that the event happens on exactly the $i$th try is $(1-p)^{i-1}p$, so we can compute $E(\#\text{ of tries}) = \sum_{i=1}^{\infty} i(1-p)^{i-1}p$, which also evaluates to $\frac{1}{p}$.

Additionally, one useful inequality about random variables is Markov's inequality: for any nonnegative random variable $X$ and any $\lambda > 0$,
$$P(X > \lambda\, E(X)) < \frac{1}{\lambda}.$$

In other words, this indicates that the probability of $X$ being significantly larger than its expectation is small. For example, if an event happens with probability $p$ on each try, then Markov's inequality (with $\lambda = px$, since the expected number of tries is $1/p$) tells us that
$$P(\text{more than } x \text{ tries are needed for the event to happen}) < \frac{1}{px}.$$
(Chebyshev's inequality, which can be proved using Markov's inequality, gives a somewhat similar bound which also deals with the case that $X$ is significantly smaller than its expected value.)
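
A small numeric sanity check, not part of the handout: estimate the expected number of tries for $p = 1/4$ and compare the exact tail probability $P(\text{more than } x \text{ tries}) = (1-p)^x$ with the Markov bound $1/(px)$ above. The function name is invented for illustration.

import random

def tries_until_success(p, rng):
    """Sample a geometric random variable: the number of tries until an event
    of per-try probability p first occurs."""
    tries = 1
    while rng.random() >= p:
        tries += 1
    return tries

p = 0.25
rng = random.Random(0)
samples = [tries_until_success(p, rng) for _ in range(100_000)]
print(sum(samples) / len(samples))        # should be close to 1/p = 4

for x in (4, 8, 16, 32):
    exact = (1 - p) ** x                  # exact value of P(more than x tries)
    markov = 1 / (p * x)                  # the Markov bound from above
    print(x, round(exact, 4), round(markov, 4))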