5' → 3' Sensing Watson-Crick Finite Automata


Benedek Nagy
Department of Computer Science, Faculty of Informatics, University of Debrecen, Hungary

1 Introduction

DNA computers provide new paradigms of computing (Păun et al., 1998). They appeared at the end of the last century. In contrast, the theory of finite automata is well developed and intensively used in both theory and practice. Watson-Crick automata, introduced in (Freund et al., 1997), relate to both fields. They are important in the field of (DNA) computing and have an important relation to formal language and automata theory as well. Watson-Crick automata are (theoretical) devices that have two reading heads to scan the two strands of a DNA molecule and decide whether the molecule is acceptable (i.e., belongs to a desired class of languages). Enzymes slide along the backbones of DNA, usually from the 5' end to the 3' end of a strand; therefore 5' → 3' automata are of particular interest. We investigate several variations of Watson-Crick automata in which both heads read the doubled DNA strand in the 5' → 3' direction. In the sensing variation, the heads sense if they meet (i.e., their distance is less than a fixed small bound). The accepted language classes of these automata are analyzed and compared to well-known language classes, such as the languages of the Chomsky hierarchy. Variations such as deterministic, all-final, simple, 1-limited, and stateless 5' → 3' Watson-Crick automata are also detailed. From a practical point of view, the stateless variant is likely the most realistic; however, all models are of theoretical interest.

1.1 Biological Background and Motivation

A DNA molecule consists of two strands. A strand can be viewed as a (directed) sequence of bases. Every DNA is built up from four bases: Adenine (A), Cytosine (C), Guanine (G), and Thymine (T). Watson and Crick discovered the double helix structure of DNA and the fact that the two strands are elementwise complements of each other.
Therefore, either strand uniquely determines the other. The DNA strands are not without direction. The bases are not symmetrical molecules; they have so-called 5' and 3' ends that always bond in an alternating order, while equal ends never bond. Thus, on one end of the strand there is a 5' side, and on the other end a 3' side of a base. For geometrical reasons, two complementary strands can only align with their opposite ends next to each other, i.e., a 5' and a 3' end on either side. Various enzymes can move along DNA strands, scanning (checking) them. These proteins, whose structure enables them to move along such a double strand, are likely to do so in opposite directions if they are on opposite sides of the double strand. In nature, the 5' → 3' direction is preferred; both DNA and RNA polymerase use this direction. The proteins get their meaning in this way (codons at the synthesis of proteins; see, for instance, (Clote & Backofen, 2000)), and mRNA is read in this direction by the ribosome. There are enzymes that can act on both strands at the same time; for instance, DNA Polymerase III works in this way: its arms form a loop in three dimensions, and it works on both strands of the original molecule in the replication fork. In this chapter, we use a theoretical model reading DNA molecules which works as an automaton (from a computational point of view).

1.2 Computer Science Background: Formal Languages and Automata Theory

The theory of formal languages has been widely developed since the 1950s. Rewriting systems and generative grammars are widely used, and their theories are well known. Chomsky introduced his grammar families with the aim of describing natural languages. Although the grammar families are widely used in computer science, especially in programming languages (compilers) and other fields, this aim has yet to be achieved. Automata were introduced to model the human brain. This problem has proved to be much harder than originally thought. Nevertheless, several other phenomena and processes can be modeled by automata. The Turing machine, especially its universal variant, together with von Neumann's theory, provided the basis for electronic computers. Almost all types of automata have the following features: there is an input tape containing the input (because it is a sequence of symbols, it is usually called a word) and there is a finite control (a finite number of states). These machines work on a discrete time scale: in a movement/transition, the machine (may) read the next letter of the input and make some changes as follows. It may change its state and make local modifications in an external storage (tape, pushdown stack, etc.), if any. In the next step, it repeats the process. The transition function is finitely defined, i.e., it can be given by a finite table. If there is no external storage, the machine is called a finite automaton or finite-state machine. There are several kinds of automata. In this chapter, we will use machines that work on a given input and finally make a decision (yes/no).
(In some cases, such as Turing machines, where only the yes case is important, the machine may not stop and may thus work forever on some input strings.) These machines are used to decide whether the input belongs to a given language (the decision yes means that the input is accepted); in this way they define languages as the sets of accepted input strings. Therefore, the connection between automata and formal languages is straightforward. Finite automata, in both deterministic and non-deterministic versions, recognize exactly the regular languages. For context-free languages, the concept of pushdown automata fits. Deterministic pushdown automata accept the deterministic context-free (dcf) languages. In the literature, one-turn pushdown automata are investigated to accept the linear languages. Deterministic one-turn pushdown automata characterize the deterministic linear (dlin) languages.

1.3 DNA-Computing: a Link between Biology and Computer Science

Adleman showed that DNA molecules can be used to solve mathematical/computational problems, i.e., it is not only Nature that can implement computations by DNA molecules, but humans can as well. In his groundbreaking experiment (Adleman, 1994), he solved the Hamiltonian Path Problem for directed graphs by a DNA computation. Let a directed graph be given with designated input and output vertices. A path from the input vertex to the output vertex is Hamiltonian if it contains every vertex exactly once. In general, the Hamiltonian Path Problem consists of deciding whether or not an arbitrarily given graph has a Hamiltonian path. Although various algorithms have been developed and have been successful for some special classes of graphs, all of them have an exponential worst-case complexity for general directed graphs. Thus, in the general case, all known algorithms essentially amount to an exhaustive search. Indeed, the Hamiltonian Path Problem has been shown to be NP-complete.
Unless P = NP, there is no efficient algorithm that solves every instance of the problem on a traditional computer (or on a Turing machine). Adleman solved the Hamiltonian Path Problem for a relatively

small directed graph (Adleman, 1994). However, the solution is, at least in principle, applicable to larger graphs as well. The two most important features of DNA computations are the massive parallelism and the Watson-Crick complementarity. We now recall the method briefly. Each vertex of the graph is associated with a random strand of DNA of length 20 (admitting some special, easily verifiable properties). The edges of the graph are also associated with 20-mer single-stranded DNAs, obtainable as Watson-Crick complements of the second and first halves of the DNA strands encoding the starting and ending points of the edges, respectively. The main point of Adleman's experiment was that a large number of DNA strands of vertices and edges were mixed together in a single ligation reaction. This reaction caused the formation of DNA molecules that could be viewed as encodings of random paths through the graph. By measuring the length of the DNA strands, those paths whose lengths differ from that of a Hamiltonian path are rejected. Finally, for each vertex, by filtering out those paths that do not contain the actual vertex, only strands coding Hamiltonian paths remain. The method of checking whether there are any solutions at all is called detecting. Essentially, the algorithm carries out an exhaustive search. The massive parallelism of the DNA strands takes care of the undesirable nondeterminism. The complementarity is applied to ensure that the constructed sequences of edges are indeed paths in the graph. Based on Adleman's result, Lipton described a solution to another NP-complete problem, namely SAT (SATisfiability of Boolean logical formulae). In his paper (Lipton, 1995), he used graphs in which the maximal paths code truth-value assignments. Several biochemical processes have been abstracted to formal operations, and these have been used to define various new models of computation.
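Adleman's vertex-and-edge encoding can be sketched in a few lines of Python. This is an illustration only: the 4-letter vertex strands and the `edge_strand` helper below are our own stand-ins for the 20-mers of the actual experiment.

```python
# Sketch of Adleman's encoding (illustrative 4-mers instead of the real 20-mers).
# An edge u -> v is encoded as the Watson-Crick complement of
# (second half of u) + (first half of v), so the edge strand can anneal to the
# two vertex strands and splice them together into a path.

COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def wc_complement(strand):
    """Letter-wise Watson-Crick complement (orientation is ignored in this sketch)."""
    return "".join(COMPLEMENT[base] for base in strand)

def edge_strand(u_strand, v_strand):
    """Single-stranded DNA encoding the edge u -> v."""
    half = len(u_strand) // 2
    return wc_complement(u_strand[half:] + v_strand[:half])

# Hypothetical vertex strands:
vertices = {"v1": "ACGT", "v2": "TTGC"}
# edge_strand("ACGT", "TTGC") complements "GT" + "TT", giving "CAAA".
```

In a ligation reaction, such edge strands hybridize with the halves of the adjacent vertex strands, so the double-stranded products correspond exactly to paths in the graph.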
The main motivation for this was the hope of actually building biocomputers based on the theoretical models that have been developed. Thus far, however, it has been mainly the theoretical side that has profited from this interchange, by way of the numerous new formal mechanisms and models inspired by processes observed in nature or in laboratories. Several models have been developed for how DNA molecules can (at least theoretically) be used for computing purposes. The two most important exploitable features, as previously mentioned, are the Watson-Crick complementarity and the massive parallelism. Most of these models (such as sticker systems, insertion-deletion systems, H-systems, etc.) gain power from the massive parallelism. There are also models using special operations motivated by DNA operations observed in living cells (such as the formal operations of gene assembly) (Păun et al., 1998; Dassow et al., 2002; Ehrenfeucht et al., 2004). One main class of these new devices is formed by Watson-Crick automata, WK-automata for short (the abbreviation comes from the names Watson-CricK); first introduced in (Freund et al., 1997), this model uses the complementarity relation. WK-automata are (finite) automata working on a Watson-Crick tape, i.e., having a double-stranded DNA molecule as input (instead of a string on a tape), with a reading head for each strand. Molecules that actually move along DNA strands in the manner of reading heads already exist in reality, namely, enzymes acting on these strands. However, because DNA strands are directed, enzymes that move along a double strand do so in opposite directions if they are on opposite sides of the double strand. This suggests that the two heads of a Watson-Crick automaton would more probably do exactly that. This is what inspired the introduction of 5' → 3' Watson-Crick automata in (Nagy, 2007a; Nagy, 2007b). In these devices, the two heads start at opposite

ends of the strand. This means they actually read in the same biochemical direction, but in opposite physical directions. The idea of letting the two heads run in opposite directions was already mentioned in the book on DNA computing (Păun et al., 1998) and also in a later article (Păun, 2000), where these devices were called reverse Watson-Crick automata. However, their power was not investigated in more detail. After this general overview, we give some details about this chapter. We are dealing with WK-automata. The problem we are solving is the following: what can the sets of molecules be that are accepted by the various classes of our automata model? Our work is also motivated from a computational point of view. Conventional automata with several heads have been studied extensively before; one of the earliest works is by Ibarra (Ibarra, 1973), and research on the topic is still ongoing (Frisco & Ibarra, 2009). The class of two-head automata reading the input by moving their heads in opposite mathematical/physical directions is an interesting theoretical automata model that fits into the hierarchy of other automata models. Moreover, in this chapter, we work with a model that finishes the reading process of the input by sensing the meeting of the heads. Since the two strands of a DNA molecule uniquely determine each other, one may naturally consider that the process of our automata can be finished at this point of decision: all the information has been read from the input. In this chapter, we are interested in the language classes accepted by various models of our sensing 5' → 3' WK automata. A hierarchy of these classes and their relation to other well-known language classes will be shown. The structure of this chapter is as follows: in the next section, we give formal definitions and formal descriptions of the model.
In Section 3, we detail the results about the languages of our automata, including comparisons to the languages of other automata. Finally, concluding remarks and the bibliography close the chapter.

2 Basic Definitions and Notions

In this section, we start with a little chemistry and biology by describing DNA molecules (based on the textbooks (Calladine et al., 2004; Ádám et al., 2001; Clote & Backofen, 2000)), after which we recall some definitions from formal language theory and define a family of our automata. Five chemical elements can be found in DNA molecules: Hydrogen (1 connection), Oxygen (2 connections), Nitrogen (3 connections), Carbon (4 connections), and Phosphorus (5 connections). They can be connected in various ways by covalent bonds (among the strongest chemical bonds). There are four possible bases: Adenine, Cytosine, Guanine, and Thymine. (We will refer to them by their initials.) The bases are connected to a sugar, which is connected to a Phosphate group. Each nucleotide has three components: a base (which can be of four types in a DNA molecule), a sugar (which is the same in every nucleotide), and a Phosphate (also identical in every nucleotide). Therefore, nucleotides are usually referred to by their base. The sugar has five Carbon atoms, which are numbered 1' through 5'. The Phosphate is connected to the 5' Carbon. Two nucleotides can be bonded through the Phosphate group (a water molecule H2O appears; this process is catalyzed by the Ligase enzyme in a DNA molecule): the connection goes from the 3' Carbon to the Phosphate of the next nucleotide. Thus, a DNA strand is a directed sequence of nucleotides and can be interpreted as a string over the alphabet {A, C, G, T}. The chemical structure of the bases allows them to connect by Hydrogen bonds (the strongest type of secondary chemical bond, although it is about 10 times weaker than the covalent bonds building each DNA strand), as we detail below.
According to the famous Nobel prize-winning observation of Watson and Crick, DNA has a double helix structure, i.e., two strands of DNA are connected by Hydrogen bonds. Actually, there are only two ways for the bases to form Hydrogen bonds: A and T can be bonded by two Hydrogen bonds, while G and C can be bonded by three Hydrogen bonds. The pairs (A, T) and (C, G) are called Watson-Crick complements of each other, because a nucleotide can be bonded only with its complement. This fact means that each nucleotide must have its uniquely defined complement on the other strand. Moreover, for geometric reasons, the two strands of a DNA must have opposite chemical directions. From a computational point of view, it is important to note that, theoretically, every DNA strand is possible without any restriction. We do not care about the three-dimensional shapes of DNA molecules (neither the helix spiral nor any secondary structure) because we are using abstract models. From our point of view, a DNA strand is a string built up from nucleotides as letters of an alphabet. We now define these structures formally. Let V be an alphabet, i.e., a finite non-empty set of symbols (usually called letters), and let ρ ⊆ V × V be its complementarity relation. For instance, V = {A, C, G, T} is usually used in DNA computing with the Watson-Crick complementarity relation {(T, A), (A, T), (C, G), (G, C)}. The strings built up from complementary pairs of letters are double strands (of DNA). The notation [w1/w2] will be used for double strands, where w1 is the upper strand read in the 5' → 3' direction and w2 is the lower strand read in the 3' → 5' direction. The set of all possible double-stranded DNA molecules is denoted by WKρ(V). The sets of these strings are the (formal) languages over the DNA alphabet V. In this chapter, the sign λ refers to the empty word (the empty string without any letters). Now we recall some language families related to the Chomsky hierarchy; for full definitions, refer to (Hopcroft & Ullman, 1979; Rozenberg & Salomaa, 1997). A grammar is a construct G = (N, V, S, H), where N and V are the non-terminal and terminal alphabets, respectively; S ∈ N is the initial letter; and H is a finite set of derivation rules. A rule is a pair written in the form v → w, with v ∈ (N ∪ V)* N (N ∪ V)* and w ∈ (N ∪ V)*. Let G be a grammar and v, w ∈ (N ∪ V)*.
Then v ⇒ w is a direct derivation if and only if there exist v1, v2, v', w' ∈ (N ∪ V)* such that v = v1 v' v2, w = v1 w' v2, and v' → w' ∈ H. The transitive and reflexive closure of ⇒ is the derivation relation ⇒*: v ⇒* u holds if and only if either v = u or there is a finite sequence of words over (N ∪ V), v = v0, v1, ..., vm = u, in which vi ⇒ vi+1 is a direct derivation for each 0 ≤ i < m. The language generated by a grammar G is the set of terminal words that can be derived from the initial letter: L(G) = {w | S ⇒* w, w ∈ V*}. Two grammars are equivalent if they generate the same language modulo λ. (From now on, we do not care whether λ ∈ L or not. Note that in DNA computing, the empty word is usually not considered a possible word of any language because it does not refer to a molecule.) Depending on the possible structures of the derivation rules, the following important classes are defined; see, for example, (Amar & Putzolu, 1964; Amar & Putzolu, 1965; Hopcroft & Ullman, 1979).

Type 0, without any restriction: all generative grammars are type 0 by definition.

Type 1, or context-sensitive (CS) grammars: every rule has the form uAv → uwv with A ∈ N and u, v, w ∈ (N ∪ V)*, w ≠ λ.

Type 2, or context-free (CF) grammars: every rule has the form A → v with A ∈ N and v ∈ (N ∪ V)*.

Linear (Lin) grammars: each rule has one of the forms A → v, A → vBw; where A, B ∈ N and v, w ∈ V*.

k-rated linear (k-Lin) grammars: a linear grammar with a fixed rational number k such that for each rule of the form A → vBw, the equality k = |w|/|v| holds (|v| denotes the length of v). Specially, with k = 1 we obtain the even-linear (1-Lin) grammars: each rule has one of the forms A → v, A → w1Bw2; where A, B ∈ N, v, w1, w2 ∈ V*, and the length of w1 equals the length of w2 in each rule. Specially, with k = 0 we obtain:

Type 3, or regular (Reg) grammars: each derivation rule has one of the forms A → w, A → wB; where A, B ∈ N and w ∈ V*.

The regular/linear/context-free/context-sensitive, etc. language family contains all languages that can be generated by regular/linear/context-free/context-sensitive, etc. grammars. We use the term fix-rated linear languages (fix-Lin) for the languages that can be generated by a k-rated linear grammar for some rational value of k (k ∈ Q). The term recursively enumerable language is used for all languages that can be generated by (type 0) generative grammars (the abbreviation RE is used for this family). Chomsky defined the type 0, type 1, type 2, linear, and type 3 language classes, while Amar and Putzolu defined the even-linear and k-rated linear classes in (Amar & Putzolu, 1964) and (Amar & Putzolu, 1965), respectively. These language classes form the (extended) Chomsky hierarchy: Reg ⊊ k-Lin ⊊ fix-Lin ⊊ Lin ⊊ CF ⊊ CS ⊊ RE. From a computational point of view, smaller classes have nice properties, e.g., there are fast algorithms for them. However, their expressive power is relatively small, and it is possible that one cannot describe the desired phenomenon using them. Larger families have no efficient algorithms (in some cases, some of the problems are simply undecidable, i.e., there is no algorithm at all that can, for instance, compute the complement of a general RE language). Usually, these language classes can be generated by grammars having more restrictions on their rules than in the definitions. Now, we present some so-called normal forms.

Lemma 1 Every linear grammar has an equivalent grammar in which all rules have the forms A → aB, A → Ba, A → a. Every k-rated linear grammar has an equivalent one in which, for every rule of the form A → vBw, the lengths |v| = n and |w| = m are relatively prime, and for all rules of the form A → u, |u| < n + m holds. Specifically, every even-linear grammar has an equivalent grammar in which all rules have the forms A → aBb, A → a, A → λ.
Every regular language can be generated by a grammar having only rules of the types A → aB, A → a (A, B ∈ N, a ∈ V).

Proof. By introducing new non-terminals, each original rule that is not in the desired form can be replaced by a sequence of rules in the desired forms.

As we already mentioned, there are various types of automata that accept some of the basic formal language families. Now we recall finite state machines, in both deterministic and non-deterministic versions (Rozenberg & Salomaa, 1997). A 5-tuple A = (V, Q, s, F, δ) is a finite state machine or finite automaton, with the (input) alphabet V; Q is the finite (non-empty) set of states; s ∈ Q is the initial state; and F ⊆ Q is the set of final (or accepting) states. The function δ is the transition function. If δ : Q × (V ∪ {λ}) → 2^Q, then the device is a non-deterministic finite automaton. If δ : Q × V → Q, then the machine is called a deterministic finite automaton. A word w is accepted by a finite automaton if there is a run starting with s, ending in a state in F, such that the symbols of the transitions of the path yield w. Now we define formally the finite automata working on DNA molecules. A Watson-Crick finite automaton (WK automaton) works on a Watson-Crick tape, which is a double-stranded sequence (or molecule) in which the lengths of the strands are equal and the elements of the strands are pairwise complements of each other: [a1/b1][a2/b2]...[an/bn] = [a1 a2 ... an / b1 b2 ... bn] with ai, bi ∈ V and (ai, bi) ∈ ρ (i = 1, ..., n).
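The finite automaton A = (V, Q, s, F, δ) just recalled is easy to simulate directly. The following minimal Python sketch runs a non-deterministic finite automaton; the example machine, which accepts words ending in "ab", is our own illustration and not taken from the text.

```python
# Minimal non-deterministic finite automaton A = (V, Q, s, F, delta).
# delta maps (state, letter) to a set of successor states.

def accepts(delta, start, finals, word):
    """Run the NFA on `word`; accept if some run ends in a final state."""
    states = {start}
    for letter in word:
        states = {q2 for q in states for q2 in delta.get((q, letter), set())}
    return bool(states & finals)

# Example NFA over V = {a, b} accepting exactly the words that end with "ab".
delta = {
    ("q0", "a"): {"q0", "q1"},
    ("q0", "b"): {"q0"},
    ("q1", "b"): {"q2"},
}
# accepts(delta, "q0", {"q2"}, "aab") holds; "aba" is rejected.
```

The subset-of-states loop is exactly the textbook acceptance condition: a word is accepted when at least one run over it reaches a final state.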

Formally, a WK automaton is M = (V, ρ, Q, s, F, δ), where ρ ⊆ V × V is a symmetric relation (usually the natural Watson-Crick complementarity relation), and V, Q, s, and F are the same as for finite automata. The transition mapping is usually defined as δ : Q × (V*/V*) → 2^Q, where δ(q, (w1/w2)) is non-empty only for a finite number of triplets (q, w1, w2). The elementary difference between finite automata and WK automata, besides the doubled tape, is the number of heads. WK automata scan each of the two strands separately in a correlated manner. With the definition of WK automata, we have so far left open the interpretation of δ and the condition of acceptance of a word. This interpretation provides the difference between the traditional WK automata (Freund et al., 1997; Păun et al., 1998; Petre, 2003; Kuske & Weigel, 2004) and our 5' → 3' model. In the traditional model, the two heads start from the same end of the molecule and go in the same physical direction, i.e., one head reads a strand from its 5' end towards the 3' end, while the other reads in the opposite chemical/biological direction, that is, from the 3' end of the other strand towards its 5' end. This is a natural choice for computer scientists, having the input on a tape, or having a machine that produces the input letter by letter on a discrete time scale. However, it is not at all motivated by any biological processes, or (bio)chemical processes in a living cell or in a laboratory. Therefore, in this study, we use the biologically natural direction for both heads in our definition and, in turn, get different classes of accepted languages. The sensing will also have a special role in this case. In a 5' → 3' WK automaton, both heads start from the 5' end of the appropriate strand. Physically/mathematically and from a computing point of view, they read the double-stranded sequence in opposite directions, while biochemically, they go in the same direction (see Figure 1).
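Membership in WKρ(V) is a purely local condition (equal lengths, position-wise complementary letters) and can be checked mechanically. A minimal sketch, assuming the standard Watson-Crick relation as ρ:

```python
# A double strand [w1/w2] belongs to WK_rho(V) iff |w1| == |w2| and every
# position-wise pair of letters lies in the complementarity relation rho.

RHO = {("A", "T"), ("T", "A"), ("C", "G"), ("G", "C")}  # Watson-Crick relation

def in_wk(w1, w2, rho=RHO):
    """Is [w1/w2] a valid double strand under rho?"""
    return len(w1) == len(w2) and all((a, b) in rho for a, b in zip(w1, w2))

# [ACGT/TGCA] is a valid molecule; [AC/TT] is not (C does not pair with T).
```

Since ρ is only required to be symmetric, the same check works for any abstract alphabet, not just {A, C, G, T}.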
A 5' → 3' WK automaton is sensing if the heads sense that they are meeting (i.e., they are close enough to meet in the next step, or there is a possibility to read strings at overlapping positions). In the basic version of sensing 5' → 3' WK automata, the processing of the input sequence ends when, for each pair of the sequence, one of the letters has been read. Due to the complementarity relation, the information stored in the DNA is then fully processed, and the automaton makes a decision on acceptance. (We note here that sensing does not have this kind of natural meaning for traditional WK automata.) In the full reading version, both heads read the whole strand from the 5' end to the 3' end. (That is the normal case for traditional WK automata, too.) Now, we specialize the function δ and the condition of acceptance to get sensing 5' → 3' WK automata in a formal way. In the usual WK automata, the state transition is a mapping of the form Q × (V*/V*) → 2^Q. In a transition q' ∈ δ(q, (w1/w2)), we call rl = |w1| and rr = |w2| the left and right radius of the transition (they are the lengths of the strings read in this step by the head going from left to right and by the head going from right to left, respectively). The value r = rl + rr is the radius of the transition. Here, we note that WK automata may read more than one symbol with any of their heads in a transition. This is biologically motivated, because some enzymes can recognize longer subsequences. In contrast, in traditional finite state machines, the head can read only one symbol in a step. However, since δ(q, (w1/w2)) is non-empty only for finitely many triplets (q, w1, w2), there is a transition (maybe more than one) with maximal radius for a given WK automaton. Let δ be extended by the sensing condition in the following way: let r be the maximum of the values rl + rr over the transitions of the original WK automaton. Then, let δ' : Q × (V*/V*) × D → 2^Q, where D is the sensing distance set {−∞, 0, 1, ..., r, +∞}.
This set gives the distance of the two heads when it is between 0 and r, and gives +∞ when the heads are farther apart than r. (In the full reading version, it is −∞ when the heads are past their meeting point.) Trivially, this automaton is finite, and D can be used only to control the sensing, i.e., the appropriate meeting of the heads. To describe the work of the automata, we use the concept of configuration. A configuration (w1/w2)(q, s)(w'1/w'2) consists of the state q, the actual

Figure 1: A possible figure of a 5' → 3' Watson-Crick automaton. The two strands of the DNA molecule are read by an enzyme simultaneously. The first head has already read CTGTAGC and is reading G, while the second head has read TGAGC and is reading T.

sensing distance s, and the input [w1 w'1 / w2 w'2] ∈ WKρ(V), in such a way that the first head (on the upper strand) has already processed the part w1, while the second head (on the lower strand) has already processed w'2. A step of the automaton, according to the state transition function, can be of the following two types.

Normal steps: (w1/w2 y)(q, +∞)(x w'1/w'2) ⇒ (w1 x/w2)(q', s)(w'1/y w'2) for w1, w2, w'1, w'2, x, y ∈ V* with |w2 y| − |w1| > r and q, q' ∈ Q, if and only if [w1 x w'1 / w2 y w'2] ∈ WKρ(V) and q' ∈ δ(q, (x/y), +∞), where the sensing distance s = |w2| − |w1 x| if |w2| − |w1 x| ≤ r, and s = +∞ in other cases. In the full reading version, the steps after the meeting of the heads can analogously be described with −∞ instead of +∞; moreover, in these steps, the value −∞ of the sensing distance remains.

We now describe the sensing steps: (w1/w2 y)(q, s)(x w'1/w'2) ⇒ (w1 x/w2)(q', s')(w'1/y w'2), with the value s' = s − |x| − |y| if s − |x| − |y| ≥ 0, and s' = −∞ in other cases.

This step can be done if and only if [w1 x w'1 / w2 y w'2] ∈ WKρ(V) and q' ∈ δ(q, (x/y), s). The reflexive and transitive closure of the step relation ⇒ is denoted by ⇒*. The accepted language can be defined by the final accepting configurations that can be reached from the initial one. A double strand [w1/w2] is accepted by a basic-variation sensing 5' → 3' WK automaton M if and only if (λ/w2)(q0, s0)(w1/λ) ⇒* (w'1/w'2)(qf, 0)(w''1/w''2) for some qf ∈ F, where [w1/w2] = [w'1/w'2][w''1/w''2], with the proper value of s0 (it is +∞ if |w1| > r, else it is |w1|), because the full input is processed by the time the heads meet. A double strand [w1/w2] is accepted by a full-reading sensing 5' → 3' WK automaton M if and only if (λ/w2)(q0, s0)(w1/λ) ⇒* (w1/λ)(qf, −∞)(λ/w2) for some qf ∈ F, with the proper value of s0. The restricted versions F, N, S, 1, FS, F1, NS, and N1 can be defined by the following restrictions:

N: stateless, i.e., with only one state: Q = F = {q0};

F: all-final, i.e., with only final states: Q = F;

S: simple (at most one head moves in a step): δ : Q × ((V*/{λ}) ∪ ({λ}/V*)) × D → 2^Q;

1: 1-limited (exactly one letter is read in a step): δ : Q × ((V/{λ}) ∪ ({λ}/V)) × D → 2^Q.

Further variations, such as NS, FS, N1, and F1 WK automata, can be identified in a straightforward way by combining these constraints. In the 1-limited version, D = {0, 1, +∞} due to the restricted radius of the allowed transitions (and D = {−∞, 0, 1, +∞} in the full reading version). We can also define deterministic sensing 5' → 3' WK automata. An automaton (of any of the previously described versions) is deterministic if there is at most one possible step to continue its run from any configuration that can occur in any run.

3 Language Families Accepted by Sensing 5' → 3' WK Automata

In the previous section, we defined various models of 5' → 3' WK automata. In this section, we analyze these models and compare their accepting capacities.
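The opposite-direction reading and the meeting condition can be made concrete with a small simulator. The sketch below abstracts the molecule to its upper strand (the second head effectively scans it right to left through the complementarity relation) and implements a 1-limited machine of our own devising that accepts upper strands of the form a^n b^n; it is an illustration of the model, not a construction from the text.

```python
# Sketch of a sensing 5'->3' WK automaton, abstracted to the upper strand:
# head 1 scans left to right, head 2 right to left (it really reads the
# complementary lower strand in its own 5'->3' direction), and a run ends
# when the heads meet (sensing distance 0).
# Illustrative 1-limited machine: accepts upper strands of the form a^n b^n.
# Transitions: (state, head, letter) -> next state.

DELTA = {
    ("q0", 1, "a"): "q1",  # head 1 consumes an 'a' on the left
    ("q1", 2, "b"): "q0",  # head 2 consumes a 'b' on the right
}
FINALS = {"q0"}

def wk_accepts(word, delta=DELTA, finals=FINALS, start="q0"):
    """Search over all runs; accept if the heads meet in a final state."""
    stack, seen = [(start, 0, len(word))], set()
    while stack:
        config = stack.pop()
        if config in seen:
            continue
        seen.add(config)
        q, i, j = config
        if i == j:               # heads have met: the whole input is processed
            if q in finals:
                return True
            continue
        q2 = delta.get((q, 1, word[i]))
        if q2 is not None:
            stack.append((q2, i + 1, j))
        q2 = delta.get((q, 2, word[j - 1]))
        if q2 is not None:
            stack.append((q2, i, j - 1))
    return False

# wk_accepts("aabb") holds; wk_accepts("abab") does not.
```

The pair (i, j) plays the role of the two head positions, and j − i is the (unbounded) head distance of which the automaton itself may only observe the finite abstraction in D.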
First, we consider the basic variation and its restricted forms. These automata finish processing the input word when the heads meet. We now show that some of the restrictions do not restrict the accepting power of these machines.

Theorem 1 The language classes accepted by the F, S, 1, FS, and F1 sensing 5' → 3' WK finite automata are equal to the class of languages that can be accepted by sensing 5' → 3' WK finite automata (without restrictions).

Ideas of proofs. By introducing new intermediate states, every step of an arbitrary automaton can be divided into two steps in which only one of the heads moves (S). By introducing new intermediate states, these steps can be simulated by a sequence of transitions in which a head steps by only one letter (1). (λ-transitions can be eliminated in the usual way.) All states can be final because the finishing transitions are sensing transitions, and they can be

allowed only in accepting runs (F). By combining the previous arguments and constructions, equivalent FS and F1 versions can also be constructed for any original sensing 5' → 3' WK finite automaton.

The following connection to the language classes of the Chomsky hierarchy can also be proven in a constructive manner (Nagy, 2007a).

Theorem 2 The class of languages that can be accepted by sensing 5' → 3' WK finite automata and the class of linear languages are the same.

Now, we present proper hierarchy results as a sequence of strict inclusions.

Proposition 1 The language class accepted by N1 sensing 5' → 3' WK automata is strictly included in the class of languages that can be accepted by NS sensing 5' → 3' WK automata.

Proof. These machines have exactly one state. The difference between N1 and NS machines is that the latter may read several letters in a transition by the reading head, while N1 machines must read the input letter by letter. The inclusion is clear by the definitions. Now, we present a language that proves the strictness of the inclusion. The language {[bb/aa]^n | n ∈ N} can be accepted by an NS machine reading two letters in every step, but it is impossible to accept it with any N1 machine. In fact, N1 machines cannot count. (There is only one state, and in every transition only one of the heads reads exactly one letter, i.e., nucleotide.)

Proposition 2 The language class accepted by NS sensing 5' → 3' WK automata is strictly included in the class of languages that can be accepted by N sensing 5' → 3' WK automata.

Proof. These machines still have only one state. The difference between NS and N machines is that the former use their heads separately, while the latter may use both heads in a transition. This inclusion is also clear by the definitions. There is an N sensing 5' → 3' WK automaton that accepts the language of palindromes, i.e., the language formed by words that read the same from both directions.
This machine reads complementary letters by the two heads in every transition, except possibly the very last (sensing) step (reading the letter exactly in the middle of the word). On the other hand, this language cannot be accepted by any NS machine, because such a machine cannot even remember which head was used in the last transition.

Proposition 3 The language class accepted by N sensing 5'→3' WK automata is strictly included in the class of languages that can be accepted by arbitrary sensing 5'→3' WK automata.

Proof. This inclusion is also trivial by the definition. On the other side, there are linear languages that cannot be accepted by machines having only one state. We now present a language that proves this strict inclusion. The regular language L_abcd = {[a^n b^m c^k d^i / d^n c^m b^k a^i] | n, m, k, i ∈ N} (upper and lower strand, with the complementarity relation pairing a with d and b with c) cannot be accepted by an N machine.

Proposition 4 The language class accepted by NS sensing 5'→3' WK automata is strictly included in the class Reg of regular languages.

Proof. Observe that NS automata can only accept languages of the form (x1 + x2 + ... + xi)*(z1 + ... + zk)(y1 + y2 + ... + yj)*, where the strings xl are the strings that the first head can read in a normal (non-sensing) transition, the strings yl correspond to the complements of the strings that the second head can read in a normal step, and the strings zl correspond to finishing steps using sensing transitions. It remains to show that the inclusion is strict: our previously used regular language L_abcd cannot be accepted by these machines.
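The two-head behaviour of the stateless N machine on palindromes can be sketched with two pointers moving toward each other; identity is used here as the complementarity relation to keep the sketch small, which is an assumption of this illustration.

```python
# Sketch of a stateless (N) sensing 5'->3' WK automaton for palindromes:
# both heads read physically from the 5' end of their own strand, which on
# the input word means one pointer moving left-to-right and the other
# right-to-left; every transition reads one letter with each head, and the
# run is accepted when the heads meet.

def accepts_palindrome(word: str) -> bool:
    i, j = 0, len(word) - 1
    while i < j:
        # the N machine can move both heads only on matching letters
        if word[i] != word[j]:
            return False
        i += 1
        j -= 1
    # heads meet (even length), or one middle letter remains and is
    # consumed by the final sensing step (odd length)
    return True

print(accepts_palindrome("abba"))   # True
print(accepts_palindrome("abab"))   # False
```

Note that the machine needs no states at all: the acceptance condition is carried entirely by the meeting of the heads.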

Proposition 5 The class of regular languages is strictly included in the class of languages that can be accepted by arbitrary sensing 5'→3' WK automata.

Proof. The inclusion can easily be seen: traditional finite automata can be simulated by WK machines in which the lower head reads λ in each step. To prove the strictness of the inclusion, we recall that the language of palindromes is not regular.

Furthermore, let us consider the deterministic versions of sensing 5'→3' WK automata. The language class defined by deterministic arbitrary sensing 5'→3' WK automata is denoted by 2detLin (see (Nagy, 2009) for detailed properties of this family). First, some connections to the extended Chomsky hierarchy are shown.

Theorem 3 All fix-rated linear languages can be accepted by deterministic sensing 5'→3' WK automata.

Idea of proof. It is clear that any k-rated linear language (for any value of k) can be accepted by a sensing 5'→3' WK automaton. Moreover, the distance of the heads from the endpoints of the molecule is determined by k and the number of steps already made (until the heads are close enough to each other to finish the process if the input is accepted). Based on these facts, a construction similar to the well-known subset construction used for the determinization of non-deterministic finite automata works (see, for instance, (Rozenberg & Salomaa, 1997)).

Proposition 6 There are 2detLin languages that are not fix-rated linear.

Proof. An example of such a language is generated by the linear grammar ({S}, {a,b}, S, {S → aaSa, S → bSb, S → aaaSbb, S → bbbbSa, S → λ}). This language can be accepted by a deterministic N machine, but it is not k-rated linear for any k ∈ Q.

Since all these languages are trivially linear, we obtain a further extension of the hierarchy: Reg ⊊ k-lin ⊊ fix-lin ⊊ 2detLin ⊊ Lin ⊊ CF ⊊ CS ⊊ RE.

Furthermore, in this section, we present some hierarchy results among the concerned variations of deterministic WK automata.
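The language of the grammar in Proposition 6 can be explored mechanically by breadth-first derivation; the helper name `derive_words` and the length bound are illustrative assumptions of this sketch.

```python
# Sketch: enumerate short words of the linear language of Proposition 6 by
# breadth-first derivation from the grammar
#   S -> aaSa | bSb | aaaSbb | bbbbSa | lambda .

from collections import deque

# each rule S -> u S v is stored as the pair (u, v)
RULES = [("aa", "a"), ("b", "b"), ("aaa", "bb"), ("bbbb", "a")]

def derive_words(max_len: int) -> set:
    words, queue = set(), deque([("", "")])   # (left context, right context)
    while queue:
        left, right = queue.popleft()
        if len(left) + len(right) > max_len:
            continue                          # prune: contexts only grow
        words.add(left + right)               # apply S -> lambda
        for u, v in RULES:                    # apply S -> u S v
            queue.append((left + u, v + right))
    return words

words = derive_words(6)
print(sorted(w for w in words if len(w) <= 4))
```

Every rule strictly lengthens the sentential form, so the enumeration terminates once the length bound is exceeded.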
Proposition 7 Deterministic NS and N1 automata are more restricted than their non-deterministic versions.

Proof. Observe that in deterministic NS and N1 automata, only one of the heads can move in a step with sensing distance +; the other head must wait for the sensing. In this way, only special regular languages of the form (x1 + x2 + ... + xi)*(z1 + ... + zk) can be accepted, where the xp and zr are words over V for NS machines and single letters of V for N1 machines (p = 1,...,i; r = 1,...,k). Based on analogous results for the non-deterministic (i.e., not necessarily deterministic) versions, we have the following results (the same separating languages work here as in the proofs of Propositions 1 and 2).

Proposition 8 Deterministic N machines are more powerful than deterministic NS machines. Deterministic NS machines are more powerful than deterministic N1 machines.

Proposition 9 Deterministic arbitrary machines are more powerful than deterministic N machines.

Proof. Let us consider the following example. The language generated by the grammar ({S,A}, {a,b}, S, {S → aSa, S → bAb, A → aAb, A → bAa, A → λ}) cannot be accepted by any N machine, but it is accepted by a deterministic WK automaton because it is 1-linear (even-linear).

Some of the restrictions do not restrict the accepting power of the deterministic sensing 5'→3' WK automata:

Theorem 4 The language classes accepted by the deterministic F, S, 1, FS, and F1 sensing 5'→3' WK finite automata are equal to the class of languages that can be accepted by deterministic sensing 5'→3' WK finite automata (without further restrictions).

Idea of proof. The proof is similar to that of Theorem 1, using the fact that in the deterministic version there is at most one applicable transition at any time.

Finally, we show that non-deterministic machines are more powerful than deterministic machines even if they have only one state.

Proposition 10 Non-deterministic N machines are more powerful than deterministic N machines.

Proof. Let us consider the language generated by the grammar ({S}, {a,b}, S, {S → aaSb, S → aSb, S → λ}). It can be accepted by a stateless machine, but this cannot be done with a deterministic one.

3.1 Full-Reading Variants

In this subsection, we analyze the full reading variation of the automata. In this variation, each of the heads reads the whole input, but in different directions. We will also show that these machines are more powerful than the ones finishing the process at the meeting point of the heads.

Theorem 5 Let us consider any of the variations of sensing 5'→3' automata. The language class accepted by the full reading mode of a variation includes the language class that is accepted by runs finishing when the heads meet.

Proof. Until the heads meet, the automaton works in the same way as the original 5'→3' WK machine accepting the language at the meeting point. If the word is accepted, then, and only then, both heads step forward to finish the whole word (by sensing parameters 0 and −).
As a special consequence, using the general version, we have the following:

Corollary 1 Every linear language is accepted by a full reading sensing 5'→3' WK automaton.

Moreover, the full reading automata can accept more languages. They can also accept some non-context-free languages. The language of multiple agreements {[a^n b^n c^n / a'^n b'^n c'^n] | n ∈ N}, where V = {a, b, c, a', b', c'} with the complementarity relation ρ = {(a,a'), (a',a), (b,b'), (b',b), (c,c'), (c',c)}, is accepted by the following deterministic automaton: (V, ρ, {s, q1, q2, q3, f}, s, {s, f}, δ) with transition function δ:
δ(s, (λ/c'), +) = {q1}, δ(q1, (λ/c'), +) = {q1}, δ(q1, (a/b'), +) = {q2}, δ(q1, (a/b'), 2) = {q2}, δ(q2, (a/b'), +) = {q2}, δ(q2, (a/b'), 2) = {q2}, δ(q2, (b/λ), 0) = {q3}, δ(q3, (b/λ), −) = {q3}, δ(q3, (c/a'), −) = {f}, δ(f, (c/a'), −) = {f}.
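A run of such a full-reading automaton on the multiple-agreement language can be simulated directly; the concrete encoding of the sensing parameter ('+', 2, 1, 0, '-') and the head-position bookkeeping below are assumptions of this sketch.

```python
# Sketch: simulating a deterministic full-reading automaton for the
# multiple-agreement language {a^n b^n c^n}.  The upper head reads the
# input left to right; the second head reads the lower strand 5'->3',
# i.e. the primed complements of the input right to left.

def param(dist, radius=2):
    """Sensing parameter: '+' while far apart, the distance when close,
    '-' after the heads have crossed (full-reading phase)."""
    if dist > radius:
        return '+'
    return dist if dist >= 0 else '-'

# (state, upper letter or '', lower letter or '', sensing parameter) -> state
DELTA = {
    ('s',  '',  "c'", '+'): 'q1', ('q1', '',  "c'", '+'): 'q1',
    ('q1', 'a', "b'", '+'): 'q2', ('q1', 'a', "b'", 2):  'q2',
    ('q2', 'a', "b'", '+'): 'q2', ('q2', 'a', "b'", 2):  'q2',
    ('q2', 'b', '',   0):   'q3', ('q3', 'b', '',   '-'): 'q3',
    ('q3', 'c', "a'", '-'): 'f',  ('f',  'c', "a'", '-'): 'f',
}
FINALS = {'s', 'f'}

def accepts(word: str) -> bool:
    n, i, j, state = len(word), 0, 0, 's'   # i: upper head, j: lower head
    while i < n or j < n:
        p = param(n - i - j)
        up = word[i] if i < n else None
        low = word[n - 1 - j] + "'" if j < n else None   # primed complement
        for (q, a, b, k), nxt in DELTA.items():
            if q == state and k == p and (a == '' or a == up) \
                           and (b == '' or b == low):
                state = nxt
                i += 1 if a else 0          # each step moves >= 1 head
                j += 1 if b else 0
                break
        else:
            return False                    # no applicable transition
    return state in FINALS

print([w for w in ("", "abc", "aabbcc", "aabbc", "abbc") if accepts(w)])
# -> ['', 'abc', 'aabbcc']
```

The run only accepts when both heads have traversed their entire strand, which is exactly the full-reading acceptance condition.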

Another important language of the mildly context-sensitive language classes, the language of cross dependencies {[a^n b^m c^n d^m / a'^n b'^m c'^n d'^m] | n, m ∈ N}, can be accepted by these machines (V = {a, b, c, d, a', b', c', d'} with the complementarity relation pairing the letters with their primed versions). These two non-context-free language examples are linguistically interesting and important.

We also present some results concerning various types of full reading 5'→3' WK machines.

Theorem 6 The language classes accepted by S full reading sensing 5'→3' WK automata and by 1 full reading sensing 5'→3' WK automata are the same as the class accepted by arbitrary full reading sensing 5'→3' WK automata.

Proof. The proof goes in the same way as for Theorem 1.

There are some very restricted versions that can accept only specific regular languages:

Proposition 11 Deterministic N1 full reading sensing 5'→3' WK automata accept languages of the form V1* V2 and V2 V1* with V1, V2 ⊆ V. The language class accepted by N1 full reading sensing 5'→3' WK automata can be written as V1* V2 V3 V4* with V1, V2, V3, V4 ⊆ V.

Proof. In deterministic N1 machines, only one of the heads can make steps with sensing parameter +. When this head (almost) finishes its strand, another type of step is allowed by sensing parameter 1. The other head can start the process only by sensing steps (parameter value 1 or 0) and can continue with sensing value −.
The language accepted by a full reading N1 machine can be described by a regular expression similar to that of the non-full-reading version: (x1 + x2 + ... + xi)*(z1 + ... + zk)(t1 + ... + tl)(y1 + y2 + ... + yj)* with xr ∈ V, r = 1,...,i; zp ∈ V, p = 1,...,k; tm ∈ V, m = 1,...,l; yq ∈ V, q = 1,...,j, where xr can be read by the first head with sensing parameter + (and its complement can be read by the second head with sensing parameter −); the complement of yq can be read by the lower head before the sensing step and yq by the upper head after the sensing step; zp can be read in a sensing step with sensing parameter 1 (or its complement by the second head with sensing parameter 0); while tm can be read by the first head with sensing parameter 0, or its complement can be read by the second head with sensing parameter 1.

Corollary 2 The language class accepted by N1 full reading sensing 5'→3' WK automata is strictly included in the language class accepted by NS full reading sensing 5'→3' WK automata.

Without aiming at completeness, we provide some further hierarchy results.

Proposition 12 The language family accepted by NS full reading sensing 5'→3' WK automata is strictly included in the class accepted by N full reading sensing 5'→3' WK automata.

Proof. The language of palindromes cannot be accepted by an NS machine even if it is a full reading machine.

Proposition 13 The language family accepted by N full reading sensing 5'→3' WK automata is strictly included in the class accepted by F full reading sensing 5'→3' WK automata.
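The regular-expression shape of the N1 full-reading languages can be checked mechanically with an ordinary pattern matcher; the concrete letter sets X, Z, T, Y below are made-up examples, not taken from the chapter.

```python
# Sketch: the shape (x1+...+xi)*(z1+...+zk)(t1+...+tl)(y1+...+yj)* of the
# languages of N1 full reading machines, realized as an ordinary regular
# expression over single letters.

import re

def n1_full_reading_pattern(X, Z, T, Y):
    """Build the pattern X* Z T Y* from four finite sets of letters."""
    alt = lambda s: "(?:" + "|".join(sorted(map(re.escape, s))) + ")"
    return re.compile(alt(X) + "*" + alt(Z) + alt(T) + alt(Y) + "*")

# illustrative letter sets
pat = n1_full_reading_pattern({"a"}, {"b"}, {"c"}, {"d"})

print(bool(pat.fullmatch("aaabcdd")))   # a* b c d*   -> True
print(bool(pat.fullmatch("abdc")))      # wrong order -> False
```

Since the xr, zp, tm, yq are single letters, membership in such a language is decidable by a one-pass scan, which matches the weakness of the stateless 1-limited machines.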

Proof. All linear languages can be accepted by F full reading sensing 5'→3' WK automata, which decide when the heads meet and allow the process to be finished only in case of acceptance (in a similar way as argued in the proof of Theorem 5). This cannot be done without states (see the proof of Proposition 3).

Finally, in this subsection, we compare the classes of languages accepted by 5'→3' WK automata to the context-sensitive class.

Theorem 7 Every language that can be accepted by any 5'→3' WK machine is context-sensitive.

Proof. A linear bounded automaton (LBA) can easily simulate the work of a 5'→3' WK automaton, using marker symbols to indicate the positions of the heads, because 5'→3' WK automata cannot modify the input. Since LBAs accept exactly the context-sensitive languages, the statement of the theorem holds.

3.2 BWK-Automata as a Special Case of 1-Limited Automata

In classical computer science models, automata read only one symbol in a transition; hence it is interesting to see what languages are accepted by WK automata with this restriction. As we have seen, all sensing 5'→3' WK languages are accepted by F1 (and so by 1) 5'→3' WK automata, and all languages that are accepted by full-reading 5'→3' WK variants can also be accepted by 1-limited full reading WK automata. Now, we introduce a new variation, in which both heads must read a symbol in every step until their meeting (Nagy, 2007a; Nagy, 2007b). We note here that traditional WK automata restricted to step both heads by a symbol in each transition (i.e., moving both heads to the same position in each step) do not gain any power from the two heads; they can only simulate traditional finite state machines, and so they accept exactly the regular languages. Opposite to this, 5'→3' BWK automata are more powerful, as we will see. Let 5'→3' BWK denote those WK automata in which both heads must step by a symbol in every transition (except in the possible case when only one letter is not yet processed).
Thus, a 5'→3' BWK automaton has transitions of type B ∈ δ(A, (a/b), k) with k ∈ {+, 2}, and C ∈ δ(A, (a/λ), 1). For this special class of automata, we provide a result that is analogous to Theorem 2.

Theorem 8 The class of 5'→3' BWK automata accepts exactly the even-linear languages.

Apart from finishing the input word, the automaton has only transitions of type B ∈ δ(A, (a/b), k) (k ∈ {+, 2}). This means that both heads must read a symbol at the same time. The only exception is when the input has only one unread letter; then the word can be finished by a sensing transition: only the first head steps, and the word can be accepted. From Theorem 1, we know that these automata can be translated to S 5'→3' WK automata. In fact, the resulting automata are 1-limited WK automata in which the heads step in alternating order (starting with the first one). Moreover, there is an equivalent BWK machine for every WK automaton with the property that its heads step alternately.

Now, let us analyze the deterministic version of these restricted automata.

Theorem 9 The class of deterministic 5'→3' BWK automata accepts exactly the even-linear languages.

Proof. It is trivial (and a consequence of the previous theorem) that the languages accepted by the deterministic version must be even-linear. Now, we will prove that all even-linear languages can be accepted

by deterministic BWK automata. On the one hand, by Theorems 3 and 4, even-linear languages are clearly accepted by deterministic WK automata. On the other hand, the order of the steps can be fixed as an alternating order; therefore, a deterministic BWK machine can also perform the acceptance.

Let us consider the full-reading version of these machines. Since both heads need to read the whole strand, in these machines both heads step in every transition. In transitions where the distance of the heads is only one, both heads read the middle symbols at the same time (equivalently, the first head reads the letter, and the second head makes only a step, because it can and must read only the complement of the letter being read by the first head). Full reading sensing 5'→3' BWK automata only have transitions of type B ∈ δ(A, (a/b), k) with a, b ∈ V, k ∈ {+, 2, 1, 0, −}. As we have already seen, the words of even-linear languages can be accepted when the heads meet. Now we show that this variation does not gain any power by full reading. One may divide the run of the automaton on any input into three phases. In the first phase, the sensing parameter has value + (this phase is exactly the same as for normal 5'→3' automata). In the second phase, the heads meet (it is similar to the normal case, where the run is finished in this phase by the exact meeting of the heads). The sensing parameter may have values 2, 1, or 0 in this phase. Until the parameter gets value 1 or 0, the machine works exactly in the same way as the non-full-reading version. At value 0, the full-reading automaton continues the work and changes the parameter to −, opposite to the non-full-reading version, which stops at this place. At the possible value 1 of the parameter, the full-reading machine does not finish the run by stepping only with the first head; instead, both heads step and the sensing parameter changes to −. In the third phase, the automaton runs on and finishes the input with sensing parameter −.
In this third phase, the heads step at exactly the complementary positions of the first phase, i.e., exactly the same positions but on the other strands. In the first two phases, the full-reading BWK automaton does similar things as the original version, but in the third phase, there is a possibility of checking a property other than the one checked in the first two phases (such a property can be viewed as a characteristic property of an even-linear language). However, because the pairs of positions of the heads in the third phase are exactly the pairs of positions they have already visited, these two properties can be checked at the same time. This can be done exactly in the same way as in the proof of the fact that regular languages are closed under intersection (the intersection-automaton construction: the pairs of states form the set of states of the new machine). In this way, a non-full-reading BWK automaton can effectively be constructed from any full-reading BWK automaton that accepts the same language. We can state this fact in a formal way:

Theorem 10 The class of full reading sensing 5'→3' BWK automata (both the deterministic and the non-deterministic version) accepts exactly the even-linear languages.

Through the construction of intersection automata for BWK automata, it can also be proven that the class of even-linear languages is closed under intersection. (This fact is already known, see (Amar & Putzolu, 1964).) Finally, we summarize the hierarchy of the language classes accepted by the various types of 5'→3' WK automata and their relation to the Chomsky hierarchy in Figure 2.

4 Summary and Concluding Remarks

In this chapter, after introducing the topic from various points of view, several results were shown regarding the hierarchy of language classes accepted by various versions of the sensing 5'→3' WK automata. Nice automata


More information

cse303 ELEMENTS OF THE THEORY OF COMPUTATION Professor Anita Wasilewska

cse303 ELEMENTS OF THE THEORY OF COMPUTATION Professor Anita Wasilewska cse303 ELEMENTS OF THE THEORY OF COMPUTATION Professor Anita Wasilewska LECTURE 5 CHAPTER 2 FINITE AUTOMATA 1. Deterministic Finite Automata DFA 2. Nondeterministic Finite Automata NDFA 3. Finite Automata

More information

Theory of Computation

Theory of Computation Theory of Computation Lecture #2 Sarmad Abbasi Virtual University Sarmad Abbasi (Virtual University) Theory of Computation 1 / 1 Lecture 2: Overview Recall some basic definitions from Automata Theory.

More information

Part I: Definitions and Properties

Part I: Definitions and Properties Turing Machines Part I: Definitions and Properties Finite State Automata Deterministic Automata (DFSA) M = {Q, Σ, δ, q 0, F} -- Σ = Symbols -- Q = States -- q 0 = Initial State -- F = Accepting States

More information

Undecidability COMS Ashley Montanaro 4 April Department of Computer Science, University of Bristol Bristol, UK

Undecidability COMS Ashley Montanaro 4 April Department of Computer Science, University of Bristol Bristol, UK COMS11700 Undecidability Department of Computer Science, University of Bristol Bristol, UK 4 April 2014 COMS11700: Undecidability Slide 1/29 Decidability We are particularly interested in Turing machines

More information

Turing machines and linear bounded automata

Turing machines and linear bounded automata and linear bounded automata Informatics 2A: Lecture 30 John Longley School of Informatics University of Edinburgh jrl@inf.ed.ac.uk 25 November 2016 1 / 17 The Chomsky hierarchy: summary Level Language

More information

Note Watson Crick D0L systems with regular triggers

Note Watson Crick D0L systems with regular triggers Theoretical Computer Science 259 (2001) 689 698 www.elsevier.com/locate/tcs Note Watson Crick D0L systems with regular triggers Juha Honkala a; ;1, Arto Salomaa b a Department of Mathematics, University

More information

Harvard CS 121 and CSCI E-207 Lecture 10: CFLs: PDAs, Closure Properties, and Non-CFLs

Harvard CS 121 and CSCI E-207 Lecture 10: CFLs: PDAs, Closure Properties, and Non-CFLs Harvard CS 121 and CSCI E-207 Lecture 10: CFLs: PDAs, Closure Properties, and Non-CFLs Harry Lewis October 8, 2013 Reading: Sipser, pp. 119-128. Pushdown Automata (review) Pushdown Automata = Finite automaton

More information

On Controlled P Systems

On Controlled P Systems On Controlled P Systems Kamala Krithivasan 1, Gheorghe Păun 2,3, Ajeesh Ramanujan 1 1 Department of Computer Science and Engineering Indian Institute of Technology, Madras Chennai-36, India kamala@iitm.ac.in,

More information

Theory of Computation

Theory of Computation Thomas Zeugmann Hokkaido University Laboratory for Algorithmics http://www-alg.ist.hokudai.ac.jp/ thomas/toc/ Lecture 10: CF, PDAs and Beyond Greibach Normal Form I We want to show that all context-free

More information

CS481F01 Prelim 2 Solutions

CS481F01 Prelim 2 Solutions CS481F01 Prelim 2 Solutions A. Demers 7 Nov 2001 1 (30 pts = 4 pts each part + 2 free points). For this question we use the following notation: x y means x is a prefix of y m k n means m n k For each of

More information

Theory of Computation Turing Machine and Pushdown Automata

Theory of Computation Turing Machine and Pushdown Automata Theory of Computation Turing Machine and Pushdown Automata 1. What is a Turing Machine? A Turing Machine is an accepting device which accepts the languages (recursively enumerable set) generated by type

More information

Computability and Complexity

Computability and Complexity Computability and Complexity Push-Down Automata CAS 705 Ryszard Janicki Department of Computing and Software McMaster University Hamilton, Ontario, Canada janicki@mcmaster.ca Ryszard Janicki Computability

More information

ECS 120 Lesson 15 Turing Machines, Pt. 1

ECS 120 Lesson 15 Turing Machines, Pt. 1 ECS 120 Lesson 15 Turing Machines, Pt. 1 Oliver Kreylos Wednesday, May 2nd, 2001 Before we can start investigating the really interesting problems in theoretical computer science, we have to introduce

More information

Theory Bridge Exam Example Questions

Theory Bridge Exam Example Questions Theory Bridge Exam Example Questions Annotated version with some (sometimes rather sketchy) answers and notes. This is a collection of sample theory bridge exam questions. This is just to get some idea

More information

Opleiding Informatica

Opleiding Informatica Opleiding Informatica Tape-quantifying Turing machines in the arithmetical hierarchy Simon Heijungs Supervisors: H.J. Hoogeboom & R. van Vliet BACHELOR THESIS Leiden Institute of Advanced Computer Science

More information

CS3719 Theory of Computation and Algorithms

CS3719 Theory of Computation and Algorithms CS3719 Theory of Computation and Algorithms Any mechanically (automatically) discretely computation of problem solving contains at least three components: - problem description - computational tool - analysis

More information

34.1 Polynomial time. Abstract problems

34.1 Polynomial time. Abstract problems < Day Day Up > 34.1 Polynomial time We begin our study of NP-completeness by formalizing our notion of polynomial-time solvable problems. These problems are generally regarded as tractable, but for philosophical,

More information

CPSC 421: Tutorial #1

CPSC 421: Tutorial #1 CPSC 421: Tutorial #1 October 14, 2016 Set Theory. 1. Let A be an arbitrary set, and let B = {x A : x / x}. That is, B contains all sets in A that do not contain themselves: For all y, ( ) y B if and only

More information

CS6901: review of Theory of Computation and Algorithms

CS6901: review of Theory of Computation and Algorithms CS6901: review of Theory of Computation and Algorithms Any mechanically (automatically) discretely computation of problem solving contains at least three components: - problem description - computational

More information

Theory of Computing Tamás Herendi

Theory of Computing Tamás Herendi Theory of Computing Tamás Herendi Theory of Computing Tamás Herendi Publication date 2014 Table of Contents 1 Preface 1 2 Formal languages 2 3 Order of growth rate 9 4 Turing machines 16 1 The definition

More information

Computational Tasks and Models

Computational Tasks and Models 1 Computational Tasks and Models Overview: We assume that the reader is familiar with computing devices but may associate the notion of computation with specific incarnations of it. Our first goal is to

More information

On Rice s theorem. Hans Hüttel. October 2001

On Rice s theorem. Hans Hüttel. October 2001 On Rice s theorem Hans Hüttel October 2001 We have seen that there exist languages that are Turing-acceptable but not Turing-decidable. An important example of such a language was the language of the Halting

More information

On Stateless Multicounter Machines

On Stateless Multicounter Machines On Stateless Multicounter Machines Ömer Eğecioğlu and Oscar H. Ibarra Department of Computer Science University of California, Santa Barbara, CA 93106, USA Email: {omer, ibarra}@cs.ucsb.edu Abstract. We

More information

Deciding Whether a Regular Language is Generated by a Splicing System

Deciding Whether a Regular Language is Generated by a Splicing System Deciding Whether a Regular Language is Generated by a Splicing System Lila Kari Steffen Kopecki Department of Computer Science The University of Western Ontario Middlesex College, London ON N6A 5B7 Canada

More information

Turing machines and linear bounded automata

Turing machines and linear bounded automata and linear bounded automata Informatics 2A: Lecture 29 John Longley School of Informatics University of Edinburgh jrl@inf.ed.ac.uk 25 November, 2011 1 / 13 1 The Chomsky hierarchy: summary 2 3 4 2 / 13

More information

SCHEME FOR INTERNAL ASSESSMENT TEST 3

SCHEME FOR INTERNAL ASSESSMENT TEST 3 SCHEME FOR INTERNAL ASSESSMENT TEST 3 Max Marks: 40 Subject& Code: Automata Theory & Computability (15CS54) Sem: V ISE (A & B) Note: Answer any FIVE full questions, choosing one full question from each

More information

Automata Theory and Formal Grammars: Lecture 1

Automata Theory and Formal Grammars: Lecture 1 Automata Theory and Formal Grammars: Lecture 1 Sets, Languages, Logic Automata Theory and Formal Grammars: Lecture 1 p.1/72 Sets, Languages, Logic Today Course Overview Administrivia Sets Theory (Review?)

More information

ON MINIMAL CONTEXT-FREE INSERTION-DELETION SYSTEMS

ON MINIMAL CONTEXT-FREE INSERTION-DELETION SYSTEMS ON MINIMAL CONTEXT-FREE INSERTION-DELETION SYSTEMS Sergey Verlan LACL, University of Paris XII 61, av. Général de Gaulle, 94010, Créteil, France e-mail: verlan@univ-paris12.fr ABSTRACT We investigate the

More information

60-354, Theory of Computation Fall Asish Mukhopadhyay School of Computer Science University of Windsor

60-354, Theory of Computation Fall Asish Mukhopadhyay School of Computer Science University of Windsor 60-354, Theory of Computation Fall 2013 Asish Mukhopadhyay School of Computer Science University of Windsor Pushdown Automata (PDA) PDA = ε-nfa + stack Acceptance ε-nfa enters a final state or Stack is

More information

Finite Automata. Mahesh Viswanathan

Finite Automata. Mahesh Viswanathan Finite Automata Mahesh Viswanathan In this lecture, we will consider different models of finite state machines and study their relative power. These notes assume that the reader is familiar with DFAs,

More information

Decision Problems with TM s. Lecture 31: Halting Problem. Universe of discourse. Semi-decidable. Look at following sets: CSCI 81 Spring, 2012

Decision Problems with TM s. Lecture 31: Halting Problem. Universe of discourse. Semi-decidable. Look at following sets: CSCI 81 Spring, 2012 Decision Problems with TM s Look at following sets: Lecture 31: Halting Problem CSCI 81 Spring, 2012 Kim Bruce A TM = { M,w M is a TM and w L(M)} H TM = { M,w M is a TM which halts on input w} TOTAL TM

More information

CS 311 Sample Final Examination

CS 311 Sample Final Examination Name: CS 311 Sample Final Examination Time: One hour and fifty minutes This is the (corrected) exam from Fall 2009. The real exam will not use the same questions! 8 December 2009 Instructions Attempt all

More information

cse303 ELEMENTS OF THE THEORY OF COMPUTATION Professor Anita Wasilewska

cse303 ELEMENTS OF THE THEORY OF COMPUTATION Professor Anita Wasilewska cse303 ELEMENTS OF THE THEORY OF COMPUTATION Professor Anita Wasilewska LECTURE 13 CHAPTER 4 TURING MACHINES 1. The definition of Turing machine 2. Computing with Turing machines 3. Extensions of Turing

More information

Theory of Computation

Theory of Computation Theory of Computation COMP363/COMP6363 Prerequisites: COMP4 and COMP 6 (Foundations of Computing) Textbook: Introduction to Automata Theory, Languages and Computation John E. Hopcroft, Rajeev Motwani,

More information

Advanced topic: Space complexity

Advanced topic: Space complexity Advanced topic: Space complexity CSCI 3130 Formal Languages and Automata Theory Siu On CHAN Chinese University of Hong Kong Fall 2016 1/28 Review: time complexity We have looked at how long it takes to

More information

The View Over The Horizon

The View Over The Horizon The View Over The Horizon enumerable decidable context free regular Context-Free Grammars An example of a context free grammar, G 1 : A 0A1 A B B # Terminology: Each line is a substitution rule or production.

More information

Complexity Results for Deciding Networks of Evolutionary Processors 1

Complexity Results for Deciding Networks of Evolutionary Processors 1 Complexity Results for Deciding Networks of Evolutionary Processors 1 Florin Manea Institut für Informatik, Christian-Albrechts-Universität zu Kiel, D-24098 Kiel, Germany, and Faculty of Mathematics and

More information

Theory of Computation

Theory of Computation Thomas Zeugmann Hokkaido University Laboratory for Algorithmics http://www-alg.ist.hokudai.ac.jp/ thomas/toc/ Lecture 3: Finite State Automata Motivation In the previous lecture we learned how to formalize

More information

Computational Models: Class 3

Computational Models: Class 3 Computational Models: Class 3 Benny Chor School of Computer Science Tel Aviv University November 2, 2015 Based on slides by Maurice Herlihy, Brown University, and modifications by Iftach Haitner and Yishay

More information

Undecidable Problems. Z. Sawa (TU Ostrava) Introd. to Theoretical Computer Science May 12, / 65

Undecidable Problems. Z. Sawa (TU Ostrava) Introd. to Theoretical Computer Science May 12, / 65 Undecidable Problems Z. Sawa (TU Ostrava) Introd. to Theoretical Computer Science May 12, 2018 1/ 65 Algorithmically Solvable Problems Let us assume we have a problem P. If there is an algorithm solving

More information

Automata Theory. Lecture on Discussion Course of CS120. Runzhe SJTU ACM CLASS

Automata Theory. Lecture on Discussion Course of CS120. Runzhe SJTU ACM CLASS Automata Theory Lecture on Discussion Course of CS2 This Lecture is about Mathematical Models of Computation. Why Should I Care? - Ways of thinking. - Theory can drive practice. - Don t be an Instrumentalist.

More information

Decidability: Church-Turing Thesis

Decidability: Church-Turing Thesis Decidability: Church-Turing Thesis While there are a countably infinite number of languages that are described by TMs over some alphabet Σ, there are an uncountably infinite number that are not Are there

More information

COL 352 Introduction to Automata and Theory of Computation Major Exam, Sem II , Max 80, Time 2 hr. Name Entry No. Group

COL 352 Introduction to Automata and Theory of Computation Major Exam, Sem II , Max 80, Time 2 hr. Name Entry No. Group COL 352 Introduction to Automata and Theory of Computation Major Exam, Sem II 2015-16, Max 80, Time 2 hr Name Entry No. Group Note (i) Write your answers neatly and precisely in the space provided with

More information

Finite Automata Theory and Formal Languages TMV027/DIT321 LP4 2018

Finite Automata Theory and Formal Languages TMV027/DIT321 LP4 2018 Finite Automata Theory and Formal Languages TMV027/DIT321 LP4 2018 Lecture 15 Ana Bove May 17th 2018 Recap: Context-free Languages Chomsky hierarchy: Regular languages are also context-free; Pumping lemma

More information

From Gene to Protein

From Gene to Protein From Gene to Protein Gene Expression Process by which DNA directs the synthesis of a protein 2 stages transcription translation All organisms One gene one protein 1. Transcription of DNA Gene Composed

More information

Chapter 6. Properties of Regular Languages

Chapter 6. Properties of Regular Languages Chapter 6 Properties of Regular Languages Regular Sets and Languages Claim(1). The family of languages accepted by FSAs consists of precisely the regular sets over a given alphabet. Every regular set is

More information

3130CIT Theory of Computation

3130CIT Theory of Computation GRIFFITH UNIVERSITY School of Computing and Information Technology 3130CIT Theory of Computation Final Examination, Semester 2, 2006 Details Total marks: 120 (40% of the total marks for this subject) Perusal:

More information

FORMAL LANGUAGES, AUTOMATA AND COMPUTABILITY

FORMAL LANGUAGES, AUTOMATA AND COMPUTABILITY 15-453 FORMAL LANGUAGES, AUTOMATA AND COMPUTABILITY Chomsky Normal Form and TURING MACHINES TUESDAY Feb 4 CHOMSKY NORMAL FORM A context-free grammar is in Chomsky normal form if every rule is of the form:

More information

CS5371 Theory of Computation. Lecture 14: Computability V (Prove by Reduction)

CS5371 Theory of Computation. Lecture 14: Computability V (Prove by Reduction) CS5371 Theory of Computation Lecture 14: Computability V (Prove by Reduction) Objectives This lecture shows more undecidable languages Our proof is not based on diagonalization Instead, we reduce the problem

More information

Computational Models - Lecture 3

Computational Models - Lecture 3 Slides modified by Benny Chor, based on original slides by Maurice Herlihy, Brown University. p. 1 Computational Models - Lecture 3 Equivalence of regular expressions and regular languages (lukewarm leftover

More information