On the Incremental Inference of Context-Free Grammars from Positive Structural Information


International Journal of Systems and Technologies (IJST), Vol. 1, No. 2, KLEF, 2008

On the Incremental Inference of Context-Free Grammars from Positive Structural Information

Gend Lal Prajapati 1, Narendra S. Chaudhari 2 and Manohar Chandwani 3

1,3 Institute of Engineering & Technology, Department of Computer Engineering, Devi Ahilya University, Khandwa Road, Indore, M.P., India
1 gprajapati.iet@dauniv.ac.in, 3 chandwanim1@rediffmail.com
2 School of Computer Engineering, Nanyang Technological University, Block N4-2a-32, 50 Nanyang Avenue, Singapore
ASNarendra@ntu.edu.sg

Abstract. The primary goal of this paper is to design a so-called incremental version of the effective model that learns context-free grammars from positive samples of their structural descriptions, where the input sample is fed to the learner in an online manner. This seems essential in the context of identification in the limit. Efficient, polynomially bounded updating inference algorithms are presented to achieve good incremental behavior in the inference algorithms mRT and mRC of the effective model. We also modify mRC to infer extended reversible context-free grammars from positive structural samples.

Keywords: Context-free grammar (CFG); Extended reversible CFG; Grammatical inference; Reversible skeletal tree automata; Structural sample.

1 Introduction

We consider the problem of inferring context-free languages incrementally from positive-only examples. The problem of designing inference algorithms that find a correct grammar for an unknown language from input data lies at the very heart of an important area of machine learning called grammatical inference. It is known from Gold's negative result [1] on identifiability from positive presentation that the class of CFGs cannot be identified in the limit from positive presentations. In search of an example presentation mechanism that compensates for the lack of explicit negative information in positive samples, learning CFGs in the framework of identification in the limit from positive samples of their structural descriptions has become common in the literature, where a structural description of a CFG is an unlabelled derivation tree of the grammar.

An example with some parentheses inserted to indicate the shape of the derivation tree of a CFG, or equivalently an unlabelled derivation tree of the CFG, is called a structural example or simply a skeleton. The well-known, efficient tree-automata-based learning model for CFGs from positive structural information was introduced early on by Sakakibara [4]. He showed that there exists a class, called reversible CFGs, which can be identified in the limit from positive presentations of structural examples, that is, all and only unlabelled derivation trees of the unknown CFG, and that the reversible CFG is a normal form for CFGs, that is, reversible CFGs can generate all the context-free languages. We have improved Sakakibara's work by producing an effective model [3] for the above problem. Its main results include: the output grammar is consistent with the given examples, makes the task of bottom-up parsing easy, and is inferred efficiently from positive-only examples; the algorithm runs in O(n³) time in the sum of the sizes of the input examples and achieves an O(n²) saving in storage space over the closely related model [4]. However, like Sakakibara's model, it cannot be used for incremental learning, i.e., in the situation when the input sample is fed to the inference algorithm in an online manner, because it cannot update a guess incrementally. In this paper, to solve this computational problem, we present a polynomial-time updating scheme that gives our effective model [3] good incremental behavior. In addition, we demonstrate an inference scheme for extended reversible CFGs from positive samples of their structural descriptions.

2 Inference Algorithms

We now present our inference scheme. We use the definitions and notation relevant to our work from [2, 3]. We first present updating algorithms that prepare the inference algorithms mRT [3] (for reversible tree automata from positive samples) and mRC [3] (for reversible CFGs from positive samples of their structural descriptions) to update a hypothesis when the training examples are supplied to the inference algorithm on an online basis. Next we modify mRC to infer extended reversible CFGs from positive structural samples. A CFG G = (N, Σ, P, S) is extended reversible iff, for P' = P ∪ { S → a | a ∈ Σ }, the grammar G' = (N, Σ, P', S) is reversible [4]. It can easily be shown (using Theorem 1 and Theorem 2 in [3]) that the inference algorithms sketched in this section have polynomially bounded update time.
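As a concrete illustration of the definition just given, the test for extended reversibility reduces to a reversibility test on the augmented grammar. The following Python sketch assumes a hypothetical predicate is_reversible implementing the reversibility check of [4]; the Grammar record and the function names are illustrative and not part of the algorithms in this paper.

from collections import namedtuple

# A CFG G = (N, Sigma, P, S): nonterminals, terminals, a set of productions
# (lhs, rhs) with rhs a tuple of symbols, and the start symbol.
Grammar = namedtuple("Grammar", ["N", "Sigma", "P", "S"])

def is_reversible(g):
    # Hypothetical placeholder for the reversibility test of [4]
    # (G is reversible iff it is invertible and reset-free).
    raise NotImplementedError

def is_extended_reversible(g):
    # G is extended reversible iff G' = (N, Sigma, P u {S -> a | a in Sigma}, S)
    # is reversible.
    extra = {(g.S, (a,)) for a in g.Sigma}
    g_prime = Grammar(g.N, g.Sigma, set(g.P) | extra, g.S)
    return is_reversible(g_prime)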

2.1 Updating Algorithm mIRT for Tree Automata

It is useful in the context of identification in the limit to show that mRT may be modified to have good incremental behavior. That is, given the output Af = Bs(S+)/π'f computed by mRT on input S+, where π'f is the final partition found by mRT beginning with the trivial partition of the set of states of Bs(S+), and given a new nonempty set of skeletons S++, we may update Af to the output computed by mRT on input S = S+ ∪ S++. The method for achieving this is the algorithm mIRT, shown as Algorithm 1. Our main aim is to avoid the overhead of recomputing the partition already found by mRT on input S+; this is guaranteed by the initializations of mIRT before entering the merging phase.

Algorithm 1 (mIRT):
Input: the output A'/π'f computed by mRT on a nonempty positive sample S+, beginning with the base tree automaton A' = Bs(S+) = (Q', V, δ', F'), and a new nonempty set of skeletons S++, where π'f is the final partition of the set Q' of states of A' constructed by mRT;
Output: a reversible skeletal tree automaton A such that A = mRT(S+ ∪ S++);
Method:
Let Bs(S++) = (Q'', V, δ'', F'');
Let A = (Q = Q' ∪ Q'', V, δ = δ' ∪ δ'', F = F' ∪ F'');
Let π'0 be the trivial partition of Q − Q';
Let π0 = π'0 ∪ π'f;
Let Bf ∈ π'f be the block such that Bf ∩ F' ≠ ∅;
Let π1 be π0 with Bf and all blocks B(q, π0) such that q ∈ F − F' merged;
Let i = 1;
do
  Let j = i;
  for all p of the form p = σ(u1, …, uk) ∈ Q
    for all q of the form q = σ(u'1, …, u'k) ∈ Q − {p}
      if B(p, πi) = B(q, πi) then
        if ul, u'l ∈ Q and B(ul, πi) ≠ B(u'l, πi) for some l (1 ≤ l ≤ k), and B(um, πi) = B(u'm, πi) or um = u'm ∈ Σ for 1 ≤ m ≤ k with m ≠ l then
          Let πi+1 be πi with B(ul, πi) and B(u'l, πi) merged;
          Increase i by 1;
        fi /* End of if */
      else
        if B(um, πi) = B(u'm, πi) or um = u'm ∈ Σ for 1 ≤ m ≤ k then
          Let πi+1 be πi with B(p, πi) and B(q, πi) merged;
          Increase i by 1;
        fi /* End of if */
      fi /* End of if */
    end for
  end for
while j ≠ i;
Let f = i and output the tree automaton A/πf.
End of Algorithm.
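The merging passes above repeatedly repair violations of determinism and reset-freeness until the quotient automaton is reversible. Below is a minimal Python sketch of this block-merging machinery, using a union-find structure over states written as nested tuples ('sigma', u1, ..., uk) with terminal leaves as plain strings; it only illustrates the two merge conditions of the loop and assumes the starting partition has already been loaded into the union-find structure. All names are illustrative.

class UnionFind:
    # Blocks of the current partition; states are hashable nested tuples.
    def __init__(self):
        self.parent = {}
    def find(self, x):
        self.parent.setdefault(x, x)
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x
    def union(self, x, y):
        rx, ry = self.find(x), self.find(y)
        if rx == ry:
            return False
        self.parent[rx] = ry
        return True

def same_child(uf, u, v):
    # Children are either states (nested tuples, compared by block) or
    # terminal symbols (compared literally).
    if isinstance(u, tuple) and isinstance(v, tuple):
        return uf.find(u) == uf.find(v)
    return u == v

def merge_until_reversible(states, uf):
    # states: all states of the combined automaton, q = ('sigma', u1, ..., uk);
    # uf: union-find already initialized with the starting partition pi_1.
    changed = True
    while changed:
        changed = False
        for p in states:
            for q in states:
                if p is q or len(p) != len(q):
                    continue
                equal = [same_child(uf, u, v) for u, v in zip(p[1:], q[1:])]
                if uf.find(p) == uf.find(q):
                    # Reset-freeness repair: same destination block and exactly
                    # one differing pair of child states -> merge those children.
                    diff = [m for m, e in enumerate(equal) if not e]
                    if len(diff) == 1:
                        ul, vl = p[1:][diff[0]], q[1:][diff[0]]
                        if isinstance(ul, tuple) and isinstance(vl, tuple):
                            changed |= uf.union(ul, vl)
                elif all(equal):
                    # Determinism repair: identical children (up to blocks)
                    # -> merge the two destination blocks.
                    changed |= uf.union(p, q)
    return uf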

2.2 Updating Algorithm mIRC for Context-Free Grammars

In the context of identification in the limit of reversible CFGs, we further need to modify the inference algorithm mRC to infer reversible CFGs incrementally from positive samples of their structural descriptions. The algorithm mIRC for doing this is sketched as Algorithm 2.

Algorithm 2 (mIRC):
Input: the output A'/π'f computed by mRT on a nonempty positive sample S+, beginning with the base tree automaton A' = Bs(S+) = (Q', V, δ', F'), and a new nonempty set of positive structural examples S++, where π'f is the final partition of the set Q' of states of A' constructed by mRT;
Output: a reversible context-free grammar G such that G = mRC(S+ ∪ S++);
Method:
Run mIRT on the tree automaton A'/π'f and the set S++;
Let G = G'(mIRT(A'/π'f, S++)) and output the grammar G.
End of Algorithm.

2.3 Inferring Extended Reversible Grammars

The inference algorithm mRC' for extended reversible CFGs from positive samples of their structural descriptions is described as Algorithm 3. It is a modification of the algorithm mRC.

Algorithm 3 (mRC'):
Input: a nonempty positive sample S+ of structural descriptions;
Output: an extended reversible context-free grammar G;
Method:
Let S'+ = S+ ∪ { σ(a) | a ∈ Σ };
Let Uni = S+ ∩ { σ(a) | a ∈ Σ };
Run mRC on the sample S'+ and let G' = (N, Σ, P, S) be mRC(S'+);
Let P' = { S → a | σ(a) ∉ Uni };
Let G = (N, Σ, P − P', S) and output the grammar G.
End of Algorithm.
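Read this way, Algorithm 3 is a thin wrapper around mRC: augment the sample with the depth-one skeletons σ(a), infer a reversible grammar, and then drop the productions S → a that the genuine sample does not support. The following Python sketch reflects that reading; it reuses the Grammar record from the sketch in Section 2, treats a hypothetical m_rc routine as the batch algorithm mRC of [3], and writes the skeleton σ(a) as the tuple ('sigma', a).

def m_rc(sample):
    # Hypothetical placeholder for the batch inference algorithm mRC of [3];
    # returns a Grammar (N, Sigma, P, S) inferred from a set of skeletons.
    raise NotImplementedError

def m_rc_prime(sample, sigma):
    # Algorithm 3 (as reconstructed above): infer an extended reversible CFG
    # from the positive structural sample over the terminal alphabet sigma.
    singletons = {("sigma", a) for a in sigma}
    augmented = set(sample) | singletons          # S'+ = S+ u {sigma(a) | a in Sigma}
    uni = set(sample) & singletons                # Uni  = S+ n {sigma(a) | a in Sigma}
    g = m_rc(augmented)                           # G' = (N, Sigma, P, S)
    unsupported = {(g.S, (a,)) for a in sigma if ("sigma", a) not in uni}
    return Grammar(g.N, g.Sigma, set(g.P) - unsupported, g.S)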

We now present the updating algorithm mIRC', which gives mRC' good incremental behavior when inferring extended reversible CFGs from positive samples of their structural descriptions. The algorithm mIRC' is described as Algorithm 4.

Algorithm 4 (mIRC'):
Input: the output A'/π'f computed by mRT on a nonempty positive sample S+ = S'+ ∪ { σ(a) | a ∈ Σ }, beginning with the base tree automaton A' = Bs(S+) = (Q', V, δ', F'), and a new nonempty set of positive structural examples S'++, where π'f is the final partition of the set Q' of states of A' constructed by mRT;
Output: an extended reversible context-free grammar G such that G = mRC'(S'+ ∪ S'++);
Method:
Let S++ = S'++ ∪ { σ(a) | a ∈ Σ };
Let U = S'+ ∪ S'++;
Let Uni = U ∩ { σ(a) | a ∈ Σ };
Run mIRC on the automaton A'/π'f and the set S++;
Let G' = (N, Σ, P, S) be mIRC(A'/π'f, S++);
Let P' = { S → a | σ(a) ∉ Uni };
Let G = (N, Σ, P − P', S) and output the grammar G.
End of Algorithm.
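Taken together, the algorithms support a simple online workflow: infer once from the initial sample, then fold each new batch of skeletons into the current hypothesis instead of restarting from scratch. The following usage-style sketch assumes hypothetical routines m_rt, m_irt and grammar_of with the interfaces suggested by the descriptions above; it is an illustration of the intended use, not an implementation taken from this paper.

def m_rt(sample):
    # Hypothetical: batch algorithm mRT of [3]; returns the base tree automaton
    # Bs(sample) together with the final partition pi'_f of its states.
    raise NotImplementedError

def m_irt(automaton, partition, new_skeletons):
    # Hypothetical: Algorithm 1 (mIRT); folds the new skeletons into the
    # current hypothesis and returns the updated (automaton, partition) pair.
    raise NotImplementedError

def grammar_of(automaton, partition):
    # Hypothetical: reads the reversible CFG G'(A/pi) off the quotient automaton.
    raise NotImplementedError

def infer_incrementally(initial_sample, batches):
    automaton, partition = m_rt(initial_sample)     # batch start on S+
    grammar = grammar_of(automaton, partition)      # G = mRC(S+)
    for new_batch in batches:                       # each S++ arrives online
        automaton, partition = m_irt(automaton, partition, new_batch)  # Algorithm 1
        grammar = grammar_of(automaton, partition)                     # Algorithm 2
    return grammar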

3 An Example

As an example run, suppose that the algorithm mRC is going to infer the following unknown CFG GU for a simple natural language:

Sentence → Noun_phrase Verb_phrase
Noun_phrase → Determiner Noun_phrase2
Noun_phrase2 → Noun
Noun_phrase2 → Adjective Noun_phrase2
Verb_phrase → Verb Noun_phrase
Determiner → the
Determiner → a
Noun → girl
Noun → cat
Noun → dog
Adjective → young
Verb → likes
Verb → chases.

First suppose that the inference algorithm mRC is given the following structural sample S+:

σ(σ(σ(the), σ(σ(girl))), σ(σ(likes), σ(σ(a), σ(σ(cat)))))
σ(σ(σ(the), σ(σ(girl))), σ(σ(likes), σ(σ(a), σ(σ(dog))))).

Then mRT first constructs the base tree automaton A = Bs(S+) = (Q, V, δ, F) as follows:

V0 = {the, girl, likes, a, cat, dog}.

All elements of Q, each of the form q = σ(u1, …, uk) with the special symbol σ ∈ Sk, q ∈ Q, and u1, …, uk ∈ Q ∪ V0, are listed below.

q1 = σ(the)
q2 = σ(girl)
q3 = σ(q2)
q4 = σ(q1, q3)
q5 = σ(likes)
q6 = σ(a)
q7 = σ(cat)
q8 = σ(q7)
q9 = σ(q6, q8)
q10 = σ(q5, q9)
q11 = σ(q4, q10) ∈ F
q12 = σ(dog)
q13 = σ(q12)
q14 = σ(q6, q13)
q15 = σ(q5, q14)
q16 = σ(q4, q15) ∈ F.
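The states of Bs(S+) are exactly the σ-rooted subtrees of the sample skeletons, with the skeletons themselves as final states, which is what the listing above enumerates. A minimal Python sketch of this construction, representing a skeleton σ(t1, …, tk) as the nested tuple ('sigma', t1, …, tk); the function names are illustrative.

def subtrees(t):
    # Yield every sigma-rooted subtree of a skeleton; terminal leaves are plain
    # strings (elements of V0), not states.
    if isinstance(t, tuple):
        yield t
        for child in t[1:]:
            yield from subtrees(child)

def base_tree_automaton(sample):
    # Bs(S+): states are all sigma-subtrees of the sample, final states are the
    # sample skeletons themselves; the transition for a state sigma(u1, ..., uk)
    # is implicit in the state's own structure.
    states = set()
    for skeleton in sample:
        states |= set(subtrees(skeleton))
    terminals = {u for q in states for u in q[1:] if not isinstance(u, tuple)}
    return states, terminals, set(sample)

# The first skeleton of S+ above, for instance, is written as
# ('sigma', ('sigma', ('sigma', 'the'), ('sigma', ('sigma', 'girl'))),
#           ('sigma', ('sigma', 'likes'),
#                     ('sigma', ('sigma', 'a'), ('sigma', ('sigma', 'cat')))))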

The trivial partition of the set Q:

π0 = {[q1], [q2], [q3], [q4], [q5], [q6], [q7], [q8], [q9], [q10], [q11], [q12], [q13], [q14], [q15], [q16]}.

Now mRT finds the following final partition πf of Q, beginning with the trivial partition π0 of Q, such that A/πf is reversible, and outputs A/πf:

πf = {[q1], [q2], [q3], [q4], [q5], [q6], [q7, q12], [q8, q13], [q9, q14], [q10, q15], [q11, q16]}.

Let us call NT1 = [q4], NT2 = [q10, q15], NT3 = [q1], NT4 = [q3], NT5 = [q2], NT6 = [q5], NT7 = [q9, q14], NT8 = [q6], NT9 = [q8, q13], NT10 = [q7, q12], and S = [q11, q16]. Then mRC outputs the reversible CFG:

S → NT1 NT2
NT1 → NT3 NT4
NT2 → NT6 NT7
NT3 → the
NT4 → NT5
NT5 → girl
NT6 → likes
NT7 → NT8 NT9
NT8 → a
NT9 → NT10
NT10 → cat
NT10 → dog.
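These productions are read directly off the final partition: each block becomes a nonterminal, each state σ(u1, …, uk) contributes a production from its block to the blocks (or terminal symbols) of its children, and the block containing the final states plays the role of the start symbol S. The following Python sketch makes the grammar_of placeholder used earlier concrete; it reuses the UnionFind and Grammar helpers from the earlier sketches, and the block-naming scheme is illustrative.

def extract_grammar(states, final_states, uf):
    # Name each block: the block of the final states becomes S, the remaining
    # blocks get fresh nonterminal names NT1, NT2, ...
    final_roots = {uf.find(f) for f in final_states}
    block_name, counter = {}, 0
    for q in states:
        root = uf.find(q)
        if root not in block_name:
            if root in final_roots:
                block_name[root] = "S"
            else:
                counter += 1
                block_name[root] = "NT%d" % counter
    # One production per state: block of the state -> blocks/terminals of its children.
    productions = set()
    for q in states:
        lhs = block_name[uf.find(q)]
        rhs = tuple(block_name[uf.find(u)] if isinstance(u, tuple) else u
                    for u in q[1:])
        productions.add((lhs, rhs))
    terminals = {u for q in states for u in q[1:] if not isinstance(u, tuple)}
    return Grammar(set(block_name.values()), terminals, productions, "S")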

Suppose that in the next stage the following examples S++ are added to the sample:

σ(σ(σ(a), σ(σ(dog))), σ(σ(chases), σ(σ(the), σ(σ(girl)))))
σ(σ(σ(a), σ(σ(dog))), σ(σ(chases), σ(σ(a), σ(σ(cat))))).

Now the algorithm mIRC updates the above grammar with this information by applying the updating algorithm mIRT. The algorithm mIRT finds the reversible tree automaton corresponding to the examples received up to this stage without redundant recomputation, as shown below.

On the addition of S++, the new states of the form q = σ(u1, …, uk) are:

q17 = σ(chases)
q18 = σ(q17, q4)
q19 = σ(q14, q18) ∈ F
q20 = σ(q17, q9)
q21 = σ(q14, q20) ∈ F.

The automaton A = (Q, V, δ, F) is now given by:

V0 = {the, girl, likes, a, cat, dog, chases}.
Q = {q1, q2, q3, q4, q5, q6, q7, q8, q9, q10, q11, q12, q13, q14, q15, q16, q17, q18, q19, q20, q21}.
F = {q11, q16, q19, q21}.

The trivial partition of the new states:

π'0 = {[q17], [q18], [q19], [q20], [q21]}.

Therefore,

π0 = {[q17], [q18], [q19], [q20], [q21]} ∪ {[q1], [q2], [q3], [q4], [q5], [q6], [q7, q12], [q8, q13], [q9, q14], [q10, q15], [q11, q16]}
   = {[q1], [q2], [q3], [q4], [q5], [q6], [q7, q12], [q8, q13], [q9, q14], [q10, q15], [q11, q16], [q17], [q18], [q19], [q20], [q21]}.

Finally, the initial partition π1 from which the merging passes start is:

π1 = {[q1], [q2], [q3], [q4], [q5], [q6], [q7, q12], [q8, q13], [q9, q14], [q10, q15], [q11, q16, q19, q21], [q17], [q18], [q20]}.

Then the algorithm mIRT, beginning with the above partition π1, obtains the following final partition πf and outputs the reversible tree automaton A/πf:

πf = {[q1], [q2], [q3], [q4, q9, q14], [q5, q17], [q6], [q7, q12], [q8, q13], [q10, q15, q18, q20], [q11, q16, q19, q21]}.
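The saving over a fresh run of mRT is visible in π1: the blocks already found on S+ are kept intact, only the genuinely new states enter as singleton blocks, and the new final states are folded into the existing final block before any merging pass runs. Below is a minimal sketch of this initialisation step on top of the UnionFind helper from Section 2.1; the names are illustrative.

def initial_partition(old_uf, new_states, final_states):
    # Build the starting partition pi_1 of mIRT: copy the partition pi'_f
    # already computed on S+, register the new states as singleton blocks,
    # and merge all final states (old and new) into a single block.
    uf = UnionFind()
    uf.parent.update(old_uf.parent)        # pi'_f carried over unchanged
    for q in new_states:
        uf.find(q)                         # new singleton blocks
    finals = list(final_states)
    for f in finals[1:]:
        uf.union(finals[0], f)             # one final block, as in pi_1 above
    return uf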

Next, mIRC outputs the following reversible CFG:

S → NT1 NT2
NT1 → NT3 NT4
NT1 → NT6 NT7
NT2 → NT9 NT1
NT3 → the
NT4 → NT5
NT5 → girl
NT6 → a
NT7 → NT8
NT8 → cat
NT8 → dog
NT9 → likes
NT9 → chases.

Here, NT1 = [q4, q9, q14], NT2 = [q10, q15, q18, q20], NT3 = [q1], NT4 = [q3], NT5 = [q2], NT6 = [q6], NT7 = [q8, q13], NT8 = [q7, q12], NT9 = [q5, q17], and S = [q11, q16, q19, q21].

Suppose that in a further stage of the inference process the following examples are added to the sample:

σ(σ(σ(a), σ(σ(dog))), σ(σ(chases), σ(σ(a), σ(σ(girl)))))
σ(σ(σ(the), σ(σ(dog))), σ(σ(chases), σ(σ(a), σ(σ(young), σ(σ(girl)))))).

For this instance the new states of the form q = σ(u1, …, uk) are:

q22 = σ(q6, q3)
q23 = σ(q17, q22)
q24 = σ(q14, q23) ∈ F
q25 = σ(q1, q13)
q26 = σ(young)
q27 = σ(q26, q3)

q28 = σ(q6, q27)
q29 = σ(q17, q28)
q30 = σ(q25, q29) ∈ F.

The automaton A = (Q, V, δ, F) is now given by:

V0 = {the, girl, likes, a, cat, dog, chases, young}.
Q = {q1, q2, q3, q4, q5, q6, q7, q8, q9, q10, q11, q12, q13, q14, q15, q16, q17, q18, q19, q20, q21, q22, q23, q24, q25, q26, q27, q28, q29, q30}.
F = {q11, q16, q19, q21, q24, q30}.

Setting the initial partition from which merging proceeds:

π1 = {[q1], [q2], [q3], [q4, q9, q14], [q5, q17], [q6], [q7, q12], [q8, q13], [q10, q15, q18, q20], [q11, q16, q19, q21, q24, q30], [q22], [q23], [q25], [q26], [q27], [q28], [q29]}.

Now mIRT updates the above partition by employing the merging process. Finally, it finds the following partition πf and outputs the reversible tree automaton A/πf:

πf = {[q1, q6], [q2, q7, q12], [q3, q8, q13, q27], [q4, q9, q14, q22, q25, q28], [q5, q17], [q10, q15, q18, q20, q23, q29], [q11, q16, q19, q21, q24, q30], [q26]}.

Let us denote NT1 = [q4, q9, q14, q22, q25, q28], NT2 = [q10, q15, q18, q20, q23, q29], NT3 = [q1, q6], NT4 = [q3, q8, q13, q27], NT5 = [q2, q7, q12], NT6 = [q26], NT7 = [q5, q17], and S = [q11, q16, q19, q21, q24, q30]. The algorithm mIRC outputs the following reversible CFG:

S → NT1 NT2
NT1 → NT3 NT4
NT4 → NT5
NT4 → NT6 NT4
NT2 → NT7 NT1
NT3 → the
NT3 → a
NT5 → girl
NT5 → cat
NT5 → dog

NT6 → young
NT7 → likes
NT7 → chases.

This grammar is isomorphic to the unknown grammar GU.

4 Conclusion

We have presented efficient, polynomially bounded algorithms for updating a guess in the inference of CFGs from positive samples of their structural descriptions. These algorithms are of particular interest because it is impractical to restart the inference process from scratch every time new examples arrive in an online manner. Our scheme guarantees that updating a grammar on receiving new examples does not incur the overhead of recomputing what has already been found. The computational saving has been illustrated with a concrete example.

References

[1] Gold, E.M., Language Identification in the Limit, Information and Control, vol. 10, pp. 447-474, 1967.
[2] Martin, J.C., Introduction to Languages and the Theory of Computation, Tata McGraw-Hill.
[3] Prajapati, G.L., Chaudhari, N.S., and Chandwani, M., An Effective Model for Context-Free Grammar Inference, in: Prasad, B. (Ed.), Proceedings of the 3rd Indian International Conference on Artificial Intelligence (IICAI-07), Pune, India, 2007.
[4] Sakakibara, Y., Efficient Learning of Context-Free Grammars from Positive Structural Examples, Information and Computation, vol. 97, pp. 23-60, 1992.
