Learning cover context-free grammars from structural data

1 Learning cover context-free grammars from structural data
Mircea Marin, Gabriel Istrate
West University of Timişoara, Romania
11th International Colloquium on Theoretical Aspects of Computing, ICTAC 2014

2 Outline
1. Introduction
   - Learning languages from queries
2. Context-free languages
   - Structural data
3. Our problem
   - Motivation
   - Preliminary remark
4. Our contributions
   - Algorithm LA^l
   - Complexity analysis

3 Learning languages from queries
The general learning problem: learn a correct specification of a language from a finite number of positive and negative examples.
Well-established means to specify languages of interest:
- for regular word languages: DFAs (hopefully minimum), regular expressions
- for context-free languages: CFGs, push-down automata
- for regular tree languages: DFTAs (hopefully minimum), regular tree expressions
Learning methods:
- from an arbitrary set of positive and negative examples: learning a regular language is NP-hard [Gold, 1967]
- from answers to queries posed to a teacher

4 Learning languages from queries: well-known results
Learning a language L from answers to two kinds of queries:
- Membership query: w ∈ L? ANSWER: yes/no
- Equivalence query: L = L(A)?, where A is the language specification guessed by the learner. ANSWER: yes, or no together with a counterexample w ∈ L △ L(A)
R1. For a regular language L, algorithm L* builds a minimum DFA with n states in time polynomial in n and m [Angluin, 1987].
R2. If L is the regular tree language of structural descriptions of a CFL, algorithm LA builds a minimum DFTA with n states in time polynomial in n and m [Sakakibara, 1990].
Note: m = size of the largest counterexample provided by the teacher.
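The query protocol above can be made concrete with a small mock-up. The following Python sketch is illustrative only (not the authors' code): a hypothetical Teacher object answers membership queries against a target language it knows, and searches a finite universe of words for counterexamples to equivalence queries.

    # Illustrative sketch of the membership/equivalence query protocol.
    # The Teacher class and its method names are our own invention.
    import itertools
    from typing import Callable, Iterable, Optional, Set

    class Teacher:
        def __init__(self, target: Set[str], universe: Iterable[str]):
            self.target = set(target)        # the unknown language L (teacher-side only)
            self.universe = list(universe)   # finite search space for counterexamples

        def member(self, w: str) -> bool:
            """Membership query: w in L?"""
            return w in self.target

        def equivalent(self, hypothesis: Callable[[str], bool]) -> Optional[str]:
            """Equivalence query: None means 'yes'; otherwise return a
            counterexample from the symmetric difference of L and L(A)."""
            for w in self.universe:
                if (w in self.target) != hypothesis(w):
                    return w
            return None

    # Toy usage: L = {ab, aabb, aaabbb}, universe = all words of length <= 6.
    universe = ("".join(p) for k in range(7) for p in itertools.product("ab", repeat=k))
    teacher = Teacher({"ab", "aabb", "aaabbb"}, universe)
    print(teacher.member("aabb"))                   # True
    print(teacher.equivalent(lambda w: w == "ab"))  # aabb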

5 Learning languages from queries: an interesting variation (1)
In several practical applications we are interested in a correct specification of only a finite subset of L, e.g., only the words of length ≤ l:
Learn a minimal cover DFA A, i.e., a DFA with a minimum number of states such that L(A) ∩ Σ^{≤l} = L ∩ Σ^{≤l}.
The learning protocol is slightly modified:
- Membership query: w ∈ L? for some w ∈ Σ^{≤l}. ANSWER: yes/no
- Equivalence query: L ∩ Σ^{≤l} = L(A) ∩ Σ^{≤l}? ANSWER: yes, or no together with a counterexample w ∈ (L △ L(A)) ∩ Σ^{≤l}
Note: Often, the size of a minimal cover DFA w.r.t. l is much smaller than the size of a minimum DFA.

6-7 Learning languages from queries: an interesting variation (2)
Let n := the number of states of a minimal cover DFA of L w.r.t. l.
A minimal cover DFA of L w.r.t. l can be constructed with algorithm L^l in time polynomial in n and m, where m is the size of the largest counterexample returned by the teacher [Ipate 2012]. L^l is a nontrivial adjustment of Angluin's algorithm L*.
Questions:
1. Can Sakakibara's algorithm LA be adjusted, like L* was adjusted to L^l, to learn a cover automaton for the structural descriptions of a CFL?
2. Does this notion of cover automaton have practical significance?
3. What is the complexity of such an algorithm? (Is it tractable?)

8 Structural data: skeletons of derivation trees [Sakakibara 1990]
ASSUMPTIONS: G = (N, Σ, P, S) is an ε-free CFG; D(G) is the set of derivation trees of G.
Definition (skeleton). The skeleton, or structural description, of t ∈ D(G) is the labeled tree sk(t) obtained from t by replacing all labels of interior nodes with σ.
Example (structural description of a derivation tree). G = ({S, A}, {a, b}, {S → A, A → aAb, A → ab}, S) is a CFG, t = S(A(a, A(a, b), b)) ∈ D(G), and sk(t) = σ(σ(a, σ(a, b), b)) ∈ T({σ, a, b}).
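A minimal sketch of sk, under a tree encoding of our own (an interior node is a pair (label, children), a leaf is the terminal symbol itself): the function simply relabels interior nodes with σ while keeping the shape and the leaves.

    # Compute the skeleton sk(t) of a derivation tree (encoding is ours).
    SIGMA = "σ"

    def sk(t):
        if isinstance(t, str):            # terminal leaf: unchanged
            return t
        _label, children = t
        return (SIGMA, tuple(sk(c) for c in children))

    # t = S(A(a, A(a, b), b)), a derivation tree of the example grammar above
    t = ("S", (("A", ("a", ("A", ("a", "b")), "b")),))
    print(sk(t))   # ('σ', (('σ', ('a', ('σ', ('a', 'b')), 'b')),))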

9 Structural data: notation, definitions
For G, G_U: CFGs with terminals from Σ; A: a DFTA for trees from T({σ} ∪ Σ); M ⊆ D(G); B ⊆ T({σ} ∪ Σ); and l a positive integer, we define:
- K(M) := {sk(s) | s ∈ M}; K(D(G)) is called the set of structural descriptions of G
- L(A): the language accepted by A
- B^[l] := {t ∈ B | d(t) ≤ l}, where d(t) is the depth of t
- G is a cover CFG of G_U w.r.t. l if K(D(G))^[l] = K(D(G_U))^[l]
- A is a cover DFTA of G w.r.t. l if L(A)^[l] = K(D(G))^[l]

10 Structural data: DFTAs for structural descriptions
A skeletal signature for a CFG G = (N, Σ, P, S) is {σ} ∪ Σ together with ar : {σ} ∪ Σ → P(ℕ) defined by ar(a) = 0 for all a ∈ Σ and ar(σ) = {m | (X → U_1 ... U_m) ∈ P}.
Definition (DFTA for a skeletal signature). A = (Q, {σ} ∪ Σ, Q_f, {δ_i | i ∈ {0} ∪ ar(σ)}), where Q is a set of states, Q_f ⊆ Q is the set of final states, δ_0 = id_Σ, and δ_i : (Σ ∪ Q)^i → Q for all i ∈ ar(σ).
Define recursively δ* : T({σ} ∪ Σ) → Q ∪ Σ by
- δ*(a) := a for all a ∈ Σ,
- δ*(σ(t_1, ..., t_i)) := δ_i(δ*(t_1), ..., δ*(t_i)) ∈ Q if i > 0,
and L(A) := {t | δ*(t) ∈ Q_f}.
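The recursive definition of δ* amounts to a bottom-up run of the automaton. The sketch below is again illustrative: the σ-node encoding, the dictionary representation of the transition maps δ_i, and the implicit dead state are our own choices.

    # Run a DFTA on a skeleton bottom-up, following the definition of delta*.
    from typing import Dict, Set, Tuple, Union

    Skeleton = Union[str, Tuple]     # a terminal leaf, or ("σ", (children...))
    DEAD = "⊥"                       # implicit rejecting state for missing transitions

    def delta_star(t: Skeleton, delta: Dict[Tuple, str]) -> str:
        """delta*(a) = a for a terminal; delta*(σ(t1,...,tm)) = delta_m(delta*(t1),...)."""
        if isinstance(t, str):
            return t
        _, children = t
        key = tuple(delta_star(c, delta) for c in children)
        return delta.get(key, DEAD)

    def accepts(t: Skeleton, delta: Dict[Tuple, str], final: Set[str]) -> bool:
        return delta_star(t, delta) in final

    # Toy DFTA accepting the skeletons of the grammar S -> A, A -> aAb | ab.
    delta = {("a", "b"): "qA", ("a", "qA", "b"): "qA", ("qA",): "qS"}
    t = ("σ", (("σ", ("a", ("σ", ("a", "b")), "b")),))
    print(accepts(t, delta, {"qS"}))   # True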

11 The problem
For a positive integer l and an unknown CFG G_U, learn a cover CFG G of G_U w.r.t. l by posing two kinds of questions:
- Structural membership query: t ∈ K(D(G_U))^[l]? ANSWER: yes/no
- Structural equivalence query: K(D(G))^[l] = K(D(G_U))^[l]? ANSWER: yes (a cover CFG G was learned), or no together with a counterexample s ∈ K(D(G))^[l] △ K(D(G_U))^[l]
ASSUMPTION: The learner and the teacher share the following knowledge:
- Σ: the set of terminals of G_U
- d := max{m | X → U_1 ... U_m is a production of G_U}
- l and σ

12 Motivation: significance of the problem
By learning the structural descriptions of a CFL instead of the CFL itself, we learn how to parse and interpret its expressions.
In natural language understanding, the bounded-memory restriction on human comprehension seems to limit the recursion depth of such parse trees to a constant.
⇒ the restriction to structural descriptions of derivation trees with depth ≤ a given constant is reasonable.
E.g., the LaTeX system limits the number of nestings of itemized environments to a small constant.

13-14 Preliminary remark: from DFTAs to CFGs
Every cover DFTA of G_U can be transformed easily into a cover CFG of G_U:
A = (Q, {σ} ∪ Σ, Q_f, {δ_i | i ∈ {0} ∪ ar(σ)})  ↦  G = (Q ∪ {S}, Σ, P, S)
where P := {q → r_1 ... r_m | δ_m(r_1, ..., r_m) = q} ∪ {S → r_1 ... r_m | δ_m(r_1, ..., r_m) ∈ Q_f}.
⇒ it suffices to learn a cover DFTA of K(D(G_U)) instead of a cover CFG of G_U.
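The translation can be sketched directly from the definition of P above. The dictionary encoding of the transition maps is the same illustrative one used earlier and is not the paper's notation.

    # DFTA -> CFG: every transition delta_m(r1,...,rm) = q yields q -> r1 ... rm,
    # and each transition into a final state additionally yields S -> r1 ... rm.
    from typing import Dict, List, Set, Tuple

    def dfta_to_cfg(delta: Dict[Tuple, str], final: Set[str],
                    start: str = "S") -> List[Tuple[str, Tuple]]:
        productions = []
        for rhs, q in delta.items():
            productions.append((q, rhs))           # q -> r1 ... rm
            if q in final:
                productions.append((start, rhs))   # S -> r1 ... rm
        return productions

    delta = {("a", "b"): "qA", ("a", "qA", "b"): "qA", ("qA",): "qS"}
    for lhs, rhs in dfta_to_cfg(delta, {"qS"}):
        print(lhs, "->", " ".join(rhs))
    # qA -> a b
    # qA -> a qA b
    # qS -> qA
    # S -> qA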

15 Algorithm LA^l: main result
1. Algorithm LA^l learns a minimal DCTA A for K(D(G_U)) w.r.t. l.
2. The running time of LA^l is T(LA^l) = polynomial in m and n, where n = the number of states of A and m = the maximum size of a counterexample provided by the teacher.

16 Algorithm LA^l: auxiliary notions
- S ⊆ T({σ} ∪ Σ) \ Σ is subterm-closed if s' ∈ S whenever s ∈ S and s' is a subterm of s with d(s') > 0.
- C ∈ C({σ} ∪ Σ) if C ∈ T({σ} ∪ Σ ∪ {□}) and the hole □ appears exactly once in C, at a leaf position.
- If S is subterm-closed then σ(S) is the set of contexts with hole depth 1 whose subterms below the root are from S ∪ Σ:
  σ(S) := ⋃_{m ∈ ar(σ)} ⋃_{i=1}^{m} {σ(s_1, ..., s_m)[□]_i | s_1, ..., s_m ∈ S ∪ Σ},
  where σ(s_1, ..., s_m)[□]_i is the context obtained by replacing the i-th argument with the hole □.
- E ⊆ C({σ} ∪ Σ) is □-prefix-closed if C' ∈ E whenever C = C_1[C'] with C ∈ E and C_1 ∈ σ(S).
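The contexts of σ(S), and the operation C[s] of plugging a tree into the hole of a context, can be enumerated mechanically. The hole symbol "□" and the tuple encoding below are our own illustrative choices.

    # Enumerate the depth-one-hole contexts σ(S) and plug trees into contexts.
    from itertools import product

    HOLE, SIGMA = "□", "σ"

    def sigma_contexts(S, terminals, arities):
        """All σ(s1,...,sm), m in ar(σ), with one argument replaced by the hole."""
        pool = list(S) + list(terminals)
        ctxs = []
        for m in arities:
            for args in product(pool, repeat=m - 1):
                for i in range(m):
                    ctxs.append((SIGMA, args[:i] + (HOLE,) + args[i:]))
        return ctxs

    def plug(ctx, s):
        """C[s]: replace the unique hole of C by s."""
        if ctx == HOLE:
            return s
        if isinstance(ctx, str):
            return ctx
        label, children = ctx
        return (label, tuple(plug(c, s) for c in children))

    # With S = {σ(a,b)}, Σ = {a,b} and ar(σ) = {2} (the example of slide 19):
    S = [(SIGMA, ("a", "b"))]
    print(len(sigma_contexts(S, "ab", {2})))     # 6 contexts: σ(□,s), σ(s,□)
    print(plug((SIGMA, (HOLE, "b")), S[0]))      # ('σ', (('σ', ('a', 'b')), 'b'))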

17 Algorithm LA^l: the observation table (1)
LA^l is a generalisation of the algorithms L^l and LA.
It is based on the construction of an observation table T = (S, E, T, l) for K(D(G_U)) ⊆ T({σ} ∪ Σ), where
- S is a subterm-closed set of trees from T({σ} ∪ Σ)^[l] \ Σ,
- E is a □-prefix-closed set of contexts from C({σ} ∪ Σ) with depth ≤ l and hole depth ≤ l - 1,
- rows are labeled with elements from S ∪ X(S)^[l], where X(S) := {C_1[s] | C_1 ∈ σ(S), s ∈ S ∪ Σ} \ S,
- columns are labeled with elements from E.
The algorithm builds a DFTA A = (Q, {σ} ∪ Σ, Q_f, δ) consistent with T, i.e., for all C ∈ E and s ∈ S ∪ X(S) such that d(C[s]) ≤ l: δ*(C[s]) ∈ Q_f iff T(C[s]) = 1.

18 Algorithm LA^l: the observation table (2)
The entry at position (s, C) ∈ (S ∪ X(S)^[l]) × E of the table is T(C[s]) ∈ {1, 0, -1}, where C[s] is the term produced by replacing □ with s in C, and
T(t) := 1 if t ∈ K(D(G_U))^[l]; 0 if t ∈ T({σ} ∪ Σ)^[l] \ K(D(G_U)); -1 if t ∉ T({σ} ∪ Σ)^[l].
Initial observation table: E = {□}, S = ∅. X(S)^[l] has 1 + |Σ| + ... + |Σ|^d = (|Σ|^(d+1) - 1)/(|Σ| - 1) entries, where d = the maximum rank of σ.
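The three-valued labelling can be sketched as follows, reusing the encodings introduced earlier; member_query stands in for a structural membership query to the teacher, and the tiny example reproduces the first row of the table on the next slide.

    # Entries of the observation table: T(C[s]) in {1, 0, -1} as defined above.
    HOLE, SIGMA = "□", "σ"

    def plug(ctx, s):
        if ctx == HOLE:
            return s
        if isinstance(ctx, str):
            return ctx
        label, children = ctx
        return (label, tuple(plug(c, s) for c in children))

    def depth(t):
        return 0 if isinstance(t, str) else 1 + max(depth(c) for c in t[1])

    def entry(C, s, member_query, l):
        t = plug(C, s)
        if depth(t) > l:          # deeper than the cover bound: no information
            return -1
        return 1 if member_query(t) else 0

    # K(D(G_U))^[2] for S -> Ab, A -> Ab | ab contains only σ(σ(a,b),b).
    target = {(SIGMA, ((SIGMA, ("a", "b")), "b"))}
    row, col = (SIGMA, ("a", "b")), (SIGMA, (HOLE, "b"))
    print(entry(HOLE, row, lambda t: t in target, 2))   # 0
    print(entry(col, row, lambda t: t in target, 2))    # 1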

19 Algorithm LA^l: observation tables, example
For G_U := ({S, A}, {a, b}, {S → Ab, A → Ab, A → ab}, S), l = 2, S = {σ(a, b)}, E = {□, σ(□, b), σ(a, σ(□, b))}. Then σ(S) = {σ(□, s), σ(s, □) | s ∈ {a, b, σ(a, b)}}. The observation table is

                         □    σ(□, b)
  σ(a, b)                0     1
  σ(a, a)                0     0
  σ(b, a)                0     0
  σ(σ(a, b), a)          0    -1
  σ(b, b)                0     0
  σ(σ(a, b), b)          1    -1
  σ(a, σ(a, b))          0    -1
  σ(b, σ(a, b))          0    -1
  σ(σ(a, b), σ(a, b))    0    -1

20-21 Algorithm LA^l: the automaton construction, main ideas (1)
We wish to identify the states of a minimal DCTA A for K(D(G_U))^[l] with the rows of certain trees s ∈ S: even if the row of s_1 differs from the row of s_2, they may represent the same state of A, provided T(C[s_1]) = T(C[s_2]) for every C ∈ E such that C[s_1], C[s_2] ∈ T({σ} ∪ Σ)^[l].
Auxiliary notions:
- k-similarity, for 1 ≤ k ≤ l: s ∼_k t iff T(C[s]) = T(C[t]) for all C ∈ E_{k - max{d(s), d(t)}}
- similarity: ∼ := ∼_l
- a total order on T({σ} ∪ Σ) induced by a total order on Σ (Defn. 6)
- the representative of x ∈ S ∪ X(S): r(x) := the least (w.r.t. this total order) s ∈ S such that x ∼ s

22 Algorithm LA^l: the automaton construction, main ideas (2)
Observation table T = (S, E, T, l)  ↦  DFTA A(T) := (Q, {σ} ∪ Σ, Q_f, δ), where
- Q := {r(s) | s ∈ S}
- Q_f := {q ∈ Q | T(q) = 1}
- δ(q_1, ..., q_m) := r(σ(q_1, ..., q_m)) for all m ∈ ar(σ)
REMARK: If T is closed and consistent then A(T) is well-defined.
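A much-simplified sketch of this construction is shown below: rows are merged by plain equality of their table rows, and string order stands in for the total order of Defn. 6, so the depth-aware similarity ∼ of the paper is deliberately glossed over. It only illustrates how Q, Q_f and δ are read off a closed and consistent table (represented as a dict mapping (row label, column) to an entry).

    # Simplified construction of A(T) from a filled observation table.
    # Plain row equality replaces the similarity ~ of the paper (sketch only).
    from itertools import product

    def build_automaton(S, columns, table, arities, terminals):
        def row(x):
            return tuple(table.get((x, C)) for C in columns)

        reps = {}                            # row signature -> representative r(...)
        for s in sorted(S, key=str):         # str order stands in for the total order
            reps.setdefault(row(s), s)

        def r(x):
            return reps.get(row(x), x)       # closedness guarantees a representative

        Q = set(reps.values())
        hole = columns[0]                    # assume the first column is the hole context
        Q_final = {q for q in Q if table[(q, hole)] == 1}
        delta = {}
        for m in arities:                    # delta(q1,...,qm) := r(σ(q1,...,qm))
            for args in product(sorted(Q, key=str) + sorted(terminals), repeat=m):
                t = ("σ", args)
                if (t, hole) in table:       # only where the table has a row for σ(args)
                    delta[args] = r(t)
        return Q, Q_final, delta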

23-24 Algorithm LA^l: relevant properties of observation tables
T = (S, E, T, l) is
- consistent if for all k ∈ {1, ..., l}, s_1, s_2 ∈ S, and C_1 ∈ σ(S): if s_1 ∼_k s_2 then C_1[s_1] ∼_k C_1[s_2];
- closed if for every x ∈ X(S) there is an s ∈ S with d(s) ≤ d(x) and x ∼ s.
Fact (Corr. 2). If T is a closed and consistent observation table of K(D(G_U)), n is the number of states of A(T), N is the number of states of a minimal DFCA of K(D(G_U)), and n ≥ N, then n = N and A(T) is a minimal DFCA of K(D(G_U)).
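The two checks can likewise be sketched with the same simplification (plain row equality instead of k-similarity, and the depth side-conditions dropped), just to show what the closedness and consistency checks inspect.

    # Simplified closedness / consistency checks on a filled observation table.
    def is_closed(S, X_rows, columns, table):
        """Every row of X(S) must equal the row of some element of S
        (the side-condition d(s) <= d(x) is omitted in this sketch)."""
        rows_of_S = {tuple(table.get((s, C)) for C in columns) for s in S}
        return all(tuple(table.get((x, C)) for C in columns) in rows_of_S
                   for x in X_rows)

    def is_consistent(S, columns, table, contexts, plug):
        """If two rows of S agree, wrapping both in any context C1 from σ(S)
        must keep them agreeing (the k-indexing is omitted in this sketch)."""
        def row(t):
            return tuple(table.get((t, C)) for C in columns)

        for s1 in S:
            for s2 in S:
                if s1 != s2 and row(s1) == row(s2):
                    for C1 in contexts:
                        if row(plug(C1, s1)) != row(plug(C1, s2)):
                            return False
        return True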

25 Algorithm LA^l: relevant properties of observation tables, example
The table below is consistent but not closed:

                         □    σ(□, b)
  σ(a, b)                0     1
  σ(a, a)                0     0
  σ(b, a)                0     0
  σ(σ(a, b), a)          0    -1
  σ(b, b)                0     0
  σ(σ(a, b), b)          1    -1
  σ(a, σ(a, b))          0    -1
  σ(b, σ(a, b))          0    -1
  σ(σ(a, b), σ(a, b))    0    -1

26-28 Algorithm LA^l: learning strategy
Create the initial observation table T(0) for S = ∅ and E = {□}.
Repeat building sound and consistent observation tables of K(D(G_U)) with more and more states. Tables are produced incrementally, by adding more and more rows and columns (the control flow is sketched after this list):
1. If T(t) is not consistent, T(t+1) extends T(t) with a new column.
2. If T(t) is not closed, T(t+1) extends T(t) with a new row.
3. Otherwise, ask the structural equivalence query K(D(G_U))^[l] = L(A(T(t)))^[l]?
   - If yes, stop with the learned answer A(T(t)).
   - Otherwise, use the counterexample s ∈ (K(D(G_U)) △ L(A(T(t))))^[l] to extend T(t) with all missing rows of subterms of s, including s itself.
By Corr. 2, a minimal DFCA of K(D(G_U)) will eventually be constructed.
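A sketch of that control flow, with every table operation abstracted away (the real definitions involve the k-similarity machinery and the bounds of Theorems 3-6); all names are illustrative.

    # Control-flow sketch of the learning loop; all table operations are injected.
    def learn(structural_equiv, is_consistent, is_closed,
              add_column, add_row, add_counterexample, build_automaton, table):
        while True:
            if not is_consistent(table):
                add_column(table)                  # failed consistency check: new context in E
            elif not is_closed(table):
                add_row(table)                     # failed closedness check: promote a row into S
            else:
                automaton = build_automaton(table)
                cex = structural_equiv(automaton)  # None means the cover was learned
                if cex is None:
                    return automaton
                add_counterexample(table, cex)     # add cex and all of its subterms to S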

29 Complexity analysis: failed checks
T(t) = (S_t, E_t, T, l) is extended when:
- Failed closedness check: find C_1 ∈ σ(S_t) and s ∈ S_t such that C_1[s] ≁ t for all t ∈ S_t with d(t) ≤ d(C_1[s]).
- Failed consistency check: find C ∈ E_t with hole depth i, s_1, s_2 ∈ S_t with d(s_1), d(s_2) ≤ l - i - 1, and C_1 ∈ σ(S_t) such that C[C_1[s_1]], C[C_1[s_2]] ∈ T({σ} ∪ Σ)^[l], s_1 ∼_k s_2 where k = max{d(s_1), d(s_2)} + i + 1, and T(C[C_1[s_1]]) ≠ T(C[C_1[s_2]]).
- Failed structural equivalence query: T(t) is closed and consistent, but we know a counterexample t ∈ K(D(G_U))^[l] △ L(A(T(t)))^[l].

30 Complexity analysis
If n := the number of states of a minimal DFCA of K(D(G_U)) w.r.t. l, then
1. the total number of failed closedness checks is at most n(n + 1)/2 [Theorem 3],
2. the total number of failed consistency checks is at most n(n - 1)/2 [Theorem 5],
3. the total number of failed structural equivalence queries is at most n [Theorem 6].

31 Complexity analysis
Parameters:
- n := the number of states of a minimal DFCA of K(D(G_U)) w.r.t. l
- m := the maximum size of a counterexample returned by a failed structural equivalence query
- p := |Σ|
- d := the maximum rank of the skeleton symbol σ
R1. Total space occupied by the observation table at any time: O(n^2 (mn + n^2 + p)^d (m + 2n + 1) d^(m + 2n + 1)).
R2. Consistency check: O(n^5 d (mn + n^2 + p)^(d+2)).
R3. Closedness check: O((mn + n^2)^2 d (mn + n^2 + p)^d n^2) = O(n^2 d (mn + n^2 + p)^(d+2)).

32 Complexity analysis
R4. There are at most n(n + 1)/2 + n·m elements added to S by failed closedness checks and failed structural equivalence checks.
R5. Extending the table with a new element of S_{t+1} requires O(d (mn + n^2 + p)^d) membership queries ⇒ the total time spent to extend the S-component of the observation table is O(n^2 d (2^d d) (mn + n^2) (mn + n^2 + p)^d).
R6. Only failed consistency checks extend the E-component of the observation table, and there are at most n(n - 1)/2 failed consistency checks. The total time to insert a context in E is O(d (mn + n^2 + p)^d) ⇒ the total time to insert all contexts in E is O(n^2 d (mn + n^2 + p)^d).
