Computational Models - Lecture 4


Computational Models - Lecture 4 (Handout Mode)
Iftach Haitner, Tel Aviv University
November 21, 2016
Based on frames by Benny Chor, Tel Aviv University, modifying frames by Maurice Herlihy, Brown University.
Iftach Haitner (TAU) Computational Models, Lecture 4, November 21, 2016, 1 / 32

Talk Outline
- Context-free grammars/languages (CFG/CFL) (begin)
- Algorithmic issues for CFLs
Sipser's book, Section 2.1

Short overview
So far we saw finite automata, regular languages, regular expressions, the Myhill-Nerode theorem, and the pumping lemma for regular languages. We now introduce stronger machines and languages with more expressive power: pushdown automata, context-free languages, context-free grammars, and a pumping lemma for context-free languages.

Context-Free Grammars (CFG)
An example of a context-free grammar, G1:
A → 0A1
A → B
B → #
Terminology: Each line is a substitution rule (or production). Each rule has the form symbol → string. The left-hand symbol is a variable (usually upper-case). A string consists of variables and terminals. One variable is the start variable (the left-hand side of the top rule); in this case, it is A.

Rules for generating strings
1. Write down the start variable.
2. Pick a variable written down in the current string, and a rule whose left-hand side is that variable.
3. Replace that variable with the right-hand side of that rule.
4. Repeat until no variables remain.
5. Return the final string (a concatenation of terminals).
The process is inherently nondeterministic.
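These steps translate directly into a short Python sketch (our own illustration, not part of the lecture; `derive` is a hypothetical helper name):

```python
import random

# Grammar G1 from the slides: A -> 0A1 | B,  B -> #
# A mapping from each variable to the list of its right-hand sides.
RULES = {"A": ["0A1", "B"], "B": ["#"]}
START = "A"

def derive(rules, start):
    """Generate one string: write down the start variable, then keep
    replacing some variable occurrence by the right-hand side of one
    of its rules, until no variables remain."""
    current = start
    while any(sym in rules for sym in current):
        positions = [i for i, sym in enumerate(current) if sym in rules]
        i = random.choice(positions)            # pick a variable occurrence
        rhs = random.choice(rules[current[i]])  # pick one of its rules
        current = current[:i] + rhs + current[i + 1:]
    return current

w = derive(RULES, START)              # e.g. "00#11"
n = w.index("#")
assert w == "0" * n + "#" + "1" * n   # always of the form 0^n # 1^n
```

Both choices in the loop are the nondeterminism the slide mentions; here they are resolved by coin flips.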

Example
Grammar G1: A → 0A1, A → B, B → #.
A derivation with G1: A ⇒ 0A1 ⇒ 00A11 ⇒ 000A111 ⇒ 000B111 ⇒ 000#111
Question 1: What strings can be generated in this way from the grammar G1?
Answer: Exactly those of the form 0^n#1^n (n ≥ 0).

Context-Free Languages (CFL)
The set of strings generated in this way is called the language of the grammar. For example, L(G1) = {0^n#1^n : n ≥ 0}. Any language generated by a context-free grammar is called a context-free language.

A useful abbreviation
Rules with the same variable on the left-hand side, such as A → 0A1 and A → B, are written as: A → 0A1 | B

English-like sentences
A grammar G2 describing a few English sentences:
<SENTENCE> → <NP> <VERB>
<NP> → <ARTICLE> <NOUN>
<NOUN> → boy | girl | flower
<ARTICLE> → a | the
<VERB> → touches | likes | sees
A specific derivation in G2: <SENTENCE> ⇒ <NP> <VERB> ⇒ <ARTICLE> <NOUN> <VERB> ⇒ a <NOUN> <VERB> ⇒ a boy <VERB> ⇒ a boy sees
More strings generated by G2: "a flower sees", "the girl touches".

English-like sentences, cont.
<SENTENCE> ⇒ <NP> <VERB> ⇒ <ARTICLE> <NOUN> <VERB> ⇒ a <NOUN> <VERB> ⇒ a boy <VERB> ⇒ a boy sees

Formal definition
A context-free grammar is a 4-tuple (V, Σ, R, S), where
- V is a finite set of variables,
- Σ is a finite set of terminals (V ∩ Σ = ∅),
- R is a finite set of rules of the form A → x, where A ∈ V and x ∈ (V ∪ Σ)*,
- S ∈ V is the start variable.
Let u, v ∈ (V ∪ Σ)*. If A → w ∈ R, then uAv yields uwv, denoted uAv ⇒ uwv. u ⇒* v if u = v, or u ⇒ u1 ⇒ ... ⇒ uk ⇒ v for some sequence u1, u2, ..., uk. Note that if A ⇒* xBy and B ⇒* z, then A ⇒* xzy.
Definition 2: The language of the grammar G, denoted L(G), is {w ∈ Σ* : S ⇒* w}, where ⇒* is determined by G.

Example 1
G3 = ({S}, {a, b}, R, S). R (rules): S → aSb | SS | ε
Some words in the language: aabb, aababb.
Question 3: What is this language? Hint: think of parentheses, i.e., a as "(" and b as ")": (()), (()()). Using a larger alphabet (i.e., more terminals), e.g. ([]()), such grammars represent well-formed programs with many kinds of nested loops and if/then/else statements.
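Reading a as "(" and b as ")", membership in this language can be checked with a single counter; a small sketch of ours (assuming, as the hint suggests, that the language is exactly the balanced strings):

```python
def balanced(s):
    """True iff s is balanced when a = "(" and b = ")": the running
    count of a's never drops below the count of b's, and the two
    counts are equal at the end."""
    depth = 0
    for c in s:
        depth += 1 if c == "a" else -1
        if depth < 0:          # a ")" appears before its matching "("
            return False
    return depth == 0

assert balanced("aabb") and balanced("aababb")  # words from the slide
assert not balanced("abba")                     # maps to "())(", unbalanced
```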

Example 2
G4 = ({S}, {a, b}, R, S). R (rules): S → aSa | bSb | ε
Some words in the language: abba, aabaabaa.
Question 4: What is this language? L(G4) = {ww^R : w ∈ {a, b}*} — almost, but not quite, the set of palindromes (it is the even-length palindromes).

Proving L(G4) = L = {ww^R : w ∈ {a, b}*}
What do we need to show?
1. If z = ww^R, then z is generated by G4.
2. If z is generated by G4, then z = ww^R.
Proving ww^R ∈ L(G4), by induction on the length of z.
Base: |z| = 0. Since S ⇒ ε = z, z ∈ L(G4).
Inductive step: Let z = ww^R be a word of length 2k. Write w = σw' (hence |w'| = k − 1), so z = σw'(w')^R σ. By the induction hypothesis, S ⇒* w'(w')^R. Hence, S ⇒ σSσ ⇒* σw'(w')^R σ = z.

Proving z ∈ L(G4) ⇒ z ∈ {ww^R : w ∈ {a, b}*}
Proof by induction on the number of derivation steps used to derive z from S.
Single derivation step: the only possible derivation is S ⇒ ε, and thus z = ε ∈ L.
Inductive step: Assume S ⇒* z in k derivation steps. The first step is S ⇒ σSσ for some σ ∈ {a, b}. Hence z = σz'σ, where z' is derived from S in k − 1 steps. By the induction hypothesis, z' = w'(w')^R ∈ L. Hence, z = σw'(w')^R σ = ww^R for w = σw', and z ∈ L.

Example 3
G5 = ({S, A, B}, {a, b}, R, S). R (rules):
S → aB | bA | ε
A → a | aS | bAA
B → b | bS | aBB
Some words in the language: aababb, baabba.
Question 5: What is this language? L(G5) = {w ∈ {a, b}* : #a(w) = #b(w)}, where #x(w) denotes the number of occurrences of x in w.
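The claimed equality L(G5) = {w : #a(w) = #b(w)} can be checked empirically for short words by exhaustive search over sentential forms (our own sketch; `generates` is a hypothetical helper, and termination relies on no rule of G5 ever removing a terminal):

```python
from itertools import product

# G5 from the slide: S -> aB | bA | ε ; A -> a | aS | bAA ; B -> b | bS | aBB
RULES = {"S": ["aB", "bA", ""], "A": ["a", "aS", "bAA"], "B": ["b", "bS", "aBB"]}

def generates(max_len):
    """All terminal strings of length <= max_len derivable from S.
    The search expands only the leftmost variable (leftmost derivations
    suffice) and prunes a sentential form once its terminals alone
    exceed max_len; no rule removes a terminal, so the bound is sound."""
    seen, frontier, words = {"S"}, ["S"], set()
    while frontier:
        form = frontier.pop()
        vs = [i for i, c in enumerate(form) if c in RULES]
        if not vs:                       # no variables left: a word
            if len(form) <= max_len:
                words.add(form)
            continue
        i = vs[0]                        # expand the leftmost variable
        for rhs in RULES[form[i]]:
            new = form[:i] + rhs + form[i + 1:]
            if sum(c not in RULES for c in new) <= max_len and new not in seen:
                seen.add(new)
                frontier.append(new)
    return words

# Brute force: all words over {a, b} of length <= 4 with #a = #b.
equal_counts = {"".join(p) for n in range(5) for p in product("ab", repeat=n)
                if p.count("a") == p.count("b")}
assert generates(4) == equal_counts
```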

Proving L(G5) = L = {w ∈ {a, b}* : #a(w) = #b(w)}
What do we need to show?
1. w ∈ L(G5) ⇒ w ∈ L.
2. w ∈ L ⇒ w ∈ L(G5).
Claim 6: If S ⇒* w, then #a(w) + #A(w) = #b(w) + #B(w). Proof: DIY.
Claim 7: For w ∈ {a, b}*, let k = k(w) = #a(w) − #b(w). Then:
1. If k = 0, then S ⇒* wS.
2. If k > 0, then S ⇒* wB^k.
3. If k < 0, then S ⇒* wA^(−k).
Are we done? How do we prove the claim?

Proving Claim 7
For w ∈ {a, b}*, let k = k(w) = #a(w) − #b(w). Then: (1) if k = 0, then S ⇒* wS; (2) if k > 0, then S ⇒* wB^k; (3) if k < 0, then S ⇒* wA^(−k).
Proof by induction on |w|.
Basis: w = ε, so k(w) = 0. Since S ⇒* S, we have S ⇒* εS.
Induction step: For w ∈ {a, b}^n, write w = w'σ with |w'| = n − 1. Assume for concreteness (the other cases are proved analogously):
1. σ = a;
2. k' = k(w') = #a(w') − #b(w') = k − 1 > 0.
By the induction hypothesis, S ⇒* w'B^k'. Hence, S ⇒* w'B^k' = w'B·B^(k'−1) ⇒ w'aBB·B^(k'−1) = wB^(k'+1) = wB^k.

Section 1: Parse Trees

Parse trees
Definition 8 (parse tree): A labeled tree T is a parse tree of CFG G = (V, Σ, R, S) if:
- inner nodes are labeled by elements of V;
- leaves are labeled by elements of V ∪ Σ ∪ {ε};
- if n1, ..., nk are the labels of the direct descendants (from left to right) of a node labeled A, then (A → n1 ... nk) ∈ R.
The yield of T is the string of labels of all its leaves, written from left to right.
Theorem 9: Let G = (V, Σ, R, S) be a CFG, let A ∈ V, and let x ∈ (Σ ∪ V)*. Then A ⇒* x iff G has a parse tree with root labeled A that yields x. Proof? DIY.
It might be that several derivation sequences (i.e., S ⇒ ... ⇒ w) show that w ∈ L(G), yet only a single tree rooted at S yields w (i.e., trees are indifferent to derivation order).
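The yield is easy to compute once a tree representation is fixed; a sketch using nested tuples (our own encoding, not the lecture's):

```python
# Inner node: (variable, child, child, ...); leaf: a plain string.
# A parse tree of G1 (A -> 0A1 | B, B -> #) whose yield is 000#111.
tree = ("A", "0", ("A", "0", ("A", "0", ("A", ("B", "#")), "1"), "1"), "1")

def tree_yield(t):
    """Concatenate the leaf labels from left to right; an ε-leaf
    contributes the empty string."""
    if isinstance(t, str):
        return "" if t == "ε" else t
    _, *children = t
    return "".join(tree_yield(c) for c in children)

assert tree_yield(tree) == "000#111"
```

Note that this one tree corresponds to every derivation order of the same rule applications, matching the remark above.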

Example 1
A → 0A1 | B; B → #
The (unique) derivation sequence from A to 000#111: A ⇒ 0A1 ⇒ 00A11 ⇒ 000A111 ⇒ 000B111 ⇒ 000#111

Example 2
<SENTENCE> → <NP> <VERB>; <NP> → <ARTICLE> <NOUN>; <NOUN> → boy | girl | flower; <ARTICLE> → a | the; <VERB> → touches | likes | sees
[Parse tree: SENTENCE at the root, with children NOUN-PHRASE and VERB; NOUN-PHRASE has children ARTICLE and NOUN; the leaves read "a boy sees".]
Two derivation sequences from <SENTENCE> to "a boy sees":
<SENTENCE> ⇒ <NP> <VERB> ⇒ <ARTICLE> <NOUN> <VERB> ⇒ a <NOUN> <VERB> ⇒ a boy <VERB> ⇒ a boy sees
<SENTENCE> ⇒ <NP> <VERB> ⇒ <ARTICLE> <NOUN> <VERB> ⇒ <ARTICLE> <NOUN> sees ⇒ <ARTICLE> boy sees ⇒ a boy sees

Section 2: Designing Context-Free Grammars

Designing CFGs
There is no general recipe, but a few rules of thumb:
- If the CFG is the union of several CFGs, rename variables (not terminals) so they are disjoint, and add the new rule S → S1 | S2 | ... | Si.
- For languages with linked substrings (like {0^n#1^n : n ≥ 0}), a rule of the form R → uRv is helpful to force the desired relation between the substrings.
- For a regular language, the grammar can follow a DFA for the language (see the next frame).

CFG for regular languages
Given a DFA M = (Q, Σ, δ, q0, F), construct a CFG G for L(M):
1. Let R0 be the start variable.
2. Add the rule Ri → aRj for any qi, qj ∈ Q and a ∈ Σ with δ(qi, a) = qj.
3. Add the rule Ri → ε for any qi ∈ F.
Claim 10: L(G) = L(M). Proof?
Claim 11: R0 ⇒* wRj iff δ_M(q0, w) = qj. Proof?
Next class we will see an alternative proof via pushdown automata.
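The construction is mechanical; a sketch for one concrete DFA (the even-number-of-1s automaton, our choice, with states q0 and q1 encoded as 0 and 1):

```python
# DFA over {0, 1} accepting strings with an even number of 1s:
# state 0 = "even so far" (accepting), state 1 = "odd so far".
delta = {(0, "0"): 0, (0, "1"): 1, (1, "0"): 1, (1, "1"): 0}
accepting = {0}

rules = {}
# Step 2: R_i -> a R_j whenever delta(q_i, a) = q_j.
for (q, a), p in delta.items():
    rules.setdefault(f"R{q}", []).append(f"{a}R{p}")
# Step 3: R_i -> ε (here the empty string) for accepting states.
for q in accepting:
    rules.setdefault(f"R{q}", []).append("")

# The result is a right-linear grammar with start variable R0.
assert sorted(rules["R0"]) == ["", "0R0", "1R1"]
assert sorted(rules["R1"]) == ["0R1", "1R0"]
```

Every rule has the shape Ri → aRj or Ri → ε, mirroring one DFA transition per rule, which is exactly why Claim 11 holds.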

Section 3: Ambiguity in Context-Free Languages

Ambiguity in CFLs
Grammar G: E → E + E | E × E | (E) | a
[Two distinct parse trees for a + a × a: one groups it as (a + a) × a, the other as a + (a × a).]
Is this a problem?

Ambiguity in CFLs, cont.
Consider the grammar G' = (V, Σ, R, E), where V = {E, T, F}, Σ = {a, +, ×, (, )}, and the rules are:
E → E + T | T
T → T × F | F
F → (E) | a
Claim 12: L(G') = L(G). Proof: induction on derivation length.
But G' is not ambiguous. Proof: see Hopcroft, for a similar grammar.
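One way to see that G' pins down a single reading is a recursive-descent evaluator (a sketch of ours, not from the lecture; × is written as *, the left recursion is turned into iteration, and every a is read as the number 2 so that precedence shows up in the value):

```python
def parse(s):
    """Evaluate s according to the unambiguous grammar
    E -> E + T | T ;  T -> T * F | F ;  F -> (E) | a."""
    pos = 0
    def peek():
        return s[pos] if pos < len(s) else None
    def expect(c):
        nonlocal pos
        assert peek() == c, f"expected {c!r} at position {pos}"
        pos += 1
    def E():
        v = T()
        while peek() == "+":
            expect("+")
            v += T()
        return v
    def T():
        v = F()
        while peek() == "*":
            expect("*")
            v *= F()
        return v
    def F():
        if peek() == "(":
            expect("(")
            v = E()
            expect(")")
            return v
        expect("a")
        return 2              # every a stands for the number 2
    v = E()
    assert pos == len(s), "trailing input"
    return v

assert parse("a+a*a") == 6    # grouped as a + (a * a): * binds tighter
assert parse("(a+a)*a") == 8  # parentheses force the other grouping
```

Because T sits below E and F below T, the grammar itself encodes that × binds tighter than +, so each string gets exactly one parse tree.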

Parse tree of G' for a + a × a
G': E → E + T | T; T → T × F | F; F → (E) | a
[Parse tree: E splits as E + T; the left E derives a via T and F, and the right T splits as T × F, with both T and F deriving a.]

Parse tree of G' for (a + a) × a
G': E → E + T | T; T → T × F | F; F → (E) | a
[Parse tree: E derives T, which splits as T × F; the left T derives (E) via F, where the inner E derives a + a; the right F derives a.]

Ambiguity
Definition 13: A string w is derived ambiguously from grammar G if w has two or more different parse trees that generate it from G. A CFG is ambiguous if it ambiguously derives some string.
Ambiguity is usually not only a syntactic notion but also a semantic one, implying multiple meanings for the same string. Think of a + a × a in the last grammar.
It is sometimes possible to eliminate ambiguity by finding a different context-free grammar generating the same language. This is true for the arithmetic expressions grammar. But some languages are inherently ambiguous. Example: L = {a^n b^n c^m d^m : n, m ≥ 1} ∪ {a^n b^m c^m d^n : n, m ≥ 1}. L is a CFL. Proof? L is inherently ambiguous. Proof: see Hopcroft.

Checking membership in CFLs
Challenge: given a CFG G and a string w, decide whether w ∈ L(G).
Initial idea: design an algorithm that tries all derivations. Problem: if G does not generate w, we will never stop. Possible solution: use special grammars that are just as expressive, but better for checking membership.
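To preview why special grammar shapes help: if no rule ever shrinks a sentential form (no ε-rules), trying all derivations does terminate, because any form longer than w can be discarded. A sketch of ours (shown on G1 with the unit rule A → B collapsed into A → 0A1 | #):

```python
def member(rules, start, w):
    """Decide w ∈ L(G) by exhaustive search, assuming every right-hand
    side has length >= 1, so sentential forms never shrink and every
    form longer than w can safely be pruned."""
    seen, frontier = {start}, [start]
    while frontier:
        form = frontier.pop()
        if form == w:
            return True
        vs = [i for i, c in enumerate(form) if c in rules]
        if not vs:
            continue              # a terminal word, but not w
        i = vs[0]                 # leftmost derivations suffice
        for rhs in rules[form[i]]:
            new = form[:i] + rhs + form[i + 1:]
            if len(new) <= len(w) and new not in seen:
                seen.add(new)
                frontier.append(new)
    return False

rules = {"A": ["0A1", "#"]}       # G1 without the unit rule
assert member(rules, "A", "00#11")
assert not member(rules, "A", "0#11")
```

Only finitely many forms of length at most |w| exist, so the search always halts; grammars in such restricted shapes are exactly the "special grammars" the slide alludes to.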