From EBNF to PEG. Roman R. Redziejowski. Concurrency, Specification and Programming Berlin 2012


EBNF: Extended Backus-Naur Form

A way to define grammar.

Literal = Decimal | Binary
Decimal = [0-9]+ "." [0-9]+
Binary  = [01]+ "B"

Recursive-descent parsing

A parsing procedure for each rule and each terminal.

Literal = Decimal | Binary
Decimal = [0-9]+ "." [0-9]+
Binary  = [01]+ "B"

Literal calls Decimal or Binary. Decimal calls [0-9] repeatedly, then ".", then [0-9] repeatedly. Binary calls [01] repeatedly, then "B".

Problem: Decimal and Binary may both start with any number of 0s and 1s. Literal cannot choose which procedure to call by looking any fixed distance ahead.

Solution: Backtracking

Literal = Decimal | Binary
Decimal = [0-9]+ "." [0-9]+
Binary  = [01]+ "B"

Input: 101B

Literal calls Decimal.
Decimal calls [0-9]: advances 3 times (past 101).
Decimal calls ".": fails at B. Literal backtracks to the start of the input.
Literal calls Binary.
Binary calls [01]: advances 3 times (past 101).
Binary calls "B": advances and returns success.
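
A minimal Python sketch of such a backtracking parser (the function names and encoding are illustrative, not from the talk): each rule becomes a procedure that returns the new input position or None, and Literal restarts from its saved position when Decimal fails.

    def parse_decimal(s, i):
        # Decimal = [0-9]+ "." [0-9]+  -- returns the new position or None
        j = i
        while j < len(s) and s[j].isdigit():
            j += 1
        if j == i or j >= len(s) or s[j] != '.':
            return None                   # no digits, or no "." where expected
        j += 1
        k = j
        while j < len(s) and s[j].isdigit():
            j += 1
        return j if j > k else None       # at least one fractional digit

    def parse_binary(s, i):
        # Binary = [01]+ "B"
        j = i
        while j < len(s) and s[j] in '01':
            j += 1
        if j == i or j >= len(s) or s[j] != 'B':
            return None
        return j + 1

    def parse_literal(s, i):
        # Literal = Decimal | Binary, with backtracking:
        # if Decimal fails, try Binary again from the saved position i.
        r = parse_decimal(s, i)
        return r if r is not None else parse_binary(s, i)

    # On "101B": Decimal advances over 101, fails at "B" (expected "."),
    # Literal backtracks, Binary advances over 101, reads "B", succeeds.
    assert parse_literal("101B", 0) == 4
    assert parse_literal("3.14", 0) == 4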

Limited backtracking

Backtracking solves the problem, but may take exponential time.

Solution: limited backtracking. Never go back after one alternative has succeeded.

1961 Brooker & Morris - Atlas Compiler Compiler
1965 McClure - TransMoGrifier (TMG)
1972 Aho & Ullman - Top-Down Parsing Language (TDPL)
...
2004 Ford - Parsing Expression Grammar (PEG)

It can work in linear time.

PEG - Parsing Expression Grammar

Looks exactly like EBNF:

Literal = Decimal / Binary
Decimal = [0-9]+ "." [0-9]+
Binary  = [01]+ "B"

It is a specification of a recursive-descent parser with limited backtracking, where "/" means an ordered, no-return choice.

PEG is not EBNF

                           EBNF:            PEG:
A = ("a" / "aa") "b"       {ab, aab}        {ab}
A = ("aa" / "a") "ab"      {aaab, aab}      {aaab}
A = ("a" / "b"?) "a"       {aa, ba, a}      {aa, ba}

Backtracking may examine the input far ahead, so the result may depend on the context in front.

A = "a" A "a" / "aa"       EBNF: a^(2n)     PEG: a^(2^n)
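
As an illustration of the first row, here is a tiny sketch of PEG-style parsing in Python (lit, seq, choice are hypothetical helpers, not from the talk). The ordered choice commits to "a" once it matches, so "aab" is rejected even though the EBNF reading derives it.

    def lit(t):
        # Match the literal string t; return the new position or None.
        return lambda s, i: i + len(t) if s.startswith(t, i) else None

    def seq(p, q):
        # Match p, then q on the rest.
        def parse(s, i):
            j = p(s, i)
            return None if j is None else q(s, j)
        return parse

    def choice(p, q):
        # Ordered choice p / q: once p has succeeded, q is never tried.
        def parse(s, i):
            j = p(s, i)
            return j if j is not None else q(s, i)
        return parse

    A = seq(choice(lit("a"), lit("aa")), lit("b"))   # A = ("a" / "aa") "b"

    def accepts(p, s):
        return p(s, 0) == len(s)          # the whole input must be consumed

    print(accepts(A, "ab"))    # True
    print(accepts(A, "aab"))   # False, although EBNF ("a" | "aa") "b" derives aab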

Sometimes PEG is EBNF

In this case PEG = EBNF:

Literal = Decimal / Binary
Decimal = [0-9]+ "." [0-9]+
Binary  = [01]+ "B"

When does it happen?

When PEG = EBNF?

Sérgio Queiroz de Medeiros, Correspondência entre PEGs e Classes de Gramáticas Livres de Contexto. Ph.D. Thesis, Pontifícia Universidade Católica do Rio de Janeiro (2010).

If the EBNF grammar has the LL(1) property, then PEG = EBNF.

When PEG = EBNF?

But this is not LL(1):

Literal = Decimal / Binary
Decimal = [0-9]+ "." [0-9]+
Binary  = [01]+ "B"

Which means PEG = EBNF for a wider class. Let us find out more about it.

Simple grammar

Alphabet Σ (the "terminals"). Set N of names (the "nonterminals"). For each A ∈ N one rule of the form:

A = e1 e2   (Sequence)    or    A = e1 | e2   (Choice)

where e1, e2 ∈ N ∪ Σ ∪ {ε}. Start symbol S ∈ N.

"Syntax expressions": E = N ∪ Σ ∪ {ε}.

We will consider two interpretations: EBNF and PEG.

EBNF interpretation

L(e) = language of expression e ∈ E.

L(ε) = {ε}
L(a) = {a} for a ∈ Σ
L(A) = L(e1) L(e2)   for A = e1 e2
L(A) = L(e1) ∪ L(e2) for A = e1 | e2

Language defined by the grammar: L(S).

"Natural semantics" (after Medeiros) Relation BNF E Σ Σ, written [e] x BNF y. [e] xy BNF y means "xy has prefix x L(e)". Or: parsing procedure for e, applied to xy consumes x". w L(S) [S] w$ BNF $ where $ is "end of text" marker.

"Natural semantics" (after Medeiros) [e] x BNF y holds if and only if it can be proved using these inference rules: [ε] x BNF x [a] ax BNF x A = e 1 e 2 A = e 1 e 2 A = e 1 e 2 [e 1 ] xyz BNF yz [A] xyz BNF z [e 1 ] xy BNF y [A] xy BNF y [e 2 ] xy BNF y [A] xy BNF y [e 2 ] yz BNF z

Example of proof

Grammar: S = a X, X = S | b. Proof of aab ∈ L(S):

[b] b$ ⇒BNF $
[X] b$ ⇒BNF $            (X = S | b, from the previous line)
[a] ab$ ⇒BNF b$
[S] ab$ ⇒BNF $           (S = a X, from the two previous lines)
[X] ab$ ⇒BNF $           (X = S | b)
[a] aab$ ⇒BNF ab$
[S] aab$ ⇒BNF $          (S = a X, from the two previous lines)
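
The ⇒BNF relation can be read as a nondeterministic recognizer that returns every possible remainder. A sketch for the grammar above, assuming the rule encoding shown in the comments (and assuming no left recursion, so the recursion terminates):

    # Rules encoded as name -> ("seq", e1, e2) or ("alt", e1, e2);
    # terminals are single characters, "" stands for epsilon.
    GRAMMAR = {"S": ("seq", "a", "X"),    # S = a X
               "X": ("alt", "S", "b")}    # X = S | b

    def bnf(e, s):
        # All remainders y such that [e] s =>BNF y.
        if e == "":                        # epsilon
            return {s}
        if e not in GRAMMAR:               # terminal
            return {s[1:]} if s.startswith(e) else set()
        kind, e1, e2 = GRAMMAR[e]
        if kind == "seq":                  # [e1] xyz => yz and [e2] yz => z
            return {z for y in bnf(e1, s) for z in bnf(e2, y)}
        return bnf(e1, s) | bnf(e2, s)     # Choice: either branch

    print("$" in bnf("S", "aab$"))   # True: aab is in L(S)
    print("$" in bnf("S", "aa$"))    # False: aa is not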

PEG interpretation

Elements of E are parsing procedures that consume input or return "failure".

ε returns success without consuming input.
a consumes a if the input starts with a; otherwise it returns failure.
A = e1 e2 calls e1, then e2. If either of them fails, A backtracks and returns failure.
A = e1 | e2 calls e1. If e1 succeeded, A returns success. If e1 failed, A calls e2 and returns its result.

"Natural semantics" (after Medeiros) Relation PEG E Σ (Σ fail), written [e] x PEG y. [e] xy PEG y means "e consumes prefix x of xy". [e] x PEG fail means "e applied to x returns failure". w accepted by the grammar iff [S] w$ PEG $.

"Natural semantics" (after Medeiros) [e] x PEG Y holds if and only if it can be proved using these inference rules: [ε] x PEG x A = e 1 e 2 A = e 1 e 2 A = e 1 e 2 [a] ax PEG x [e 1 ] xyz PEG yz [A] xyz PEG Z [e 1 ] x PEG fail [A] x PEG fail [e 1 ] x PEG fail [A] xy PEG Y b a [b] ax PEG fail [e 2 ] yz PEG Z A = e 1 e 2 [e 2 ] xy PEG Y where Y is y or fail and Z is z or fail. [a] ε PEG fail [e 1 ] xy PEG y [A] xy PEG y

When PEG = EBNF?

By induction on the height of the proof trees for [S] w$ ⇒PEG $ and [S] w$ ⇒BNF $:

[S] w$ ⇒PEG $ implies [S] w$ ⇒BNF $. (Medeiros)

[S] w$ ⇒BNF $ implies [S] w$ ⇒PEG $ if for every Choice A = e1 | e2 it holds that L(e1) ∩ Pref(L(e2) Tail(A)) = ∅.

(Tail(A) is any possible continuation after A: y ∈ Tail(A) iff the proof tree of [S] w$ ⇒BNF $ for some w contains the partial result [A] xy$ ⇒BNF y$.)

When PEG = EBNF?

Let us say that a Choice A = e1 | e2 is "safe" to mean L(e1) ∩ Pref(L(e2) Tail(A)) = ∅.

The two interpretations are equivalent if every Choice in the grammar is safe.

Notes

L(e1) ∩ Pref(L(e2) Tail(A)) = ∅

Requires ε ∉ L(e1).
Depends on context.
Difficult to check: L(e1), L(e2), and Tail(A) can be any context-free languages, and emptiness of the intersection of context-free languages is in general undecidable.
Can be "approximated" by stronger conditions.

Approximation by first letters

Consider A = e1 | e2. FIRST(e1), FIRST(e2): the sets of possible first letters of words in L(e1), respectively L(e2).

If L(e1) and L(e2) do not contain ε, FIRST(e1) ∩ FIRST(e2) = ∅ implies L(e1) ∩ Pref(L(e2) Tail(A)) = ∅.

This is LL(1) for a grammar without ε. Each Choice in such a grammar is safe, so the two interpretations are equivalent.

Approximation by first expressions To go beyond LL(1), we shall look at first expressions rather than first letters.

Computing FIRST

S = X | Y
X = Z | V
Y = W X
Z = a | b
V = b | T
W = d | U
T = c V
U = c W

(The slide also shows a tree tracing the possible first symbols of S down through X, Y, Z, V, W, T, U to the terminals.)

FIRST(X) = {a, b, c}
FIRST(Y) = {c, d}

{a, b, c} ∩ {c, d} ≠ ∅ : S = X | Y is not LL(1).
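
A sketch of the usual fixed-point computation of FIRST for this rule set (the dictionary encoding is an assumption; since the grammar is ε-free, FIRST of a Sequence is FIRST of its first element):

    RULES = {"S": ("alt", "X", "Y"), "X": ("alt", "Z", "V"),
             "Y": ("seq", "W", "X"), "Z": ("alt", "a", "b"),
             "V": ("alt", "b", "T"), "W": ("alt", "d", "U"),
             "T": ("seq", "c", "V"), "U": ("seq", "c", "W")}

    def first_sets(rules):
        # FIRST for an epsilon-free simple grammar, by fixed-point iteration.
        first = {A: set() for A in rules}
        fst = lambda e: first[e] if e in rules else {e}   # a terminal starts with itself
        changed = True
        while changed:
            changed = False
            for A, (kind, e1, e2) in rules.items():
                new = fst(e1) if kind == "seq" else fst(e1) | fst(e2)
                if not new <= first[A]:
                    first[A] |= new
                    changed = True
        return first

    F = first_sets(RULES)
    print(sorted(F["X"]), sorted(F["Y"]))    # ['a', 'b', 'c'] ['c', 'd']
    print(F["X"] & F["Y"])                   # {'c'} : S = X | Y is not LL(1)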

Truncated computation of FIRST

Same grammar as above.

Each word in L(X) has a prefix in {a, b} ∪ L(T) = a ∪ c*b.
Each word in L(Y) has a prefix in {d} ∪ L(U) = c*d.

Approximation by first expressions

Each word in L(X) has a prefix in a ∪ c*b.
Each word in L(Y) has a prefix in c*d.

L(X) = (a ∪ c*b)(...)
L(Y) = (c*d)(...)

L(X) ∩ Pref(L(Y) Tail(S)) = (a ∪ c*b)(...) ∩ Pref((c*d)(...)(...))

No word in a ∪ c*b is a prefix of a word in c*d, and vice versa.

The intersection is empty: S = X | Y is safe.
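
The exclusiveness of the two prefix languages can be spot-checked by brute force over words up to an arbitrary length bound; a small sketch:

    MAX = 8                                                    # arbitrary bound
    X_prefixes = {"a"} | {"c" * k + "b" for k in range(MAX)}   # a ∪ c*b
    Y_prefixes = {"c" * k + "d" for k in range(MAX)}           # c*d

    def exclusive(L1, L2):
        # No word of L1 is a prefix of a word of L2, and vice versa.
        return not any(u.startswith(v) or v.startswith(u)
                       for u in L1 for v in L2)

    print(exclusive(X_prefixes, Y_prefixes))   # True: S = X | Y is safe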

Some terminology

X starts with a, b, or T: "X has a, b, and T as possible first expressions"; {a, b, T} is a set of first expressions of X.

No word in a, b, or T is a prefix of a word in d or U, and vice versa: "{a, b, T} and {d, U} are exclusive".

One can easily see that...

If ε ∉ L(e1) and ε ∉ L(e2), and there exist sets FIRST1 of first expressions of e1 and FIRST2 of e2 such that FIRST1 and FIRST2 are exclusive, then A = e1 | e2 is safe.

When PEG = EBNF?

The two interpretations of an ε-free grammar are equivalent if, for every Choice A = e1 | e2, e1 and e2 have exclusive sets of first expressions.

Final remarks

(Good news) A grammar with ε is easy to handle. This involves first expressions of Tail(A), which are obtained using the classical computation of FOLLOW.

(Good news) The results for the simple grammar are easily extended to full EBNF / PEG.

(Good news) The possible sets of first expressions are easily obtained in a mechanical way.

(Bad news) Checking that they are exclusive is not easy: it is undecidable in the general case (but we may hope the first expressions are simple enough to be decidable).

Final final remark

S = (aa | a) b   (that is: S = X b, X = aa | a.)

L(e1) ∩ Pref(L(e2) Tail(X)) = aa ∩ Pref(ab) = ∅.

X is safe. Both interpretations accept {aab, ab}.

Sets of first expressions in X: {aa} and {a}. Not exclusive!

There is more to squeeze out of L(e1) ∩ Pref(L(e2) Tail(A)).
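
Restating the two observations in code (the sets are written out by hand):

    L_e1 = {"aa"}                                    # first alternative of X
    L_e2_Tail = {"ab"}                               # L(e2) Tail(X): a followed by b
    prefixes = {w[:k] for w in L_e2_Tail for k in range(1, len(w) + 1)}
    print(L_e1 & prefixes)                           # set(): X is safe

    FIRST1, FIRST2 = {"aa"}, {"a"}                   # first expressions of the two alternatives
    print(any(u.startswith(v) or v.startswith(u)
              for u in FIRST1 for v in FIRST2))      # True: not exclusive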

That's all. Thanks for your attention!