
http://www.diva-portal.org

Postprint

This is the accepted version of a paper presented at the Eighth Workshop on Non-Classical Models of Automata and Applications (NCMA 2016).

Citation for the original published paper:

Bensch, S., Kutrib, M., Malcher, A. (2016) Extended Uniformly Limited T0L Languages and Mild Context-Sensitivity. In: Henning Bordihn, Rudolf Freund, Benedek Nagy, and György Vaszil (eds.), Eighth Workshop on Non-Classical Models of Automata and Applications (NCMA 2016): Short Papers (pp. 35-46). Wien: Institut für Computersprachen.

N.B. When citing this work, cite the original published paper.

Permanent link to this version: http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-125175

EXTENDED UNIFORMLY LIMITED T0L LANGUAGES AND MILD CONTEXT-SENSITIVITY

Suna Bensch (A)   Martin Kutrib (B)   Andreas Malcher (B)

(A) Department of Computing Science, Umeå University, 90187 Umeå, Sweden
suna@cs.umu.se
(B) Institut für Informatik, Universität Giessen, Arndtstr. 2, 35392 Giessen, Germany
{kutrib,malcher}@informatik.uni-giessen.de

Abstract

We study the fixed membership problem for k-uniformly-limited and propagating ET0L systems (kulEPT0L systems). To this end, the algorithm given in [7] is applied. It follows that kulEPT0L languages are parsable in polynomial time. Since kulEPT0L languages are semilinear [1] and kulEPT0L systems generate certain non-context-free languages, which capture the non-context-free phenomena occurring in natural languages, this is the last building block to show that kulEPT0L languages, for k ≥ 2, belong to the family of mildly context-sensitive languages.

1. Introduction

Context-free languages are parsable in polynomial time and there are several membership algorithms based on the concept of dynamic programming for context-free languages [6, 15]. For context-sensitive languages, on the other hand, there are no known polynomial time algorithms for the membership problem. Extended Lindenmayer systems without interaction (E0L systems) can be considered as parallel counterparts of context-free grammars and E0L languages are known to be parsable in polynomial time as well [10]. However, for ET0L languages, that is, languages generated by tabled E0L systems, the membership complexity is NP-complete [12]. Research has studied many language families that lie between context-free (or E0L) and context-sensitive (or ET0L) (see, for instance, [4]). In [7], for example, the authors study the membership complexity for context-free languages generated by context-free grammars which are extended so that context-free productions are applied to a fixed number k of symbols at each derivation step. The authors use results from scheduling theory and dynamic programming to

show that the membership problem for these extended context-free languages is decidable in polynomial time. The authors in [14], for instance, introduced a restricted version of ET0L systems, namely k-uniformly-limited ET0L systems (abbreviated kulET0L systems). In these systems the parallel rewriting mechanism of Lindenmayer systems is limited such that not all symbols in a word w have to be rewritten, but min{k, |w|} symbols, where k is a positive integer. Note that, if k = 1, we have a context-free grammar (see [14]). The crucial differences between these two grammar formalisms are that (i) the extended context-free grammars in [7] are extended with respect to their sequential rewriting mechanism (that is, from rewriting one symbol to rewriting k symbols) whereas kulET0L systems are limited with respect to their parallel substitution mechanism (that is, from substituting all symbols to substituting k symbols), and (ii) in [7] the lengths of all sentential forms (after rewriting the start symbol) are at least k.

In mathematical linguistics, researchers also have been investigating language families that lie between the context-free language family and the context-sensitive language family in the Chomsky hierarchy. A concept that captures such language families is mild context-sensitivity. The notion of mild context-sensitivity was first mentioned in [8], where the author proposed that a class of grammars (and their associated languages) modeling the syntax of natural languages should have the following three characteristics. First, it should describe certain non-context-free structures in order to capture the non-context-free phenomena occurring in natural languages. Second, it should have the constant growth property (the constant growth property is obeyed by every semilinear language) and, third, it should be parsable in polynomial time. Note that the concept mildly context-sensitive captures a family of families of languages, not a single language family. For a formal definition of mildly context-sensitive grammar formalisms see [2].

In the literature there have been many investigations of mildly context-sensitive sequential grammar formalisms and their languages (see, for instance, [9, 13]). There have been fewer investigations of mildly context-sensitive parallel grammar formalisms. In [1] the author investigates some restricted versions of limited parallel Lindenmayer systems with respect to their mild context-sensitivity. In this paper, we apply the fixed membership algorithm given in [7] to propagating kulET0L languages (abbreviated kulEPT0L languages). It follows that kulEPT0L languages are parsable in polynomial time. Moreover, it is known that kulEPT0L languages are semilinear [1]. Additionally, kulEPT0L systems generate non-context-free languages, such as { a^n b^n c^n | n ≥ 1 }, { a^n b^m c^n d^m | n, m ≥ 1 }, and { ww | w ∈ {a, b}^+ }, which capture the non-context-free phenomena occurring in natural languages. From all this, we conclude that kulEPT0L languages, for k ≥ 2, belong to the family of mildly context-sensitive languages.

2. Definitions and Preliminaries

We assume the reader to be familiar with the basic notions of Lindenmayer systems without interaction such as in [11]. In general we have the following conventions: The set of positive integers is denoted by ℕ and if we want to include 0, we write ℕ₀. The cardinality of a set A is denoted by #A. Let V = {a_1, a_2, ..., a_n} be some alphabet, where the order of the symbols

is fixed. By V^+ we denote the set of nonempty words; if the empty word λ is included, then we use the notation V^*. The length of a word w in V^* is the number of letters in w and is written as |w|. The length of the empty word λ is 0.

An ET0L system is a quadruple G = (Σ, H, ψ, Δ), where Σ is an alphabet, ψ ∈ Σ^+ is the axiom, Δ ⊆ Σ is the terminal alphabet, and H is a finite set of finite substitutions from Σ into Σ^*. A substitution h in H is called a table. For x in Σ we write x → y if y ∈ h(x). By Maxr(G) we denote the length of the longest right-hand side of a production in G. In general, we write u ⇒ w if and only if w ∈ h(u), for u and w in Σ^* and some h in H. If the table should be noted explicitly, we write u ⇒_h w. The reflexive transitive closure of the derivation relation ⇒ is denoted by ⇒^*. The language generated by G is L(G) = { w ∈ Δ^* | ψ ⇒^* w }. An ET0L system is called propagating (an EPT0L system, for short) if for all substitutions h in H and all x ∈ Σ, we have λ ∉ h(x).

In a derivation of a kulET0L system, at each step of the rewriting process exactly min{k, |w|} symbols of the word w considered have to be rewritten. That is, if |w| < k then all symbols have to be rewritten, but if |w| ≥ k then there are (|w| choose k) possibilities to rewrite the word w. Formally, a k-uniformly-limited ET0L system (kulET0L system, for short) ([14]) is a quintuple G = (Σ, H, ψ, Δ, k), where k ∈ ℕ and (Σ, H, ψ, Δ) is an ET0L system. The derivation relation ⇒ of a kulET0L system is defined as follows. Let u, w ∈ Σ^* and h ∈ H.

1. If |u| ≥ k then u ⇒ w if we can write u = v_1 x_1 v_2 x_2 ··· x_k v_{k+1} and w = v_1 z_1 v_2 z_2 ··· z_k v_{k+1} with x_i ∈ Σ, v_j ∈ Σ^*, z_i ∈ h(x_i), i = 1, ..., k, j = 1, ..., k+1.

2. If |u| < k then u ⇒ w if w ∈ h(u).

A sentential form of a kulET0L system G = (Σ, H, ψ, Δ, k) is a word w ∈ Σ^* with ψ ⇒^* w. A propagating kulET0L system (kulEPT0L, for short) is defined in the same way as for ET0L systems. The authors in [14] introduced the notion of pseudo-synchronization for kulET0L grammars.
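The two cases of the derivation relation are easy to prototype. The following is a minimal Python sketch under our own naming (it is not from the paper): a table is a dict mapping a symbol to the list of its right-hand sides, and one step rewrites exactly min{k, |u|} occurrences.

```python
from itertools import combinations, product

def step(u, h, k):
    """All words reachable from u in one kulET0L derivation step:
    exactly min(k, |u|) occurrences are rewritten using table h."""
    m = min(k, len(u))
    successors = set()
    for positions in combinations(range(len(u)), m):
        # every chosen occurrence needs at least one right-hand side
        choices = [h.get(u[i], []) for i in positions]
        if any(not c for c in choices):
            continue
        for rhs in product(*choices):
            w = list(u)
            for i, r in zip(positions, rhs):
                w[i] = r
            successors.add("".join(w))
    return successors

# |u| < k: all symbols are rewritten (case 2).
assert step("AB", {"A": ["AB"], "B": ["b"]}, 3) == {"ABb"}
# |u| >= k: exactly k of the occurrences are rewritten (case 1).
assert step("ABB", {"A": ["AB"], "B": ["b"]}, 2) == {"ABbB", "ABBb", "Abb"}
```

The second assertion shows the (|w| choose k) choices of rewritten positions: each of the three two-element position sets of "ABB" yields one successor.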
A kulET0L system is called pseudo-synchronized if for every a ∈ Δ and for every w ∈ Σ^* the following holds: if a ⇒ w then w ∉ Δ^*; that is, there are no productions of the form a → w, for a ∈ Δ and w ∈ Δ^*.

Theorem 2.1 ([14], Theorem 3.1) To every kulE(P)T0L system G = (Σ, H, ψ, Δ, k), there exists an equivalent pseudo-synchronized kulE(P)T0L system G' = (Σ', H', S, Δ, k) such that L(G) = L(G').
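The pseudo-synchronization condition can be checked mechanically. Here is a small sketch (the helper name and table encoding are our own, not from the paper):

```python
def is_pseudo_synchronized(tables, terminals):
    """True iff no table contains a production a -> w with a terminal
    left-hand side a and a right-hand side w over terminals only."""
    for table in tables:
        for a, rhss in table.items():
            if a in terminals:
                for w in rhss:
                    if all(x in terminals for x in w):
                        return False
    return True

# a -> a maps a terminal to a terminal word: not pseudo-synchronized
assert not is_pseudo_synchronized([{"a": ["a"]}], {"a", "b"})
# a -> Ab reintroduces a nonterminal, which the condition allows
assert is_pseudo_synchronized([{"a": ["Ab"], "A": ["ab"]}], {"a", "b"})
```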

In the following we assume a kulEPT0L system to be pseudo-synchronized without explicitly mentioning it.

The reader is assumed to be familiar with the basic notions of trees, forests, root, parent, child, ancestor, descendant, internal node and leaf. Let v be a node. The depth of v is its distance from the root of its tree, plus 1. The height of v is the distance to its furthest descendant. Leaves have height zero and roots have depth one. Let F be a forest. The height of F is the maximal height of its roots. By |F| we denote the number of nodes in F and by #F we denote the number of trees in F. The bare forest of F is obtained by deleting all leaves in F; the bare forest of F is denoted by bare(F). The child forest of F is obtained by deleting all roots from F.

To each derivation of a kulEPT0L system one can associate a derivation tree t in a similar fashion as it is done for context-free grammars. Moreover, if a node is labeled with a symbol a ∈ Σ, and the labels of its children (from left to right) form the word w, then a → w is a production in a table of the kulEPT0L system. A derivation forest is a forest containing a tree for each symbol of the axiom. The roots of the trees are labeled by these symbols. We illustrate some of these notions by the following example.

Example 2.2 (derivation forest) Let G = (Σ, H, ψ, Δ, k) be the kulEPT0L system with Σ = {A, B, a, b}, Δ = {a, b}, ψ = AB, k = 3, H = {h_1, h_2}, where the set of tables is given by h_1 = {A → A, B → B, A → BB, B → Ab} and h_2 = {A → b, B → a, B → b, B → bb}. Consider the following derivation, in which we mark the symbols which are rewritten with a dot. Table h_1 is used in the first two derivation steps and then Table h_2 is applied to terminate the derivation: ȦḂ ⇒ ȦḂ ⇒ ḂḂȦb ⇒ abbb. The derivation forest for abbb is illustrated in Figure 1.

Figure 1: Derivation forest of abbb derived by the kulEPT0L system given in Example 2.2.
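For small systems the generated language can be explored by brute force. The sketch below is our own code; the toy tables in the test are a small system in the spirit of the example above, likewise our own illustration. It enumerates all sentential forms for a bounded number of derivation steps and collects the terminal words encountered:

```python
from itertools import combinations, product

def derive_words(axiom, tables, terminals, k, steps):
    """Terminal words reachable from the axiom in at most `steps`
    derivation steps of a kulET0L system."""
    def step(u, h):
        m = min(k, len(u))
        out = set()
        for positions in combinations(range(len(u)), m):
            choices = [h.get(u[i], []) for i in positions]
            if any(not c for c in choices):
                continue
            for rhs in product(*choices):
                w = list(u)
                for i, r in zip(positions, rhs):
                    w[i] = r
                out.add("".join(w))
        return out

    frontier, words = {axiom}, set()
    for _ in range(steps):
        frontier = {v for u in frontier for h in tables for v in step(u, h)}
        words |= {w for w in frontier if all(x in terminals for x in w)}
    return words

h1 = {"A": ["A", "BB"], "B": ["B", "Ab"]}
h2 = {"A": ["b"], "B": ["a", "b", "bb"]}
assert "abbb" in derive_words("AB", [h1, h2], {"a", "b"}, 3, 2)
```

Note that in a propagating system the frontier sets stay finite for any bounded number of steps, since lengths never decrease and every right-hand side is nonempty.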

In the following we will divide the derivations of a kulEPT0L system into two phases. The sentential forms w in a derivation of a kulEPT0L system can be divided into a phase in which each |w| < k and into the phase in which each |w| ≥ k. In the first phase all symbols have to be rewritten and in the second phase exactly k symbols have to be rewritten. The derivation steps in the first phase correspond to derivation steps in an EPT0L derivation and the second phase is called a k-derivation of a kulEPT0L system. Note that in a propagating kulET0L system the lengths of the sentential forms do not decrease. In the following we define a k-derivation of a kulEPT0L system and its k-derivation forest.

Definition 2.3 (k-derivation forest) Let G = (Σ, H, ψ, Δ, k) be a kulEPT0L system. A k-derivation in G is a sequence of words w_1, ..., w_{l+1}, such that w_i ⇒ w_{i+1}, for all 1 ≤ i ≤ l, and |w_i| ≥ k, for 1 ≤ i ≤ l+1. The length of the derivation w_1, ..., w_{l+1} is l and the ith step is w_i ⇒ w_{i+1}. A k-derivation forest is a derivation forest that corresponds to a k-derivation.

Example 2.4 (k-derivation forest) Note that the derivation forest given in Example 2.2 is not a k-derivation forest, since |ψ| < k. Figure 2 illustrates a k-derivation forest for the k-derivation ḂḂȦb ⇒ abbb as derived by the kulEPT0L system given in Example 2.2.

Figure 2: k-derivation forest of abbb derived by the kulEPT0L system given in Example 2.2.

Next we turn to define schedules on forests following [7]. Let F be a forest. We assume that there are k processors that correspond to rewriting k symbols in each derivation step. Every node in F is interpreted as a task. We assume that all tasks are unit-length (that is, they take the same time to be executed). The parent-child relation in F specifies the precedence constraints (that is, the parent nodes are scheduled before their children nodes). A k-schedule is then a sequence of slots, where each slot contains up to k tasks. Each slot has a corresponding time unit. Each slot indicates which of the at most k tasks are to be scheduled in the corresponding time unit. Each slot is filled with symbols that occur on the left-hand side of a production in a table h ∈ H of a given kulEPT0L system G = (Σ, H, ψ, Δ, k). Figure 3 illustrates these notions.

Definition 2.5 (k-schedule [7]) Let F be a forest and let k ≥ 1. A k-schedule of F is a function σ mapping the nodes of F onto the set {1, ..., l}, for some l ≤ |F|, such that

Figure 3: An illustration of a k-schedule of length l with k processors (slots 1, ..., l; processors 1, ..., k). Each cell will be filled with one symbol from an alphabet.

1. 1 ≤ #σ^{-1}(i) ≤ k for all 1 ≤ i ≤ l,

2. for each pair of nodes v_1, v_2 in F, if v_2 is a successor of v_1, then σ(v_2) > σ(v_1).

The length of σ is l and σ^{-1}(i) is called the ith slot of σ. The tasks of slot i are scheduled at time i (that is, #σ^{-1}(i) out of the k processors are assigned a task at that time). There are k − #σ^{-1}(i) idle periods in slot i (that is, periods in which not all k processors are used). A schedule σ has p(σ) idle periods, where

    p(σ) = Σ_{i=1}^{l} (k − #σ^{-1}(i)) = l·k − |F|.

A schedule σ is optimal for F if there is no schedule σ' of F with p(σ') < p(σ). Note that optimal schedules have minimal length. The number of idle periods of F, denoted p(F), is the number of idle periods in an optimal schedule for F. If p(σ) = 0 in a schedule σ, then σ is called perfect.

Figure 4: The table on the left side is a k-schedule, k = 3, for the derivation forest given in Figure 1; the schedule is optimal and has one idle period. The table on the right side is a k-schedule, k = 3, for the k-derivation forest given in Figure 2; it is a perfect schedule.

Observe that the bare forests of k-derivation forests have perfect k-schedules. If v_1, ..., v_k are symbols that are rewritten in step i during a k-derivation, then v_1, ..., v_k occur in the ith slot of the corresponding perfect schedule.

Lemma 2.6 ([7]) A derivation forest is a k-derivation forest if and only if its bare forest has a perfect schedule.

Lemma 2.7 (first and second phase) Let G = (Σ, H, ψ, Δ, k) be a kulEPT0L system and let ψ ⇒ w_1 ⇒ ··· ⇒ w_l ⇒ ··· ⇒ w_n = w be a derivation of w in G, where |ψ| < k, |w_i| < k, for 1 ≤ i ≤ l−1, and |w_l| ≥ k.

A word w is in L(G) if and only if there exists a derivation ψ ⇒^* w, such that

1. there is a derivation tree t of w_l from ψ, and

2. there is a k-derivation tree s of w from w_l, such that p(bare(s)) = 0.

We refer to the derivation from ψ to w_l as the first phase and to the derivation from w_l to w as the second phase. If |ψ| ≥ k, then the derivation has no first phase.

Proof. All slots in a k-schedule for a k-derivation are filled with tasks and there are no idle periods.

The following algorithm is for obtaining a Highest Level First (HLF) k-schedule for a forest. Roughly speaking, the algorithm builds an HLF k-schedule for a forest F by scheduling the nodes on the longest paths in a tree in F.

Algorithm 2.8 ([7])

1. If F consists of at least k trees, then σ^{-1}(1) contains the roots of the k highest trees (for trees of equal height the choice is arbitrary).

2. Otherwise, σ^{-1}(1) is the set of all the roots.

3. The tail of the schedule is constructed similarly, with the nodes in σ^{-1}(1) deleted from F.

The k-schedules in Figure 4 are not HLF k-schedules. An HLF k-schedule is given in Figure 5.

Figure 5: An HLF k-schedule for the k-derivation forest given in Figure 2.

Theorem 2.9 ([3, 5]) Any HLF schedule for a forest is optimal.

3. Polynomial Membership Algorithm

We divide the membership testing into the two phases of a derivation of a kulEPT0L system (see Lemma 2.7). The first phase ends when the length of the sentential form is at least k. Since there are at most (#Σ + 1)^{k−1} different sentential forms whose length is less than k and the systems are propagating, testing membership in the first phase can be done in constant time. For the second phase we use the algorithm in [7] for extended context-free grammars, where k nonterminal symbols are replaced in each derivation step. The algorithm in [7] is bottom-up and keeps track of all the derivation trees deriving subwords of the input word, similar to the CYK algorithm. These derivation trees are parameterized and we obtain a polynomial

size characterization. There may be a family of derivation forests for a given word. The parameterized trees are called frames and the parameters are the root symbol, the start and end position of the subword in w, the height of the subtree and the number of its nodes. The algorithm then computes the number of idle periods for a frame collection by using the median. In [5] the median was used to present a polynomial time scheduling algorithm for forests and other graphs assuming a constant number of processors. Note that the numbers of idle periods for various frame collections have to be computed. Intuitively, all those k-derivation trees which are higher than the median are hard to schedule, whereas all other k-derivation trees are easy to schedule. There are only a polynomial number of frame collections that are hard to schedule and this achieves a polynomial time algorithm for solving membership for constant k and a constant kulEPT0L system.

Definition 3.1 (k-median [7]) The k-median of a forest F is one plus the height of the kth highest tree of F. If F contains less than k trees, then the median is zero. The k-high forest of F is the set of all those trees in F which are strictly higher than the median. The k-low forest is the set of the remaining trees. The k-high forest and k-low forest of a forest F are denoted by High_k(F) and Low_k(F), respectively.

Theorem 3.2 ([7]) Let F be a forest and σ be a k-schedule for High_k(F) with q idle periods. Then there is a schedule σ' for the whole forest F, such that:

1. if q ≥ |Low_k(F)|, then the length of σ' is at most as long as the length of σ;

2. if q < |Low_k(F)|, then σ' has idle periods only in its last slot.

Lemma 3.3 ([7]) Assume that the HLF schedules for High_k(F) have q idle periods. Then the HLF schedules for F have

1. q − |Low_k(F)| idle periods if q ≥ |Low_k(F)|,

2. (−|F|) mod k idle periods, otherwise.

The following lemma is a restatement for kulEPT0L derivations and restricts the length of a derivation for a word w in a kulEPT0L language polynomially in n.

Lemma 3.4 Let G = (Σ, H, ψ, Δ, k) be a kulEPT0L system and let w ∈ Δ^* with |w| = n. Then w ∈ L(G) if and only if there exists a derivation tree t of w from ψ, such that the height of t is at most f(n) = n·k·2^k·(#Σ)^{k(k+1)/2} and |t| ≤ n·f(n). The bound on the total number of nodes |t| follows from the fact that in each tree level, there can be at most n nodes.

The following definition is a restatement for kulEPT0L systems and its k-derivation trees. The k-derivation trees are parameterized into frames.
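The HLF construction of Algorithm 2.8, the idle-period count p(σ) = l·k − |F| of Definition 2.5, and the k-median of Definition 3.1 are small enough to prototype. The following is a minimal sketch assuming nothing beyond those definitions; the class and function names are our own:

```python
from dataclasses import dataclass, field

@dataclass
class Tree:
    children: list = field(default_factory=list)

    @property
    def height(self):
        # leaves have height zero
        return 1 + max((c.height for c in self.children), default=-1)

    @property
    def size(self):
        return 1 + sum(c.size for c in self.children)

def hlf_schedule(forest, k):
    """Highest Level First k-schedule (Algorithm 2.8): every slot takes
    the roots of the (up to) k highest remaining trees."""
    forest, slots = list(forest), []
    while forest:
        forest.sort(key=lambda t: t.height, reverse=True)
        chosen, forest = forest[:k], forest[k:]
        slots.append(chosen)
        # children become schedulable once their parent is scheduled
        forest += [c for t in chosen for c in t.children]
    return slots

def idle_periods(slots, k):
    # p(sigma) = sum over slots of (k - #slot) = l*k - |F|
    return sum(k - len(slot) for slot in slots)

def k_median(forest, k):
    # one plus the height of the k-th highest tree; zero if fewer than k trees
    if len(forest) < k:
        return 0
    return 1 + sorted((t.height for t in forest), reverse=True)[k - 1]

# a chain of three nodes, a chain of two, and a single node: |F| = 6
chain3, chain2, single = Tree([Tree([Tree()])]), Tree([Tree()]), Tree()
forest = [chain3, chain2, single]

slots = hlf_schedule(forest, 2)
assert len(slots) == 3 and idle_periods(slots, 2) == 0      # a perfect 2-schedule
assert idle_periods(slots, 2) == len(slots) * 2 - sum(t.size for t in forest)
assert k_median(forest, 2) == 2
```

By Theorem 2.9 the schedule produced this way is optimal, so idle_periods(hlf_schedule(F, k), k) computes p(F) for a forest F.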

Definition 3.5 Let G be a kulEPT0L system and let w = a_1 ··· a_n, where a_1, ..., a_n ∈ Δ and |w| ≥ k. A frame R (of w) is a quintuple (A, l, r, h, c), such that A ∈ Σ is the root of R, 1 ≤ l ≤ r ≤ n, and there is a k-derivation tree t of a_l ··· a_r from A in G, such that its bare tree has height h and c nodes. If the derivation tree is of height zero, that is, A is a terminal symbol, then c = 0 and h = −1. A tree t as above is called a frame tree for R. The height of a frame R is h and the size of R, denoted |R|, is c. An ordered set R of frames (A, l, r, h, c) is called a frame collection, where all A in R occur on the left-hand side of a production in a table h ∈ H of a given kulEPT0L system. The height of R is the maximum of the frame heights in R and the size of R is the sum of the sizes of the frames in R. If F is a forest, such that the ith tree in F is a frame tree for the ith frame in R, for 1 ≤ i ≤ #F = #R, then F is called a frame forest of R (see Example 3.6 for an example).

Example 3.6 The forest in Figure 2 is a frame forest for the frame collection R:

    R = {(A, 1, 4, 3, 8), (B, 5, 7, 3, 6)}.

The notions of k-median, k-high collection (k-high forest), and k-low collection (k-low forest) are defined similarly for frame collections. In particular, the number of idle periods for a frame collection is given by

    p(R) = min{ p(bare(F)) | F is a frame forest of R }.

The following is a restatement of Lemma 2.7 for frames.

Lemma 3.7 Let G = (Σ, H, ψ, Δ, k) be a kulEPT0L system and let ψ ⇒ w_1 ⇒ ··· ⇒ w_l ⇒ ··· ⇒ w_n = w be a derivation of w in G, where |ψ| < k, |w_i| < k, for 1 ≤ i ≤ l−1, and |w_l| ≥ k. Furthermore, let S → w_l be an additional rule for G, where S is a new symbol in Σ \ Δ and S → w_l a new production in a new table h_new in H. A word w is in L(G) if and only if there exists a derivation ψ ⇒^* w, such that

1. there is a derivation tree t of w_l from ψ, and

2. there exists a frame R = (S, 1, n, h, c), for some h and c, such that p(R) = 0.

If |ψ| ≥ k, then let S → ψ be an additional rule in a new table in H of G.

Definition 3.8 (child collection [7]) Let R = (A, l, r, h, c) be a frame of a word w, and let R' = {R_1, ..., R_j} be a frame collection of w, where R_i = (A_i, l_i, r_i, h_i, c_i) for all 1 ≤ i ≤ j. We

say that R' is a child collection of R if:

1. A → A_1 ··· A_j ∈ H,
2. l = l_1, r_j = r, and l_i = r_{i−1} + 1 for all 2 ≤ i ≤ j,
3. h = 1 + max{h_1, ..., h_j}, and
4. c = 1 + Σ_{i=1}^{j} c_i.

A child collection of a frame collection R is obtained by choosing a child collection for each of the frames in R and taking their union.

By Lemma 3.4, we only consider those frames R = (A, l, r, h, c) with bounded length. Since there are at most #Σ choices for A, and n choices for each l and r, and since f(n) in Lemma 3.4 is a linear function, the following bound is obtained.

Corollary 3.9 ([7]) There are O(n^5) frames to be computed while testing membership for a word of length n.

The following corollary is a restatement of Lemma 3.3 for frames.

Corollary 3.10 ([7]) Let R be a frame collection. Then

    p(R) = p(High_k(R)) − |Low_k(R)|   if p(High_k(R)) ≥ |Low_k(R)|,
    p(R) = (−|R|) mod k                otherwise.

By definition, a k-high collection consists of at most k − 1 frames and by Corollary 3.9 at most O(n^{5(k−1)}) idle periods of k-high collections have to be computed.

Lemma 3.11 ([7]) Let j be the number of frames in a frame collection I, where I = High_k(I). Then

    p(I) = 0                                                        if I is empty,
    p(I) = k − j + min{ p(R') | R' is a child collection of I }     otherwise.

The algorithm in [7] first constructs all the frames for the input word w, and then computes the number of idle periods for each possible high collection. The number of idle periods for each frame of a word w is computable in polynomial time. The numbers of idle periods of various frame collections are computed, by increasing height, using the recurrences stated in Corollary 3.10 and Lemma 3.11. Finally, it is tested whether there exists a frame that covers all of w, that is, a frame of the form (S, 1, |w|, h, c) and that has k − 1 idle periods. The algorithm does not only decide membership but also provides the information necessary to construct a k-derivation for an input word w (see [7]).

Theorem 3.12 ([7]) Algorithm 1 runs in time polynomial in n, O(n^{5(k−1)(Maxr(G)+1)+1}), if both k and G are constant.

Algorithm 1: Adjusted membership algorithm from [7] for the second phase in kulEPT0L derivations

Data: A kulEPT0L system G = ({a_1, ..., a_m}, H, S, Δ, k), where H contains a new table h_new with S → w' as its single production, and a word w = a_{p_1} ··· a_{p_n}.
Result: accept if w ∈ L(G), otherwise reject.

begin
    /* test if w = w' and S directly derives w in one derivation step */
    if S ⇒ w then accept;
    /* construct all the frames of w, of height h */
    for i := 1 to n do
        (a_{p_i}, i, i, −1, 0) is a frame;
    for h := 0 to f(n) do
        forall the g ∈ H do
            forall the A → A_1 A_2 ··· A_j ∈ g do
                forall the 1 ≤ l_0 ≤ ··· ≤ l_j ≤ n do
                    forall the h_1, ..., h_j with max{h_1, ..., h_j} = h − 1 do
                        forall the 0 ≤ c_1, ..., c_j ≤ n·f(n) do
                            if (A_i, l_{i−1}, l_i, h_i, c_i) is a frame for 1 ≤ i ≤ j then
                                (A, l_0, l_j, h, c_1 + c_2 + ··· + c_j + 1) is a frame;
    /* compute the number of idle periods for all collections of up to k − 1 frames */
    p({}) := 0;
    for h := 1 to f(n) do
        forall the frame collections R of height h, consisting of up to k − 1 frames, each of positive height do
            q := ∞;
            forall the child collections R' of R do
                forall the High_k(R') where all A in the frames occur on the left-hand side of a production in a table g ∈ H do
                    I' := High_k(R');
                    /* since the height of I' is h − 1, we can recur on p(I') */
                    if p(I') ≥ |Low_k(R')| then p(R') := p(I') − |Low_k(R')|
                    else p(R') := (−|R'|) mod k;
                    q := min{q, p(R')};
            p(R) := k − #R + q;
    /* this is the membership test */
    for h := 1 to f(n) do
        for c := 2 to n·f(n) do
            if (S, 1, n, h, c) is a frame and p((S, 1, n, h, c)) = k − 1 then accept;
    reject;
end

References

[1] S. BENSCH, Parallel systems as mildly context-sensitive grammar formalisms. Ph.D. thesis, Universität Potsdam, Potsdam, Germany, 2009.
[2] H. BORDIHN, Mildly context-sensitive grammars. In: C. MARTÍN-VIDE, V. MITRANA, G. PĂUN (eds.), Formal Languages and Applications, Studies in Fuzziness and Soft Computing 148. Springer, Berlin, 2004, 163–173.
[3] J. BRUNO, Deterministic and stochastic scheduling problems with tree-like precedence constraints. NATO Conference (1981).
[4] J. DASSOW, G. PĂUN, Regulated Rewriting in Formal Language Theory. Springer, 1989.
[5] D. DOLEV, M. WARMUTH, Scheduling flat graphs. SIAM Journal on Computing 14 (1985) 3, 638–657.
[6] J. EARLEY, An efficient context-free parsing algorithm. Communications of the ACM 13 (1970) 2, 94–102.
[7] J. GONCZAROWSKI, M. K. WARMUTH, Applications of scheduling theory to formal language theory. Theoretical Computer Science 37 (1985), 217–243.
[8] A. K. JOSHI, Tree adjoining grammars: how much context-sensitivity is required to provide reasonable structural descriptions? In: D. R. DOWTY, L. KARTTUNEN, A. M. ZWICKY (eds.), Natural Language Parsing. Psychological, Computational, and Theoretical Perspectives. Cambridge University Press, New York, 1985, 206–250.
[9] A. K. JOSHI, K. VIJAY-SHANKER, D. J. WEIR, The convergence of mildly context-sensitive grammar formalisms. In: T. WASOW, P. SELLS (eds.), The Processing of Linguistic Structure. MIT Press, 1991, 31–81.
[10] J. OPATRNY, K. CULIK II, Time complexity of recognition and parsing of E0L languages. In: A. LINDENMAYER, G. ROZENBERG (eds.), Automata, Languages, Development. North-Holland, Amsterdam, 1976, 243–250.
[11] G. ROZENBERG, A. SALOMAA, The Mathematical Theory of L-Systems. Academic Press, New York, 1980.
[12] J. VAN LEEUWEN, The membership question for ET0L languages is polynomially complete. Information Processing Letters 3 (1975), 138–143.
[13] K. VIJAY-SHANKER, D. J. WEIR, The equivalence of four extensions of context-free grammars. Mathematical Systems Theory 27 (1994) 6, 511–545.
[14] D. WÄTJEN, E. UNRUH, On extended k-uniformly limited T0L systems and languages. Information Processing and Cybernetics EIK 26 (1990) 5/6, 283–299.
[15] D. H. YOUNGER, Recognition and parsing of context-free languages in time n^3. Information and Control 10 (1967), 189–208.