How to train your multi bottom-up tree transducer


How to train your multi bottom-up tree transducer. Andreas Maletti, Universität Stuttgart, Institute for Natural Language Processing, Stuttgart, Germany. Portland, OR, June 22, 2011.

Synchronous Tree Substitution Grammar: a worked derivation. A sequence of slides applies STSG rules one at a time ("Used rule" / "Next rule"), synchronously expanding linked states until an English parse tree for "the boy saw the door" and the corresponding Arabic parse tree (CONJ wa, V ra aa, N atefl, N albab) are fully built. [tree figures]

Synchronous Tree Substitution Grammar: rule shape.
- states can occur only once in lhs and rhs
- states in rhs = states in lhs
- exactly one lhs and one rhs
- bijective synchronization relation
Notes: the version with states (instead of locality tests) is equivalent to linear nondeleting extended tree transducers and is implemented in TIBURON [May, Knight 2006].
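The constraints above lend themselves to a direct encoding. A minimal sketch, assuming trees are written as nested (label, children) tuples and states are leaves carrying a State label; every name here is illustrative rather than taken from TIBURON or the talk:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class State:
    name: str                                 # e.g. "q", "p"

def state_leaves(tree):
    """Yield every State leaf of a tree given as (label, children)."""
    label, children = tree
    if isinstance(label, State) and not children:
        yield label
    for child in children:
        yield from state_leaves(child)

@dataclass
class STSGRule:
    lhs: tuple                                # exactly one input tree
    rhs: tuple                                # exactly one output tree

    def is_well_formed(self):
        lhs_states = list(state_leaves(self.lhs))
        rhs_states = list(state_leaves(self.rhs))
        # Shared state names encode the bijective synchronization relation,
        # because every state occurs exactly once on each side.
        return (len(lhs_states) == len(set(lhs_states))       # once in the lhs
                and len(rhs_states) == len(set(rhs_states))   # once in the rhs
                and set(lhs_states) == set(rhs_states))       # same states on both sides

# Reordering rule: S(q1, q2) on the input side is linked to S(q2, q1) on the output side.
q1, q2 = State("q1"), State("q2")
rule = STSGRule(lhs=("S", ((q1, ()), (q2, ()))),
                rhs=("S", ((q2, ()), (q1, ()))))
assert rule.is_well_formed()
```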

Multi Bottom-up Tree Transducer: a worked derivation. A sequence of slides applies MBOT rules one at a time ("Used rule" / "Next rule") to build an English parse tree (roughly "Yugoslav President Voislav ... signed ... for Serbia") together with the corresponding Arabic parse tree (wa, twly, AltwyE, fwyslaf, Alr}ys AlywgwslAfy, En rbya); a single linked state may contribute several output fragments that end up in non-contiguous positions. [tree figures]

Multi Bottom-up Tree Transducer: rule shape, compared with STSG.
- STSG: states can occur only once in lhs and rhs; MBOT: states can occur only once in lhs
- STSG: states in rhs = states in lhs; MBOT: states in rhs ⊆ states in lhs
- STSG: exactly one lhs and one rhs; MBOT: exactly one lhs (the rhs is a sequence of trees)
- STSG: bijective synchronization relation; MBOT: the inverse synchronization relation is functional
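The relaxed shape can be sketched in the same style; this reuses the State class and the state_leaves helper from the STSG sketch above, and the concrete rule at the end only illustrates the shape rather than reproducing a rule from the talk:

```python
from dataclasses import dataclass

@dataclass
class MBOTRule:
    lhs: tuple                                # exactly one input tree
    rhs: tuple                                # a sequence (tuple) of output trees

    def is_well_formed(self):
        lhs_states = list(state_leaves(self.lhs))            # helper from the STSG sketch
        rhs_states = [q for t in self.rhs for q in state_leaves(t)]
        # rhs states may repeat: every rhs occurrence of q links back to the unique
        # lhs occurrence of q, so the inverse synchronization relation is functional.
        return (len(lhs_states) == len(set(lhs_states))      # each state once in the lhs
                and set(rhs_states) <= set(lhs_states))      # states in rhs ⊆ states in lhs

# The state p contributes output material to two separate rhs trees.
p, q = State("p"), State("q")
rule = MBOTRule(lhs=("S", ((p, ()), (q, ()))),
                rhs=(("X", ((p, ()), (q, ()))), (p, ())))
assert rule.is_well_formed()
```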

STSG vs. MBOT. Properties compared: simple and natural?, symmetric, preserves regularity, inverse preserves regularity, binarizable, closed under composition. Parsing complexity: input parsing O(|M| · n^x) for STSG vs. O(|M| · n^3) for MBOT; output parsing O(|M| · n^x) for STSG vs. O(|M| · n^y) for MBOT, where x = 2 rk(M) + 5 and y = 2 rk(M) + 2.

Additional Expressive Power.
Finite copying, e.g. {(t, σ(t, t)) | t ∈ T_Σ}: desirable [ATANLP participants, 2010], for instance for cross-serial dependencies, but it harms preservation of regularity.
Non-contiguous rules: improve BLEU [Sun, Zhang, Tan, 2009] and do not harm preservation of regularity.
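The finite-copying translation above can be realised by an MBOT whose single state returns two copies of every processed subtree. A minimal sketch (illustrative, with trees again as nested (label, children) tuples; "sigma" stands for the σ of the slide):

```python
def run_q(t):
    """State q, rank 2: processing an input subtree t bottom-up yields the pair (t, t).
    This models one rule per input symbol f:  f(q, ..., q) -> < f(...), f(...) >."""
    label, children = t
    child_pairs = [run_q(c) for c in children]
    first = (label, tuple(p[0] for p in child_pairs))
    second = (label, tuple(p[1] for p in child_pairs))
    return first, second

def translate(t):
    """Final rule: combine the two copies produced by q under a fresh root sigma."""
    first, second = run_q(t)
    return ("sigma", (first, second))

tree = ("f", (("a", ()), ("g", (("a", ()),))))
assert translate(tree) == ("sigma", (tree, tree))
```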

Approach.
Old approach: 1. extract STSG rules, 2. train the STSG, 3. convert it to an MBOT, 4. work with the MBOT.
New approach: 1. extract (restricted) MBOT rules, 2. train the MBOT, 3. work with the MBOT.

Roadmap: 1. Motivation, 2. Relation to STSG, 3. Rule Extraction and Training, 4. Preservation of Recognizability.

Synchronous Tree-Sequence Substitution Grammar: rule shape (STSG vs. MBOT vs. STSSG).
- STSG: states can occur only once in lhs and rhs; MBOT: states can occur only once in lhs; STSSG: no restriction
- STSG: states in rhs = states in lhs; MBOT: states in rhs ⊆ states in lhs; STSSG: no restriction
- STSG: exactly one lhs and one rhs; MBOT: exactly one lhs; STSSG: no restriction

Expressive Power.
Corollary: STSG ⊆ MBOT ⊆ STSSG.
Theorem: STSG ⊊ MBOT ⊊ STSSG.
Claim: MBOT⁻¹ ; MBOT = STSSG.

STSG vs. MBOT vs. STSSG. Properties compared: simple and natural?, symmetric, preserves regularity, inverse preserves regularity, binarizable, closed under composition. Parsing complexity: input parsing O(|M| · n^x) for STSG, O(|M| · n^3) for MBOT, O(|M| · n^y) for STSSG; output parsing O(|M| · n^x) for STSG, O(|M| · n^y) for MBOT and STSSG, where x = 2 rk(M) + 5 and y = 2 rk(M) + 2.

Roadmap (continued), next: 3. Rule Extraction and Training.

Rule Extraction. Input: a parallel bi-text, parses for the input and output sides, and a word alignment. Output: MBOT rules.

[Tree figures: the English parse tree for "Yugoslav President Voislav ... signed ... for Serbia", the Arabic parse tree (w, twly, AltwyE, En, Alr}ys AlywgwslAfy, fwyslaf, ...), and the word alignment between them; successive slides highlight the aligned subtree pairs from which rules are extracted.]

Rule Extraction (continued). [Tree figures: the same aligned tree pair with extracted rule fragments marked, e.g. the English VP headed by VBD "signed" linked to the Arabic fragments around PV "twly" and DET-NN "AltwyE".]

Extracted Rules. For the same aligned fragment, the slides show the extracted STSG rule, the corresponding MBOT rules, and the STSSG rule side by side; the fragments pair the English VBD "signed" with the Arabic PV "twly" and the separate object fragment DET-NN "AltwyE", and the MBOT and STSSG variants can distribute such output material over several rhs trees. [tree figures]
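What drives the extraction is the set of output positions aligned to an input subtree: for an MBOT these positions need not be contiguous, and each maximal contiguous block becomes one rhs tree of the extracted rule. A minimal sketch of that span computation, working on bare alignment links rather than full trees (function and variable names are illustrative):

```python
def aligned_blocks(source_positions, alignment):
    """Return the maximal contiguous blocks of target positions aligned to the
    given set of source positions (e.g. the yield of one source-tree node).
    alignment is a set of (source_index, target_index) links."""
    targets = sorted({j for (i, j) in alignment if i in source_positions})
    blocks, current = [], []
    for j in targets:
        if current and j != current[-1] + 1:      # gap: start a new block
            blocks.append((current[0], current[-1]))
            current = []
        current.append(j)
    if current:
        blocks.append((current[0], current[-1]))
    return blocks

# Purely illustrative: the source span {2, 3} is aligned to target positions 1 and 4,
# i.e. two non-contiguous blocks, so an extracted MBOT rule for this node would have
# two rhs trees (a single STSG rule could not cover it).
alignment = {(0, 0), (2, 1), (3, 4), (5, 6)}
assert aligned_blocks({2, 3}, alignment) == [(1, 1), (4, 4)]
```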

Training. Definition: a tree language is regular if it can be represented by a TSG with states (equivalently, by a regular tree grammar). Example: {σ(t, t) | t ∈ T_Σ} is not regular, since a finite set of states cannot check that two unboundedly large subtrees are equal.
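For intuition, a regular tree language is one whose membership can be decided by a finite-state device running bottom-up over the tree, which is exactly what a copying translation defeats. A minimal sketch of such a recognizer (illustrative, not from the slides) for the regular language of trees over {f/2, a/0, b/0} with an even number of b-leaves:

```python
def run(tree, delta):
    """Bottom-up run of a deterministic finite tree automaton:
    the state at a node is determined by its label and its children's states."""
    label, children = tree
    child_states = tuple(run(c, delta) for c in children)
    return delta[(label, child_states)]

# States 'even' / 'odd' track the parity of b-leaves; 'even' is the final state.
delta = {
    ("a", ()): "even",
    ("b", ()): "odd",
    ("f", ("even", "even")): "even",
    ("f", ("odd", "odd")): "even",
    ("f", ("even", "odd")): "odd",
    ("f", ("odd", "even")): "odd",
}

tree = ("f", (("b", ()), ("f", (("a", ()), ("b", ())))))
assert run(tree, delta) == "even"                 # two b-leaves: accepted
```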

Training. Theorem: the set of derivations of an MBOT is a regular tree language. Conclusion: a minimal adaptation of the tree transducer training procedure of [Graehl, Knight, May 2008] can be used.
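Because the derivations of every training pair form a regular, finitely representable forest, rule weights can be estimated EM-style: score each derivation under the current weights, collect posterior-weighted rule counts, and renormalize per state. A heavily simplified sketch in which the derivation forests are already unpacked into explicit lists (real implementations run inside-outside on packed forests, as in [Graehl, Knight, May 2008]); all names are illustrative:

```python
from collections import Counter, defaultdict
from math import prod

def em_step(forests, weight, lhs_state):
    """One EM step over fully unpacked derivation forests.
    forests: one list per training pair, each a list of derivations,
             each a list of rule identifiers.
    weight:  current rule probabilities, rule -> float.
    lhs_state: rule -> the state whose rule distribution it belongs to."""
    expected = Counter()
    for forest in forests:
        scores = [prod(weight[r] for r in derivation) for derivation in forest]
        total = sum(scores)
        for derivation, score in zip(forest, scores):
            for r in derivation:
                expected[r] += score / total                  # posterior-weighted count
    mass = defaultdict(float)                                 # renormalize per state
    for r, c in expected.items():
        mass[lhs_state[r]] += c
    return {r: c / mass[lhs_state[r]] for r, c in expected.items()}

# Toy usage: one training pair with two competing derivations sharing rule r1.
weight = {"r1": 1.0, "r2": 0.5, "r3": 0.5}
lhs_state = {"r1": "q0", "r2": "q1", "r3": "q1"}
print(em_step([[["r1", "r2"], ["r1", "r3"]]], weight, lhs_state))
# {'r1': 1.0, 'r2': 0.5, 'r3': 0.5}
```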

Roadmap (continued), next: 4. Preservation of Recognizability.

Preservation of Regularity. Input: an MBOT M and a regular tree language L. Question: is M(L) = {t | there is s ∈ L with (s, t) ∈ M} regular? Example: take M = {(t, σ(t, t)) | t ∈ T_Σ} and an infinite but regular tree language L ⊆ T_Σ. Then M(L) = {σ(t, t) | t ∈ L} is not regular.

Preservation of Regularity: why desired? It gives an easy and efficient representation of output languages, allows intersection with a (regular) language model, and allows bucket-brigade algorithms [May, Knight, Vogler, 2010].

Rule Composition. [Tree figures: two input rules are combined into a single output rule by substituting one rule into the linked state occurrences of the other; the example composes the rule for VBD "signed" (with outputs PV "twly" and DET-NN "AltwyE") into a larger VP rule.]

Rule Composition: power. The MBOT M^k = {R_1 ∘ ⋯ ∘ R_k | R_1, ..., R_k ∈ M}. Explanation: M^k contains the k-fold composed rules; a composition fails if the states do not match, and it succeeds (with no effect) if no matchable states are present.
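A minimal sketch of this rule-into-rule substitution under a simplified encoding: the State class comes from the earlier sketches, a state q appears as a leaf labelled q in the lhs, the i-th output of q appears as a leaf labelled (q, i) in the rhs, and the concrete labels in the toy example are illustrative rather than the actual rules of the talk:

```python
def substitute(tree, mapping):
    """Replace every leaf whose label is a key of mapping by the mapped tree."""
    label, children = tree
    if not children and label in mapping:
        return mapping[label]
    return (label, tuple(substitute(c, mapping) for c in children))

def compose(r1, r2, q):
    """Substitute rule r2 (the rule computing state q) into rule r1: q disappears
    from the lhs, and every rhs reference (q, i) is replaced by the i-th rhs tree of r2."""
    (lhs1, rhs1), (lhs2, rhs2) = r1, r2
    new_lhs = substitute(lhs1, {q: lhs2})
    outputs = {(q, i): t for i, t in enumerate(rhs2)}
    new_rhs = tuple(substitute(t, outputs) for t in rhs1)
    return new_lhs, new_rhs

# Toy example loosely modelled on the slides (labels illustrative).
VP = State("VP")
r1 = (("S", ((VP, ()), ("NP", ()))),                               # lhs: S(VP, NP)
      (("S", (((VP, 0), ()), ("X", (((VP, 1), ()),)))),))          # rhs: S(VP.0, X(VP.1))
r2 = (("VP", (("signed", ()),)),                                   # lhs: VP(signed)
      (("PV", (("twly", ()),)), ("DET-NN", (("AltwyE", ()),))))    # rhs: <PV(twly), DET-NN(AltwyE)>
new_lhs, new_rhs = compose(r1, r2, VP)
```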

Finitely Collapsing. Definition: M is finitely collapsing if there exists k ∈ ℕ such that every state occurs at most once in the rhs of every rule of M^k. [Tree figures: the rules extracted earlier for VBD "signed" serve as the example.]
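For the special case k = 1 the condition is easy to check; a minimal sketch over the rule encoding of the MBOT sketch above (rules as (lhs, rhs_trees) pairs, state_leaves as defined earlier):

```python
from collections import Counter

def finitely_collapsing_k1(rules):
    """True iff every state occurs at most once in the rhs of every rule (the k = 1 case)."""
    for lhs, rhs_trees in rules:
        counts = Counter(q for t in rhs_trees for q in state_leaves(t))
        if any(c > 1 for c in counts.values()):
            return False
    return True
```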

Finite Synchronization. Definition: M has finite synchronization if there exists k ∈ ℕ such that all occurrences of a state occur in the same tree of the rhs of every rule of M^k. Theorem (Raoult, 1997): MBOT with finite synchronization and finitely collapsing MBOT preserve regularity.

Copy-free. Definition: M is copy-free if there exists k ∈ ℕ such that for every rule of M^k and all occurrences w of a state, either w is the root of a tree in the rhs or w belongs to one fixed tree in the rhs. [Tree figures: a rule that copies a state into two different non-root positions of the rhs is not copy-free; a rule whose repeated state occurrences all sit at rhs roots or inside a single rhs tree is copy-free.]
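Again for k = 1, and again over the encoding of the MBOT sketch above, the copy-free condition can be checked per rule; the copying rule at the end (the shape used by the σ(t, t) transducer sketched earlier) is the illustrative failure case:

```python
def copy_free_k1(rules):
    """True iff in every rule, each state's non-root rhs occurrences all lie in one
    and the same rhs tree (an occurrence that is an entire rhs tree, i.e. a root,
    is always allowed)."""
    for lhs, rhs_trees in rules:
        inner_tree_of = {}                        # state -> index of the rhs tree it lives in
        for i, t in enumerate(rhs_trees):
            label, children = t
            if isinstance(label, State) and not children:
                continue                          # the whole tree is one state: a root occurrence
            for q in state_leaves(t):             # every occurrence in here is non-root
                if inner_tree_of.setdefault(q, i) != i:
                    return False                  # q occurs inside two different rhs trees
    return True

q = State("q")
copying_rule = (("f", ((q, ()),)),                       # lhs: f(q)
                (("f", ((q, ()),)), ("f", ((q, ()),))))  # rhs: <f(q), f(q)>, q inside two trees
assert not copy_free_k1([copying_rule])
```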

Main Result. Theorem: copy-free MBOT preserve regularity. Theorem: finitely collapsing ⟹ finite synchronization ⟹ copy-free. Note: all three restrictions can be guaranteed during rule extraction.

Thank you for your attention!

References
1. ARNOLD, DAUCHET: Morphismes et bimorphismes d'arbres. Theoret. Comput. Sci., 1982.
2. ATANLP participants: Desired properties of syntax-based MT systems. ACL workshop, 2010.
3. GRAEHL, KNIGHT, MAY: Training tree transducers. Computational Linguistics, 2008.
4. MAY, KNIGHT: TIBURON: a weighted tree automata toolkit. In CIAA, 2006.
5. MAY, KNIGHT, VOGLER: Efficient inference through cascades of weighted tree transducers. In ACL, 2010.
6. RAOULT: Rational tree relations. Bull. Belg. Math. Soc. Simon Stevin, 1997.
7. SUN, ZHANG, TAN: A non-contiguous tree sequence alignment-based model for statistical machine translation. In ACL, 2009.
