Sharpening the empirical claims of generative syntax through formalization


1 Sharpening the empirical claims of generative syntax through formalization. Tim Hunter, University of Minnesota, Twin Cities. ESSLLI, August 2015.

2 Part 1: Grammars and cognitive hypotheses
   What is a grammar? What can grammars do?
   Concrete illustration of a target: Surprisal
Parts 2–4: Assembling the pieces
   Minimalist Grammars (MGs)
   MGs and MCFGs
   Probabilities on MGs
Part 5: Learning and wrap-up
   Something slightly different: Learning model
   Recap and open questions

3 Sharpening the empirical claims of generative syntax through formalization. Tim Hunter. ESSLLI, August 2015. Part 3: MGs and MCFGs.

4 Where we're up to

We've seen:
   MGs, with operations defined that manipulated trees
   That the structure that really matters (e.g. for recursion) can be boiled down to funny-looking derivation trees (with things like "t, -k" at the non-leaf nodes)

Now:
   A way to think of how these derivation trees relate to surface strings (without going via trees)
   In some ways not totally necessary for the rest of the course, but helpful

Later:
   Adding probabilities to MGs: in a way that sort of works, and does some good stuff, but doesn't do everything we'd want
   Adding probabilities to MGs: in an even better way

5 Outline
   9  A different perspective on CFGs
   10 Concatenative and non-concatenative operations
   11 MCFGs
   12 Back to MGs

6 Outline
   9  A different perspective on CFGs
   10 Concatenative and non-concatenative operations
   11 MCFGs
   12 Back to MGs

7 Trees

The parse tree for "the boy likes cake":

   S
     NP
       D   the
       N   boy
     VP
       V   likes
       NP
         N   cake

can equally be drawn as a tree of categorized strings:

   the boy likes cake :: S
     the boy :: NP
       the :: D
       boy :: N
     likes cake :: VP
       likes :: V
       cake :: NP
         cake :: N

8 Trees

How to think of a tree:
   less as a picture of a string
   more as a graphical representation of how a string was constructed, with the string at the top node

9 Two sides of a CFG rule

A rule like S → NP VP says two things:
   What combines with what: an NP and a VP can combine to form an S
   How to produce a string of the new category: put the NP-string to the left of the VP-string

More explicitly:
   st :: S   ←   s :: NP,   t :: VP
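Read this way, the rule is just a function on categorized strings. A minimal Python sketch (the function and variable names here are illustrative, not from the slides):

```python
def apply_rule(parent, children, compose):
    """Combine categorized strings: children is a list of (string, category) pairs."""
    strings = [s for s, _ in children]
    return (compose(*strings), parent)

# The two halves of "S -> NP VP":
#   what combines:  an NP and a VP make an S
#   how to spell:   put the NP-string to the left of the VP-string
s_rule = lambda s, t: s + " " + t

np = ("the boy", "NP")
vp = ("likes cake", "VP")
print(apply_rule("S", [np, vp], s_rule))  # ('the boy likes cake', 'S')
```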

10 Example: X-bar theory

Japanese:
   XP → Spec X'
   X' → Comp X

English:
   XP → Spec X'
   X' → X Comp

11 Example: X-bar theory

The same facts as string-building rules:

Japanese:
   st :: XP  ←  s :: Spec,  t :: X'
   st :: X'  ←  s :: Comp,  t :: X

English:
   st :: XP  ←  s :: Spec,  t :: X'
   ts :: X'  ←  s :: Comp,  t :: X

12 Example: X-bar theory

Japanese:
   John-ga Mary-o mita :: VP
     John-ga :: Spec
     Mary-o mita :: V'
       Mary-o :: Comp
       mita :: V

English:
   John saw Mary :: VP
     John :: Spec
     saw Mary :: V'
       Mary :: Comp
       saw :: V
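The only difference between the two X' rules is the string-composition function (st vs. ts); the "what combines with what" half is identical. A sketch, with illustrative names:

```python
def xbar_phrase(spec, head, comp, head_final):
    """Build an XP string; only the X'-level linearization is parameterized."""
    x_bar = comp + " " + head if head_final else head + " " + comp
    return spec + " " + x_bar

print(xbar_phrase("John-ga", "mita", "Mary-o", head_final=True))   # John-ga Mary-o mita
print(xbar_phrase("John", "saw", "Mary", head_final=False))        # John saw Mary
```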

13 Outline
   9  A different perspective on CFGs
   10 Concatenative and non-concatenative operations
   11 MCFGs
   12 Back to MGs

14 Concatenative and non-concatenative operations

Concatenative morphology:
   play + ed → played
   play + ing → playing
   play + s → plays

Non-concatenative morphology:
   (k,t,b) + (i,aa) → kitaab ('book')
   (k,t,b) + (aa,i) → kaatib ('writer')
   (k,t,b) + (ma,uu) → maktuub ('written')
   (k,t,b) + (a,i,a) → katiba ('document')

15 Concatenative and non-concatenative operations

Concatenative syntax:
   plays + tennis → plays tennis
   plays + soccer → plays soccer
   John + plays soccer → John plays soccer
   Mary + plays soccer → Mary plays soccer

16 Concatenative and non-concatenative operations

Non-concatenative syntax:
   seems + (John, to be tall) → John seems to be tall
   seems + (Mary, to be intelligent) → Mary seems to be intelligent
   did + (John see, who) → who did John see
   did + (Mary meet, who) → who did Mary meet

17 Non-concatenative morphology

   kitaab :: A  ←  (k,t,b) :: B,  (i,aa) :: C
   kaatib :: A  ←  (k,t,b) :: B,  (aa,i) :: C
   kutub :: A   ←  (k,t,b) :: B,  (u,u) :: C

In general:
   stuvw :: A  ←  (s, u, w) :: B,  (t, v) :: C

18 Non-concatenative morphology

   gespielt :: E  ←  spiel :: A,  (ge,t) :: D
   gekauft :: E   ←  kauf :: A,   (ge,t) :: D
   gemacht :: E   ←  mach :: A,   (ge,t) :: D

In general:
   stu :: E  ←  t :: A,  (s, u) :: D

19 Non-concatenative morphology

   stuvw :: A  ←  (s, u, w) :: B,  (t, v) :: C
   stu :: E    ←  t :: A,  (s, u) :: D

20 Non-concatenative morphology

These two rules combine:

   gekitaabt :: E
     kitaab :: A
       (k,t,b) :: B
       (i,aa) :: C
     (ge,t) :: D

21 Non-concatenative morphology

Adding a third rule:
   (ts, u) :: D  ←  t :: F,  (s, u) :: D

   ausgekitaabt :: E
     kitaab :: A
       (k,t,b) :: B
       (i,aa) :: C
     (ausge,t) :: D
       aus :: F
       (ge,t) :: D

22 Non-concatenative morphology

If our goal is to characterize the array of well-formed/derivable objects (not to pronounce them), then all we care about is "what's built out of what":

   A ← B C
   E ← A D
   D ← F D
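The tuple rules above can be written directly as functions on string tuples. A minimal Python sketch (the function names are mine, not from the slides; each comment repeats the rule being implemented):

```python
def root_plus_template(B, C):
    """stuvw :: A  <-  (s,u,w) :: B, (t,v) :: C  -- interleave root and template."""
    s, u, w = B
    t, v = C
    return s + t + u + v + w

def circumfix(A, D):
    """stu :: E  <-  t :: A, (s,u) :: D  -- wrap the stem in the circumfix pair."""
    t = A
    s, u = D
    return s + t + u

def prefix_into_circumfix(F, D):
    """(ts, u) :: D  <-  t :: F, (s,u) :: D  -- extend the circumfix's left part."""
    t = F
    s, u = D
    return (t + s, u)

kitaab = root_plus_template(("k", "t", "b"), ("i", "aa"))
print(kitaab)                                                        # kitaab
print(circumfix(kitaab, ("ge", "t")))                                # gekitaabt
print(circumfix(kitaab, prefix_into_circumfix("aus", ("ge", "t"))))  # ausgekitaabt
```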

23 Outline
   9  A different perspective on CFGs
   10 Concatenative and non-concatenative operations
   11 MCFGs
   12 Back to MGs

24 Multiple Context-Free Grammars (MCFGs)

A CFG rule:
   st :: S  ←  s :: NP,  t :: VP

An MCFG generalises this to allow yields to be tuples of strings:
   t2 s t1 :: Q  ←  s :: NP,  (t1, t2) :: VPWH

This rule says two things:
   We can combine an NP with a VPWH to make a Q.
   The yield of the Q is t2 s t1, where s is the yield of the NP and (t1, t2) is the yield of the VPWH.

25 Multiple Context-Free Grammars (MCFGs)

For example:

   which girl the boy says is tall :: Q
     the boy :: NP
     (says is tall, which girl) :: VPWH
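The Q rule above, written as a function on yields (a sketch; the pair (t1, t2) is the VPWH's two string parts):

```python
def make_q(np, vpwh):
    """t2 s t1 :: Q  <-  s :: NP, (t1, t2) :: VPWH"""
    t1, t2 = vpwh
    return t2 + " " + np + " " + t1

print(make_q("the boy", ("says is tall", "which girl")))
# which girl the boy says is tall
```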

26 Some technical details

Each nonterminal has a rank n, and yields only n-tuples of strings. So given this rule:

   t2 s t1 :: Q  ←  s :: NP,  (t1, t2) :: VPWH

we know that anything producing a VPWH must produce a 2-tuple:

   (..., ...) :: VPWH  ←  ...

and that anything producing an NP must produce a 1-tuple:

   ... :: NP  ←  ...

27 Some technical details

The string-composition functions cannot copy pieces of their arguments:

   OK:      s t :: VP         ←  s :: V,  t :: NP
   OK:      t s himself :: S  ←  s :: V,  t :: NP
   Not OK:  t s t :: S        ←  s :: V,  t :: NP

28 Some technical details

MCFGs are essentially equivalent to linear context-free rewriting systems (LCFRSs).

(Seki et al. 1991, Weir 1988, Vijay-Shanker et al. 1987)
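The "no copying" requirement is a linearity condition: the composition function must use each argument variable exactly once. A toy checker (illustrative only, not a general MCFG validator):

```python
def is_linear(template, variables):
    """True iff each argument variable occurs exactly once in the template."""
    return all(template.count(v) == 1 for v in variables)

print(is_linear(["s", "t"], {"s", "t"}))             # True:  s t
print(is_linear(["t", "s", "himself"], {"s", "t"}))  # True:  t s himself
print(is_linear(["t", "s", "t"], {"s", "t"}))        # False: copies t
```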

29 Beyond context-free

   t1 t2 :: S           ←  (t1, t2) :: P
   (t1 u1, t2 u2) :: P  ←  (t1, t2) :: P,  (u1, u2) :: E
   (ε, ε) :: P
   (a, a) :: E
   (b, b) :: E

This grammar generates { ww : w ∈ {a, b}* }. For example:

   aabaaaba :: S
     (aaba, aaba) :: P
       (aab, aab) :: P
         (aa, aa) :: P
           (a, a) :: P
             (ε, ε) :: P
             (a, a) :: E
           (a, a) :: E
         (b, b) :: E
       (a, a) :: E

Unlike in a CFG, we can ensure that the two halves are extended in the same ways without concatenating them together.
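Those rules, run forward to build aabaaaba (a sketch; the loop applies the P-extending rule once per symbol of w):

```python
def extend(p, e):
    """(t1 u1, t2 u2) :: P  <-  (t1, t2) :: P, (u1, u2) :: E"""
    (t1, t2), (u1, u2) = p, e
    return (t1 + u1, t2 + u2)

def finish(p):
    """t1 t2 :: S  <-  (t1, t2) :: P"""
    t1, t2 = p
    return t1 + t2

p = ("", "")                  # (eps, eps) :: P
for ch in "aaba":
    p = extend(p, (ch, ch))   # each step uses (a,a) :: E or (b,b) :: E
print(finish(p))              # aabaaaba
```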

30 For comparison

A CFG can generate the mirror-image language { w wR : w ∈ {a, b}* }, where the matched symbols are adjacent at the step that introduces them:

   t1 s t2 :: S  ←  t1 :: A,  s :: S,  t2 :: A
   t1 s t2 :: S  ←  t1 :: B,  s :: S,  t2 :: B
   ε :: S
   a :: A
   b :: B

   abaaaaba :: S
     a :: A
     baaaab :: S
       b :: B
       aaaa :: S
         a :: A
         aa :: S
           a :: A
           ε :: S
           a :: A
         a :: A
       b :: B
     a :: A

31 Outline
   9  A different perspective on CFGs
   10 Concatenative and non-concatenative operations
   11 MCFGs
   12 Back to MGs

32 Merge and move

merge (lexical selector: the selectee attaches on the right):

   =f α   +   f β     →     [<  α  β ]

merge (non-lexical selector: the selectee attaches on the left):

   =f α   +   f β     →     [>  β  α ]

move (the -f subtree moves to the front):

   +f α  containing  -f β     →     [>  β  α ]

33 What matters in a (derived) tree

A derived tree whose head has remaining features x, and which contains moving subtrees with remaining features -f and -g, becomes a tuple of categorized strings:

   s :: x,   t :: -f,   u :: -g

or, equivalently, a tuple-of-strings, categorized by a tuple-of-categories:

   (s, t, u) :: ⟨x, -f, -g⟩

(Michaelis 2001, Stabler and Keenan 2003)

34 Remember MG derivation trees?

   t
     +k t, -k
       will :: =v +k t
       v, -k
         =subj v
           think :: =t =subj v
           t
             +k t, -k
               will :: =v +k t
               v, -k
                 =subj v
                   eat :: =obj =subj v
                   cake :: obj
                 John :: subj -k
         Mary :: subj -k

We can tell that this tree represents a well-formed derivation, by checking the feature-manipulations at each step. How can we work out which string it derives?

Build up a tree according to the merge and move rules, and read off the leaves of that tree. But there's a simpler way.

35 Producing a string from a derivation tree

At the root of the derivation there is a move step:

   t
     +k t, -k
       ...

What do we need to have computed at the "+k t, -k" node, in order to compute the final string at the t node ("Mary will think John will eat cake")?

36 Producing a string from a derivation tree

This derived tree would do: [the derived tree for the whole clause, with "Mary :: -k" still in its pre-movement position]

But all we actually need to know is:
   What's the string corresponding to the part that's going to move to check -k?
   What's the string corresponding to the leftovers?

These questions are answered by the tuple (will think John will eat cake, Mary).

37 Producing a string from a derivation tree

   t
     +k t, -k
       will :: =v +k t
       v, -k
         ...

What do we need to have computed at the "v, -k" node, in order to compute the desired tuple (will think John will eat cake, Mary) at the "+k t, -k" node?

38 Producing a string from a derivation tree

This derived tree would do: [the derived tree for "think John will eat cake", with "Mary :: -k" still in its pre-movement position]

But all we actually need to know is:
   What's the string corresponding to the part that's going to move to check -k?
   What's the string corresponding to the leftovers?

These questions are answered by the tuple (think John will eat cake, Mary).

39 Producing a string from a derivation tree

   t
     +k t, -k
       will :: =v +k t
       v, -k
         =subj v
           ...
         Mary :: subj -k

What do we need to have computed at the "=subj v" node, in order to compute the desired tuple (think John will eat cake, Mary) at the "v, -k" node?

40 Producing a string from a derivation tree

This derived tree would do: [the derived tree for "think John will eat cake"]

But all we actually need to know is:
   What's the string corresponding to the entire tree? (The leftovers after no movement.)

This question is answered by the string "think John will eat cake".

41 What matters in a (derived) tree

A derived tree whose head has remaining features x, and which contains moving subtrees with remaining features -f and -g, becomes a tuple of categorized strings:

   s :: x,   t :: -f,   u :: -g

or, equivalently, a tuple-of-strings, categorized by a tuple-of-categories:

   (s, t, u) :: ⟨x, -f, -g⟩

(Michaelis 2001, Stabler and Keenan 2003)

42 MCFG rules

Each step of the derivation corresponds to an MCFG rule:

move at the root:
   rule:      t2 t1 :: t  ←  (t1, t2) :: ⟨+k t, -k⟩
   instance:  Mary will think John will eat cake :: t  ←  (will think John will eat cake, Mary) :: ⟨+k t, -k⟩

merge of will:
   rule:      (s t1, t2) :: ⟨+k t, -k⟩  ←  s :: =v +k t,  (t1, t2) :: ⟨v, -k⟩
   instance:  (will think John will eat cake, Mary) :: ⟨+k t, -k⟩  ←  will :: =v +k t,  (think John will eat cake, Mary) :: ⟨v, -k⟩

merge of Mary:
   rule:      (s, t) :: ⟨v, -k⟩  ←  s :: =subj v,  t :: subj -k
   instance:  (think John will eat cake, Mary) :: ⟨v, -k⟩  ←  think John will eat cake :: =subj v,  Mary :: subj -k
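Reading those three rules as functions and running them bottom-up reproduces the derived string. A sketch in Python (the function names are mine, not the slides'); the yield of a category with a pending -k is a (leftovers, mover) pair:

```python
def merge_subj(vp, subj):
    """(s, t) :: <v, -k>  <-  s :: =subj v, t :: subj -k"""
    return (vp, subj)

def merge_will(will, v_k):
    """(s t1, t2) :: <+k t, -k>  <-  s :: =v +k t, (t1, t2) :: <v, -k>"""
    t1, t2 = v_k
    return (will + " " + t1, t2)

def move_k(t_k):
    """t2 t1 :: t  <-  (t1, t2) :: <+k t, -k>   (the mover lands on the left)"""
    t1, t2 = t_k
    return t2 + " " + t1

# Embedded clause first, then the matrix clause:
inner = move_k(merge_will("will", merge_subj("eat cake", "John")))
print(inner)                    # John will eat cake
outer = merge_will("will", merge_subj("think " + inner, "Mary"))
print(move_k(outer))            # Mary will think John will eat cake
```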

43 One slightly annoying wrinkle

We know that this is a valid derivational step:

   α  ←  =f α,  f

What is the corresponding MCFG rule? Selected thing on the right?

   st :: α  ←  s :: =f α,  t :: f

Selected thing on the left?

   ts :: α  ←  s :: =f α,  t :: f

44 One slightly annoying wrinkle

A lexical selector takes the selected thing on the right:

   with John :: p
     with :: =d p
     John :: d

45 One slightly annoying wrinkle

But a non-lexical selector takes the selected thing on the left:

   John eat cake :: v
     eat cake :: =d v
     John :: d

46 One slightly annoying wrinkle

Each type needs to record not only the unchecked features, but also whether the expression is lexical. I'll write lexical types as ⟨...⟩1 and non-lexical types as ⟨...⟩0. So types of the form ⟨=f α⟩1 act slightly differently from those of the form ⟨=f α⟩0:

   st :: ⟨α⟩0  ←  s :: ⟨=f α⟩1,  t :: ⟨f⟩n
      e.g.  with John :: ⟨p⟩0  ←  with :: ⟨=d p⟩1,  John :: ⟨d⟩1

   ts :: ⟨α⟩0  ←  s :: ⟨=f α⟩0,  t :: ⟨f⟩n
      e.g.  John eat cake :: ⟨v⟩0  ←  eat cake :: ⟨=d v⟩0,  John :: ⟨d⟩1

47 Context-free structure

Derivation trees have context-free structure: what a node can be built from depends only on feature sequences. For example:

   =subj v
     =q =subj v
     q

   q
     +wh q, -wh

   +wh q, -wh
     =t +wh q
     t, -wh

48 Context-free structure

General schemas for merge steps (approximate):

   ⟨γ, α1, ..., αj, β1, ..., βk⟩     ←  ⟨=f γ, α1, ..., αj⟩,  ⟨f, β1, ..., βk⟩
   ⟨γ, α1, ..., αj, δ, β1, ..., βk⟩  ←  ⟨=f γ, α1, ..., αj⟩,  ⟨f δ, β1, ..., βk⟩

General schemas for move steps (approximate):

   ⟨γ, α1, ..., α(i-1), α(i+1), ..., αk⟩     ←  ⟨+f γ, α1, ..., α(i-1), -f, α(i+1), ..., αk⟩
   ⟨γ, α1, ..., α(i-1), δ, α(i+1), ..., αk⟩  ←  ⟨+f γ, α1, ..., α(i-1), -f δ, α(i+1), ..., αk⟩

49 Context-free structure

Move steps change something without combining it with anything. Compare with unary CFG rules, or type-raising in CCG, or ...

50

Three schemas for merge rules:

   (st, t1, ..., tk) :: ⟨γ, α1, ..., αk⟩0
      ←  s :: ⟨=f γ⟩1,  (t, t1, ..., tk) :: ⟨f, α1, ..., αk⟩n

   (ts, s1, ..., sj, t1, ..., tk) :: ⟨γ, α1, ..., αj, β1, ..., βk⟩0
      ←  (s, s1, ..., sj) :: ⟨=f γ, α1, ..., αj⟩0,  (t, t1, ..., tk) :: ⟨f, β1, ..., βk⟩n

   (s, s1, ..., sj, t, t1, ..., tk) :: ⟨γ, α1, ..., αj, δ, β1, ..., βk⟩0
      ←  (s, s1, ..., sj) :: ⟨=f γ, α1, ..., αj⟩n,  (t, t1, ..., tk) :: ⟨f δ, β1, ..., βk⟩n

Two schemas for move rules:

   (si s, s1, ..., s(i-1), s(i+1), ..., sk) :: ⟨γ, α1, ..., α(i-1), α(i+1), ..., αk⟩0
      ←  (s, s1, ..., si, ..., sk) :: ⟨+f γ, α1, ..., α(i-1), -f, α(i+1), ..., αk⟩0

   (s, s1, ..., si, ..., sk) :: ⟨γ, α1, ..., α(i-1), δ, α(i+1), ..., αk⟩0
      ←  (s, s1, ..., si, ..., sk) :: ⟨+f γ, α1, ..., α(i-1), -f δ, α(i+1), ..., αk⟩0
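The first merge schema as code: a lexical selector contributes just its own string, the selectee's head string attaches to its right, and any movers ride along unchanged. A sketch (my names, under the simplifying assumption that a yield is a tuple whose first element is the head string):

```python
def merge_lexical(s, selectee):
    """(st, t1, ..., tk) :: <gamma, a1, ..., ak>0
       <-  s :: <=f gamma>1,  (t, t1, ..., tk) :: <f, a1, ..., ak>n"""
    t, *movers = selectee          # split off the head string from the movers
    return (s + " " + t, *movers)  # concatenate heads; movers are untouched

# "will :: =v +k t" merging with ("eat cake", "John") :: <v, -k>:
print(merge_lexical("will", ("eat cake", "John")))   # ('will eat cake', 'John')
```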

51 Part 1: Grammars and cognitive hypotheses
   What is a grammar? What can grammars do?
   Concrete illustration of a target: Surprisal
Parts 2–4: Assembling the pieces
   Minimalist Grammars (MGs)
   MGs and MCFGs
   Probabilities on MGs
Part 5: Learning and wrap-up
   Something slightly different: Learning model
   Recap and open questions

52 References I

Billot, S. and Lang, B. (1989). The structure of shared forests in ambiguous parsing. In Proceedings of the 1989 Meeting of the Association for Computational Linguistics.

Chomsky, N. (1965). Aspects of the Theory of Syntax. MIT Press, Cambridge, MA.

Chomsky, N. (1980). Rules and Representations. Columbia University Press, New York.

Ferreira, F. (2005). Psycholinguistics, formal grammars, and cognitive science. The Linguistic Review, 22.

Frazier, L. and Clifton, C. (1996). Construal. MIT Press, Cambridge, MA.

Gärtner, H.-M. and Michaelis, J. (2010). On the Treatment of Multiple-Wh Interrogatives in Minimalist Grammars. In Hanneforth, T. and Fanselow, G., editors, Language and Logos. Akademie Verlag, Berlin.

Gibson, E. and Wexler, K. (1994). Triggers. Linguistic Inquiry, 25.

Hale, J. (2006). Uncertainty about the rest of the sentence. Cognitive Science, 30:643–672.

Hale, J. T. (2001). A probabilistic Earley parser as a psycholinguistic model. In Proceedings of the Second Meeting of the North American Chapter of the Association for Computational Linguistics.

Hunter, T. (2011). Insertion Minimalist Grammars: Eliminating redundancies between merge and move. In Kanazawa, M., Kornai, A., Kracht, M., and Seki, H., editors, The Mathematics of Language (MOL 12 Proceedings), volume 6878 of LNCS. Springer, Berlin Heidelberg.

Hunter, T. and Dyer, C. (2013). Distributions on minimalist grammar derivations. In Proceedings of the 13th Meeting on the Mathematics of Language.

53 References II

Koopman, H. and Szabolcsi, A. (2000). Verbal Complexes. MIT Press, Cambridge, MA.

Lang, B. (1988). Parsing incomplete sentences. In Proceedings of the 12th International Conference on Computational Linguistics.

Levy, R. (2008). Expectation-based syntactic comprehension. Cognition, 106(3).

Michaelis, J. (2001). Derivational minimalism is mildly context-sensitive. In Moortgat, M., editor, Logical Aspects of Computational Linguistics, volume 2014 of LNCS. Springer, Berlin Heidelberg.

Miller, G. A. and Chomsky, N. (1963). Finitary models of language users. In Luce, R. D., Bush, R. R., and Galanter, E., editors, Handbook of Mathematical Psychology, volume 2. Wiley and Sons, New York.

Morrill, G. (1994). Type Logical Grammar: Categorial Logic of Signs. Kluwer, Dordrecht.

Nederhof, M. J. and Satta, G. (2008). Computing partition functions of PCFGs. Research on Language and Computation, 6(2).

Seki, H., Matsumura, T., Fujii, M., and Kasami, T. (1991). On multiple context-free grammars. Theoretical Computer Science, 88.

Stabler, E. P. (2006). Sidewards without copying. In Wintner, S., editor, Proceedings of the 11th Conference on Formal Grammar. CSLI Publications, Stanford, CA.

Stabler, E. P. (2011). Computational perspectives on minimalism. In Boeckx, C., editor, The Oxford Handbook of Linguistic Minimalism. Oxford University Press, Oxford.

Stabler, E. P. and Keenan, E. L. (2003). Structural similarity within and among languages. Theoretical Computer Science, 293.

54 References III

Vijay-Shanker, K., Weir, D. J., and Joshi, A. K. (1987). Characterizing structural descriptions produced by various grammatical formalisms. In Proceedings of the 25th Meeting of the Association for Computational Linguistics.

Weir, D. (1988). Characterizing mildly context-sensitive grammar formalisms. PhD thesis, University of Pennsylvania.

Yngve, V. H. (1960). A model and an hypothesis for language structure. In Proceedings of the American Philosophical Society, volume 104.


More information

Structural Induction

Structural Induction Structural Induction In this lecture we ll extend the applicability of induction to many universes, ones where we can define certain kinds of objects by induction, in addition to proving their properties

More information

Attendee information. Seven Lectures on Statistical Parsing. Phrase structure grammars = context-free grammars. Assessment.

Attendee information. Seven Lectures on Statistical Parsing. Phrase structure grammars = context-free grammars. Assessment. even Lectures on tatistical Parsing Christopher Manning LA Linguistic Institute 7 LA Lecture Attendee information Please put on a piece of paper: ame: Affiliation: tatus (undergrad, grad, industry, prof,

More information

Parsing. Context-Free Grammars (CFG) Laura Kallmeyer. Winter 2017/18. Heinrich-Heine-Universität Düsseldorf 1 / 26

Parsing. Context-Free Grammars (CFG) Laura Kallmeyer. Winter 2017/18. Heinrich-Heine-Universität Düsseldorf 1 / 26 Parsing Context-Free Grammars (CFG) Laura Kallmeyer Heinrich-Heine-Universität Düsseldorf Winter 2017/18 1 / 26 Table of contents 1 Context-Free Grammars 2 Simplifying CFGs Removing useless symbols Eliminating

More information

Einführung in die Computerlinguistik

Einführung in die Computerlinguistik Einführung in die Computerlinguistik Context-Free Grammars (CFG) Laura Kallmeyer Heinrich-Heine-Universität Düsseldorf Summer 2016 1 / 22 CFG (1) Example: Grammar G telescope : Productions: S NP VP NP

More information

Computational Models - Lecture 3

Computational Models - Lecture 3 Slides modified by Benny Chor, based on original slides by Maurice Herlihy, Brown University. p. 1 Computational Models - Lecture 3 Equivalence of regular expressions and regular languages (lukewarm leftover

More information

CAS LX 522 Syntax I Fall 2000 October 10, 2000 Week 5: Case Theory and θ Theory. θ-theory continued

CAS LX 522 Syntax I Fall 2000 October 10, 2000 Week 5: Case Theory and θ Theory. θ-theory continued CAS LX 522 Syntax I Fall 2000 October 0, 2000 Paul Hagstrom Week 5: Case Theory and θ Theory θ-theory continued From last time: verbs have θ-roles (e.g., Agent, Theme, ) to assign, specified in the lexicon

More information

Roger Levy Probabilistic Models in the Study of Language draft, October 2,

Roger Levy Probabilistic Models in the Study of Language draft, October 2, Roger Levy Probabilistic Models in the Study of Language draft, October 2, 2012 224 Chapter 10 Probabilistic Grammars 10.1 Outline HMMs PCFGs ptsgs and ptags Highlight: Zuidema et al., 2008, CogSci; Cohn

More information

CISC4090: Theory of Computation

CISC4090: Theory of Computation CISC4090: Theory of Computation Chapter 2 Context-Free Languages Courtesy of Prof. Arthur G. Werschulz Fordham University Department of Computer and Information Sciences Spring, 2014 Overview In Chapter

More information

Complexity, Parsing, and Factorization of Tree-Local Multi-Component Tree-Adjoining Grammar

Complexity, Parsing, and Factorization of Tree-Local Multi-Component Tree-Adjoining Grammar Complexity, Parsing, and Factorization of Tree-Local Multi-Component Tree-Adjoining Grammar Rebecca Nesson School of Engineering and Applied Sciences Harvard University Giorgio Satta Department of Information

More information

Languages. Languages. An Example Grammar. Grammars. Suppose we have an alphabet V. Then we can write:

Languages. Languages. An Example Grammar. Grammars. Suppose we have an alphabet V. Then we can write: Languages A language is a set (usually infinite) of strings, also known as sentences Each string consists of a sequence of symbols taken from some alphabet An alphabet, V, is a finite set of symbols, e.g.

More information

HPSG: Binding Theory

HPSG: Binding Theory HPSG: Binding Theory Doug Arnold doug@essexacuk Introduction Binding Theory is to do with the syntactic restrictions on the distribution of referentially dependent items and their antecedents: reflexives/reciprocals

More information

The Formal Architecture of. Lexical-Functional Grammar. Ronald M. Kaplan and Mary Dalrymple

The Formal Architecture of. Lexical-Functional Grammar. Ronald M. Kaplan and Mary Dalrymple The Formal Architecture of Lexical-Functional Grammar Ronald M. Kaplan and Mary Dalrymple Xerox Palo Alto Research Center 1. Kaplan and Dalrymple, ESSLLI 1995, Barcelona Architectural Issues Representation:

More information

MoL 13. The 13th Meeting on the Mathematics of Language. Proceedings

MoL 13. The 13th Meeting on the Mathematics of Language. Proceedings MoL 13 The 13th Meeting on the Mathematics of Language Proceedings August 9, 2013 Sofia, Bulgaria Production and Manufacturing by Omnipress, Inc. 2600 Anderson Street Madison, WI 53704 USA c 2013 The Association

More information

Language Learning Problems in the Principles and Parameters Framework

Language Learning Problems in the Principles and Parameters Framework Language Learning Problems in the Principles and Parameters Framework Partha Niyogi Presented by Chunchuan Lyu March 22, 2016 Partha Niyogi (presented by c.c. lyu) Language Learning March 22, 2016 1 /

More information

Computational Models - Lecture 5 1

Computational Models - Lecture 5 1 Computational Models - Lecture 5 1 Handout Mode Iftach Haitner. Tel Aviv University. November 28, 2016 1 Based on frames by Benny Chor, Tel Aviv University, modifying frames by Maurice Herlihy, Brown University.

More information

Computational Models - Lecture 4

Computational Models - Lecture 4 Computational Models - Lecture 4 Regular languages: The Myhill-Nerode Theorem Context-free Grammars Chomsky Normal Form Pumping Lemma for context free languages Non context-free languages: Examples Push

More information

A proof theoretical account of polarity items and monotonic inference.

A proof theoretical account of polarity items and monotonic inference. A proof theoretical account of polarity items and monotonic inference. Raffaella Bernardi UiL OTS, University of Utrecht e-mail: Raffaella.Bernardi@let.uu.nl Url: http://www.let.uu.nl/ Raffaella.Bernardi/personal

More information

Strong connectivity hypothesis and generative power in TAG

Strong connectivity hypothesis and generative power in TAG 15 Strong connectivity hypothesis and generative power in TAG Alessandro Mazzei, Vincenzo ombardo and Patrick Sturt Abstract Dynamic grammars are relevant for psycholinguistic modelling and speech processing.

More information

Model-Theory of Property Grammars with Features

Model-Theory of Property Grammars with Features Model-Theory of Property Grammars with Features Denys Duchier Thi-Bich-Hanh Dao firstname.lastname@univ-orleans.fr Yannick Parmentier Abstract In this paper, we present a model-theoretic description of

More information

NPy [wh] NP# VP V NP e Figure 1: An elementary tree only the substitution operation, since, by using descriptions of trees as the elementary objects o

NPy [wh] NP# VP V NP e Figure 1: An elementary tree only the substitution operation, since, by using descriptions of trees as the elementary objects o Exploring the Underspecied World of Lexicalized Tree Adjoining Grammars K. Vijay-hanker Department of Computer & Information cience University of Delaware Newark, DE 19716 vijay@udel.edu David Weir chool

More information

Algorithms for Syntax-Aware Statistical Machine Translation

Algorithms for Syntax-Aware Statistical Machine Translation Algorithms for Syntax-Aware Statistical Machine Translation I. Dan Melamed, Wei Wang and Ben Wellington ew York University Syntax-Aware Statistical MT Statistical involves machine learning (ML) seems crucial

More information

SEMANTICS OF POSSESSIVE DETERMINERS STANLEY PETERS DAG WESTERSTÅHL

SEMANTICS OF POSSESSIVE DETERMINERS STANLEY PETERS DAG WESTERSTÅHL SEMANTICS OF POSSESSIVE DETERMINERS STANLEY PETERS DAG WESTERSTÅHL Linguistics Department, Stanford University Department of Philosophy, Göteborg University peters csli.stanford.edu, dag.westerstahl phil.gu.se

More information

On rigid NL Lambek grammars inference from generalized functor-argument data

On rigid NL Lambek grammars inference from generalized functor-argument data 7 On rigid NL Lambek grammars inference from generalized functor-argument data Denis Béchet and Annie Foret Abstract This paper is concerned with the inference of categorial grammars, a context-free grammar

More information

Parsing. Probabilistic CFG (PCFG) Laura Kallmeyer. Winter 2017/18. Heinrich-Heine-Universität Düsseldorf 1 / 22

Parsing. Probabilistic CFG (PCFG) Laura Kallmeyer. Winter 2017/18. Heinrich-Heine-Universität Düsseldorf 1 / 22 Parsing Probabilistic CFG (PCFG) Laura Kallmeyer Heinrich-Heine-Universität Düsseldorf Winter 2017/18 1 / 22 Table of contents 1 Introduction 2 PCFG 3 Inside and outside probability 4 Parsing Jurafsky

More information

A Computational Approach to Minimalism

A Computational Approach to Minimalism A Computational Approach to Minimalism Alain LECOMTE CLIPS-IMAG BP-53 38041 Grenoble, cedex 9, France Email: Alain.Lecomte@upmf-grenoble.fr Abstract The aim of this paper is to recast minimalist principles

More information

Parsing Linear Context-Free Rewriting Systems with Fast Matrix Multiplication

Parsing Linear Context-Free Rewriting Systems with Fast Matrix Multiplication Parsing Linear Context-Free Rewriting Systems with Fast Matrix Multiplication Shay B. Cohen University of Edinburgh Daniel Gildea University of Rochester We describe a recognition algorithm for a subset

More information

Harvard CS 121 and CSCI E-207 Lecture 9: Regular Languages Wrap-Up, Context-Free Grammars

Harvard CS 121 and CSCI E-207 Lecture 9: Regular Languages Wrap-Up, Context-Free Grammars Harvard CS 121 and CSCI E-207 Lecture 9: Regular Languages Wrap-Up, Context-Free Grammars Salil Vadhan October 2, 2012 Reading: Sipser, 2.1 (except Chomsky Normal Form). Algorithmic questions about regular

More information

Other types of Movement

Other types of Movement Other types of Movement So far we seen Wh-movement, which moves certain types of (XP) constituents to the specifier of a CP. Wh-movement is also called A-bar movement. We will look at two more types of

More information

Efficient Parsing of Well-Nested Linear Context-Free Rewriting Systems

Efficient Parsing of Well-Nested Linear Context-Free Rewriting Systems Efficient Parsing of Well-Nested Linear Context-Free Rewriting Systems Carlos Gómez-Rodríguez 1, Marco Kuhlmann 2, and Giorgio Satta 3 1 Departamento de Computación, Universidade da Coruña, Spain, cgomezr@udc.es

More information

Introduction to Computational Linguistics

Introduction to Computational Linguistics Introduction to Computational Linguistics Olga Zamaraeva (2018) Based on Bender (prev. years) University of Washington May 3, 2018 1 / 101 Midterm Project Milestone 2: due Friday Assgnments 4& 5 due dates

More information

Introduction to Semantics. The Formalization of Meaning 1

Introduction to Semantics. The Formalization of Meaning 1 The Formalization of Meaning 1 1. Obtaining a System That Derives Truth Conditions (1) The Goal of Our Enterprise To develop a system that, for every sentence S of English, derives the truth-conditions

More information

Parsing. Unger s Parser. Laura Kallmeyer. Winter 2016/17. Heinrich-Heine-Universität Düsseldorf 1 / 21

Parsing. Unger s Parser. Laura Kallmeyer. Winter 2016/17. Heinrich-Heine-Universität Düsseldorf 1 / 21 Parsing Unger s Parser Laura Kallmeyer Heinrich-Heine-Universität Düsseldorf Winter 2016/17 1 / 21 Table of contents 1 Introduction 2 The Parser 3 An Example 4 Optimizations 5 Conclusion 2 / 21 Introduction

More information

Epsilon-Free Grammars and Lexicalized Grammars that Generate the Class of the Mildly Context-Sensitive Languages

Epsilon-Free Grammars and Lexicalized Grammars that Generate the Class of the Mildly Context-Sensitive Languages psilon-free rammars and Leicalized rammars that enerate the Class of the Mildly Contet-Sensitive Languages Akio FUJIYOSHI epartment of Computer and Information Sciences, Ibaraki University 4-12-1 Nakanarusawa,

More information

The word problem in Z 2 and formal language theory

The word problem in Z 2 and formal language theory The word problem in Z 2 and formal language theory Sylvain Salvati INRIA Bordeaux Sud-Ouest Topology and languages June 22-24 Outline The group language of Z 2 A similar problem in computational linguistics

More information

Unit 2: Tree Models. CS 562: Empirical Methods in Natural Language Processing. Lectures 19-23: Context-Free Grammars and Parsing

Unit 2: Tree Models. CS 562: Empirical Methods in Natural Language Processing. Lectures 19-23: Context-Free Grammars and Parsing CS 562: Empirical Methods in Natural Language Processing Unit 2: Tree Models Lectures 19-23: Context-Free Grammars and Parsing Oct-Nov 2009 Liang Huang (lhuang@isi.edu) Big Picture we have already covered...

More information

Probabilistic Context-Free Grammars. Michael Collins, Columbia University

Probabilistic Context-Free Grammars. Michael Collins, Columbia University Probabilistic Context-Free Grammars Michael Collins, Columbia University Overview Probabilistic Context-Free Grammars (PCFGs) The CKY Algorithm for parsing with PCFGs A Probabilistic Context-Free Grammar

More information

Topics in Lexical-Functional Grammar. Ronald M. Kaplan and Mary Dalrymple. Xerox PARC. August 1995

Topics in Lexical-Functional Grammar. Ronald M. Kaplan and Mary Dalrymple. Xerox PARC. August 1995 Projections and Semantic Interpretation Topics in Lexical-Functional Grammar Ronald M. Kaplan and Mary Dalrymple Xerox PARC August 199 Kaplan and Dalrymple, ESSLLI 9, Barcelona 1 Constituent structure

More information

Notes for Comp 497 (Comp 454) Week 10 4/5/05

Notes for Comp 497 (Comp 454) Week 10 4/5/05 Notes for Comp 497 (Comp 454) Week 10 4/5/05 Today look at the last two chapters in Part II. Cohen presents some results concerning context-free languages (CFL) and regular languages (RL) also some decidability

More information

Finite Automata Theory and Formal Languages TMV027/DIT321 LP4 2018

Finite Automata Theory and Formal Languages TMV027/DIT321 LP4 2018 Finite Automata Theory and Formal Languages TMV027/DIT321 LP4 2018 Lecture 14 Ana Bove May 14th 2018 Recap: Context-free Grammars Simplification of grammars: Elimination of ǫ-productions; Elimination of

More information

The interplay of fact and theory in separating syntax from meaning

The interplay of fact and theory in separating syntax from meaning 2 4 The interplay of fact and theory in separating syntax from meaning Wilfrid Hodges Queen Mary, University of London August 2005 The naive mathematician infers: Language = Syntax + (Other factors) so

More information

Generalized Quantifiers Logical and Linguistic Aspects

Generalized Quantifiers Logical and Linguistic Aspects Generalized Quantifiers Logical and Linguistic Aspects Lecture 1: Formal Semantics and Generalized Quantifiers Dag Westerståhl University of Gothenburg SELLC 2010 Institute for Logic and Cognition, Sun

More information

(Ordinary) mathematical induction is a way to prove things about natural numbers. We can write it as a rule: φ(0) x. φ(x) φ(succ(x)) x.

(Ordinary) mathematical induction is a way to prove things about natural numbers. We can write it as a rule: φ(0) x. φ(x) φ(succ(x)) x. 1 Ordinary Induction (Ordinary) mathematical induction is a way to prove things about natural numbers. We can write it as a rule: φ(0) x. φ(x) φ(succ(x)) x. φ(x) where φ(x) is a statement involving a number

More information

Natural Language Processing

Natural Language Processing SFU NatLangLab Natural Language Processing Anoop Sarkar anoopsarkar.github.io/nlp-class Simon Fraser University September 27, 2018 0 Natural Language Processing Anoop Sarkar anoopsarkar.github.io/nlp-class

More information

LECTURER: BURCU CAN Spring

LECTURER: BURCU CAN Spring LECTURER: BURCU CAN 2017-2018 Spring Regular Language Hidden Markov Model (HMM) Context Free Language Context Sensitive Language Probabilistic Context Free Grammar (PCFG) Unrestricted Language PCFGs can

More information

Unterspezifikation in der Semantik Scope Semantics in Lexicalized Tree Adjoining Grammars

Unterspezifikation in der Semantik Scope Semantics in Lexicalized Tree Adjoining Grammars in der emantik cope emantics in Lexicalized Tree Adjoining Grammars Laura Heinrich-Heine-Universität Düsseldorf Wintersemester 2011/2012 LTAG: The Formalism (1) Tree Adjoining Grammars (TAG): Tree-rewriting

More information

Section 4.6 Negative Exponents

Section 4.6 Negative Exponents Section 4.6 Negative Exponents INTRODUCTION In order to understand negative exponents the main topic of this section we need to make sure we understand the meaning of the reciprocal of a number. Reciprocals

More information

In this chapter, we explore the parsing problem, which encompasses several questions, including:

In this chapter, we explore the parsing problem, which encompasses several questions, including: Chapter 12 Parsing Algorithms 12.1 Introduction In this chapter, we explore the parsing problem, which encompasses several questions, including: Does L(G) contain w? What is the highest-weight derivation

More information

The semantics of propositional logic

The semantics of propositional logic The semantics of propositional logic Readings: Sections 1.3 and 1.4 of Huth and Ryan. In this module, we will nail down the formal definition of a logical formula, and describe the semantics of propositional

More information

Natural Language Processing CS Lecture 06. Razvan C. Bunescu School of Electrical Engineering and Computer Science

Natural Language Processing CS Lecture 06. Razvan C. Bunescu School of Electrical Engineering and Computer Science Natural Language Processing CS 6840 Lecture 06 Razvan C. Bunescu School of Electrical Engineering and Computer Science bunescu@ohio.edu Statistical Parsing Define a probabilistic model of syntax P(T S):

More information

Decoding and Inference with Syntactic Translation Models

Decoding and Inference with Syntactic Translation Models Decoding and Inference with Syntactic Translation Models March 5, 2013 CFGs S NP VP VP NP V V NP NP CFGs S NP VP S VP NP V V NP NP CFGs S NP VP S VP NP V NP VP V NP NP CFGs S NP VP S VP NP V NP VP V NP

More information

A Multi-Modal Combinatory Categorial Grammar. A MMCCG Analysis of Japanese Nonconstituent Clefting

A Multi-Modal Combinatory Categorial Grammar. A MMCCG Analysis of Japanese Nonconstituent Clefting A Multi-Modal Combinatory Categorial Grammar Analysis of Japanese Nonconstituent Clefting Department of Linguistics The Ohio State University {kubota,esmith}@ling.osu.edu A basic cleft sentence Data Clefting

More information

Plan for 2 nd half. Just when you thought it was safe. Just when you thought it was safe. Theory Hall of Fame. Chomsky Normal Form

Plan for 2 nd half. Just when you thought it was safe. Just when you thought it was safe. Theory Hall of Fame. Chomsky Normal Form Plan for 2 nd half Pumping Lemma for CFLs The Return of the Pumping Lemma Just when you thought it was safe Return of the Pumping Lemma Recall: With Regular Languages The Pumping Lemma showed that if a

More information

An introduction to mildly context sensitive grammar formalisms. Combinatory Categorial Grammar

An introduction to mildly context sensitive grammar formalisms. Combinatory Categorial Grammar An introduction to mildly context sensitive grammar formalisms Combinatory Categorial Grammar Gerhard Jäger & Jens Michaelis University of Potsdam {jaeger,michael}@ling.uni-potsdam.de p.1 Basic Categorial

More information

Transforming Linear Context Free Rewriting Systems into Minimalist Grammars

Transforming Linear Context Free Rewriting Systems into Minimalist Grammars Transforming Linear Context Free Rewriting Systems into Minimalist Grammars Jens Michaelis Universität Potsdam, Institut für Linguistik, PF 601553, 14415 Potsdam, Germany michael@ling.uni-potsdam.de Abstract.

More information

Problem Session 5 (CFGs) Talk about the building blocks of CFGs: S 0S 1S ε - everything. S 0S0 1S1 A - waw R. S 0S0 0S1 1S0 1S1 A - xay, where x = y.

Problem Session 5 (CFGs) Talk about the building blocks of CFGs: S 0S 1S ε - everything. S 0S0 1S1 A - waw R. S 0S0 0S1 1S0 1S1 A - xay, where x = y. CSE2001, Fall 2006 1 Problem Session 5 (CFGs) Talk about the building blocks of CFGs: S 0S 1S ε - everything. S 0S0 1S1 A - waw R. S 0S0 0S1 1S0 1S1 A - xay, where x = y. S 00S1 A - xay, where x = 2 y.

More information

(7) a. [ PP to John], Mary gave the book t [PP]. b. [ VP fix the car], I wonder whether she will t [VP].

(7) a. [ PP to John], Mary gave the book t [PP]. b. [ VP fix the car], I wonder whether she will t [VP]. CAS LX 522 Syntax I Fall 2000 September 18, 2000 Paul Hagstrom Week 2: Movement Movement Last time, we talked about subcategorization. (1) a. I can solve this problem. b. This problem, I can solve. (2)

More information

Putting Some Weakly Context-Free Formalisms in Order

Putting Some Weakly Context-Free Formalisms in Order Putting Some Weakly Context-Free Formalisms in Order David Chiang University of Pennsylvania 1. Introduction A number of formalisms have been proposed in order to restrict tree adjoining grammar (TAG)

More information

Where linguistic meaning meets non-linguistic cognition

Where linguistic meaning meets non-linguistic cognition Where linguistic meaning meets non-linguistic cognition Tim Hunter and Paul Pietroski NASSLLI 2016 Friday: Putting things together (perhaps) Outline 5 Experiments with kids on most 6 So: How should we

More information

THE DRAVIDIAN EXPERIENCER CONSTRUCTION AND THE ENGLISH SEEM CONSTRUCTION. K. A. Jayaseelan CIEFL, Hyderabad

THE DRAVIDIAN EXPERIENCER CONSTRUCTION AND THE ENGLISH SEEM CONSTRUCTION. K. A. Jayaseelan CIEFL, Hyderabad THE DRAVIDIAN EXPERIENCER CONSTRUCTION AND THE ENGLISH SEEM CONSTRUCTION K. A. Jayaseelan CIEFL, Hyderabad 1. Introduction In many languages e.g. Malayalam, Tamil, Hindi, the same verb is used in the Experiencer

More information

Well-Nestedness Properly Subsumes Strict Derivational Minimalism

Well-Nestedness Properly Subsumes Strict Derivational Minimalism Well-Nestedness Properly Subsumes Strict Derivational Minimalism Makoto Kanazawa, Jens Michaelis, Sylvain Salvati, Ryo Yoshinaka To cite this version: Makoto Kanazawa, Jens Michaelis, Sylvain Salvati,

More information

S NP VP 0.9 S VP 0.1 VP V NP 0.5 VP V 0.1 VP V PP 0.1 NP NP NP 0.1 NP NP PP 0.2 NP N 0.7 PP P NP 1.0 VP NP PP 1.0. N people 0.

S NP VP 0.9 S VP 0.1 VP V NP 0.5 VP V 0.1 VP V PP 0.1 NP NP NP 0.1 NP NP PP 0.2 NP N 0.7 PP P NP 1.0 VP  NP PP 1.0. N people 0. /6/7 CS 6/CS: Natural Language Processing Instructor: Prof. Lu Wang College of Computer and Information Science Northeastern University Webpage: www.ccs.neu.edu/home/luwang The grammar: Binary, no epsilons,.9..5

More information