Computational Grammar
Lecturer: Paola Monachesi

Contents
1. Unbounded dependency constructions (UDCs)
2. Some data
3. Filler-gap constructions
4. Traces
5. The Complement Extraction Lexical Rule
6. Tough Constructions
1. Unbounded dependency constructions (UDCs)

The cooccurrence restrictions analyzed so far are all quite local, since they involve limitations on what can occur together as elements of a single clause. This locality has been extended slightly in the analysis of raising, since the cooccurrence restrictions of one verb are transmitted to the higher verb.

We now turn to a new class of constructions in which the locality of cooccurrence restrictions appears to be violated in a more radical way: two elements appear far from one another in a sentence, despite the existence of a syntactic dependency between them.
2. Some data

Why are the following examples ungrammatical?

(1) *They gave to the man.
(2) *They gave the book.
(3) *You have talked to
(4) *The men discovered
Why are the following examples grammatical?

(5) What did they give to the man?
(6) To whom did they give the book?
(7) Whom have you talked to?
(8) What did the men discover?
Why are the following examples grammatical?

(9) The book which they gave to the man...
(10) The man that they gave the book to...
(11) The man who you have talked to...
(12) The planet that the men discovered...
Why are the following examples grammatical?

(13) That book, they gave to the man.
(14) The man, they gave the book to.
(15) That man, you have talked to.
(16) A new planet, the men have discovered.
In the examples above, there is a dependency between an extra phrase or filler at the beginning of a clause and a gap somewhere within the clause.
Elements which cannot normally be missing from a clause are allowed to be missing if there is an appropriate filler in the right place.
If there is a filler, there must be a gap.
The filler can be separated from the gap by extra clauses.
In English, the class of UDCs includes the following phenomena:

(17) a. Kim_1, Sandy loves __1. (topicalization)
b. I wonder [who_1 Sandy loves __1]. (wh-question)
c. This is the politician [who_1 Sandy loves __1]. (wh-relative clause)
d. It's Kim [who_1 Sandy loves __1]. (it-cleft)
e. [What_1 Kim loves __1] is Sandy. (pseudocleft)

(18) a. I bought it_1 for Sandy to eat __1. (purpose infinitive)
b. Sandy_1 is hard to love __1. (tough movement)
c. This is the politician_1 [Sandy loves __1]. (relative clause)
d. It's Kim_1 [Sandy loves __1]. (it-cleft)
The examples in (17) are considered filler-gap constructions, or strong UDCs, while the examples in (18) are classified as weak UDCs.
UDCs are characterized by the fact that:
The relevant dependency may extend across arbitrarily many clause boundaries.
The filler and the gap should be of the same syntactic category.

(19) a. Kim_1, Dana believes Chris knows Sandy trusts __1
b. [On Kim_1], Dana believes Chris knows Sandy depends __1
c. *[On Kim_1], Dana believes Chris knows Sandy trusts __1

Given the nonlocal character of these constructions, they are accounted for by means of nonlocal features.
The nonlocal features and their values:

SLASH  (set of local structures)
REL    (set of referential indices)
QUE    (set of npros)
3. Filler-gap constructions

A UDC can be described as divided into three parts: a bottom, a middle and a top.
The bottom is where the dependency is introduced.
The middle is where it is successively passed from daughter to mother.
The top is where the dependency is discharged.

In the analysis presented in P&S94 the dependency is introduced by means of a trace. In the revisions presented in ch. 9, a traceless analysis is proposed.
The trace-based analysis of "Kim, we know Sandy hates __":

S
├── NP[LOCAL 1]: Kim
└── S[SLASH {1}]
    ├── NP: we
    └── VP[SLASH {1}]
        ├── V: know
        └── S[SLASH {1}]
            ├── NP: Sandy
            └── VP[SLASH {1}]
                ├── V: hates
                └── NP[LOCAL 1, SLASH {1}]: trace
4. Traces

A trace is a special lexical item with a quite impoverished structure:

PHON    < >
SYNSEM  LOCAL     1
        NONLOCAL  SLASH  {1}
                  QUE    { }
                  REL    { }

If a trace occurs as complement of some head, it will structure-share the local features which are specified for that complement by the head.

The general assumption is that traces have a detectable psycholinguistic reality: the comprehension of a filler-gap sentence is complete only when a trace is processed and identified with the filler.

A study by Pickering and Barry (1991) tries to prove the non-existence of traces in filler-gap constructions:

(20) Which box did you put the cake in?
(21) Which box did you put the very large and beautifully decorated wedding cake bought from the expensive bakery in?
(22) In which box did you put the very large and beautifully decorated wedding cake bought from the expensive bakery?

Comprehension is complete not when a trace position is found, but rather when an appropriate lexical element is processed (i.e., the verbal head whose complement is associated with the filler).
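The key property of the trace is structure sharing: its LOCAL value and the single member of its SLASH set are one and the same object. A minimal sketch (not from the course materials; the class and dict layout are my own assumptions) models this with Python object identity:

```python
class Local:
    """Placeholder for a LOCAL feature structure (CAT, CONTENT, ...)."""
    def __init__(self):
        self.cat = None  # to be filled in by the selecting head

def make_trace():
    shared = Local()                       # tag [1] in the AVM above
    return {
        "PHON": [],                        # traces are phonologically empty
        "SYNSEM": {
            "LOCAL": shared,               # LOCAL [1]
            "NONLOCAL": {
                "SLASH": {shared},         # SLASH {[1]}: the *same* object
                "QUE": set(),
                "REL": set(),
            },
        },
    }

trace = make_trace()
# The head that selects the trace instantiates its LOCAL value...
trace["SYNSEM"]["LOCAL"].cat = "NP[acc]"
# ...and, by structure sharing, the SLASH member is instantiated too.
gap = next(iter(trace["SYNSEM"]["NONLOCAL"]["SLASH"]))
assert gap is trace["SYNSEM"]["LOCAL"]
assert gap.cat == "NP[acc]"
```

Because the two positions hold one object rather than two copies, any constraint the head imposes on the complement automatically constrains what can ultimately fill the gap.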
5. The Complement Extraction Lexical Rule

The extraction of complements is accounted for by means of a lexical rule; the dependency is introduced without the use of traces.

(23) Complement Extraction Lexical Rule (CELR)

Input:   COMPS        < ..., 3[LOC 1], ... >
         INHER|SLASH  2
         ARG-S        < ..., 3, ... >

Output:  COMPS        < ..., ... >            (3 removed)
         INHER|SLASH  {1} ∪ 2
         ARG-S        < ..., 4[LOC 1, INHER|SLASH {1}], ... >
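The effect of the CELR can be sketched as an operation on simplified lexical entries (a hypothetical encoding, not the actual P&S94 formalism): remove one synsem from COMPS, add its LOCAL value to INHER|SLASH, and keep a gap-synsem carrying that SLASH value on ARG-S.

```python
def celr(entry, i):
    """Apply the Complement Extraction Lexical Rule to the i-th complement."""
    extracted = entry["COMPS"][i]          # tag [3] in (23)
    local = extracted["LOC"]               # tag [1]
    gap = {"LOC": local, "INHER_SLASH": frozenset([local])}   # tag [4]
    return {
        **entry,
        "COMPS": entry["COMPS"][:i] + entry["COMPS"][i + 1:],  # [3] removed
        "INHER_SLASH": entry["INHER_SLASH"] | frozenset([local]),
        "ARG_S": [gap if a is extracted else a for a in entry["ARG_S"]],
    }

# "hates" with its object complement extracted, as in the tree above:
obj = {"LOC": "NP[acc]"}
hates = {"PHON": "hates", "COMPS": [obj],
         "ARG_S": [{"LOC": "NP[nom]"}, obj], "INHER_SLASH": frozenset()}
hates_gap = celr(hates, 0)
assert hates_gap["COMPS"] == []
assert "NP[acc]" in hates_gap["INHER_SLASH"]
```

Note that ARG-S is left intact (with a gap-synsem in place of the complement), so binding theory still sees the extracted argument even though it no longer appears on COMPS.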
The middle of the dependency is where the information associated with the INHER|SLASH feature is passed up to the mother. The percolation of this feature is regulated by the Nonlocal Feature Principle:

(24) Nonlocal Feature Principle (NFP)
For each nonlocal feature, the INHERITED value on the mother is the union of the INHERITED values on the daughters minus the TO-BIND value on the head daughter.
The TO-BIND feature is used to prevent the percolation of the SLASH feature once its value has been identified with the local features of an appropriate filler.
(25) Structure of nonlocal

nonlocal
  INHER    SLASH  (set of local structures)
           REL    (set of referential indices)
           QUE    (set of npros)
  TO-BIND  SLASH  (set of local structures)
           REL    (set of referential indices)
           QUE    (set of npros)

The dependency is bound off by the head-filler schema.
(26) Head-filler schema

hd-fill-ph
  NH-DTR  SS [LOC 1]
  HD-DTR  SS  LOC|CAT  HEAD   verb[VFORM fin]
                       SUBJ   < >
                       COMPS  < >
              NONLOC   INHER   [SLASH {1}]
                       TO-BIND [SLASH {1}]
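The schema can be read as a licensing check on a filler-head pair. A sketch under the same hypothetical dict encoding as above: the non-head daughter's LOCAL value must be in the head daughter's INHER|SLASH, the head must be a saturated finite verbal projection, and TO-BIND|SLASH discharges exactly that element.

```python
def head_filler_ok(filler, head):
    """Does this filler + head daughter pair satisfy schema (26)?"""
    loc = filler["LOC"]                                   # tag [1]
    return (head["HEAD"] == "verb" and head["VFORM"] == "fin"
            and head["SUBJ"] == [] and head["COMPS"] == []  # saturated S
            and loc in head["INHER_SLASH"]                # [1] in INHER|SLASH
            and head["TO_BIND_SLASH"] == {loc})           # bound off here

kim = {"LOC": "NP:Kim"}
s = {"HEAD": "verb", "VFORM": "fin", "SUBJ": [], "COMPS": [],
     "INHER_SLASH": {"NP:Kim"}, "TO_BIND_SLASH": {"NP:Kim"}}
assert head_filler_ok(kim, s)
```

Combined with the NFP, the TO-BIND|SLASH value here guarantees that the mother's INHER|SLASH is empty: the dependency does not percolate past the filler.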
The traceless analysis of "Kim, we know Sandy hates __":

S[INH SLASH { }]
├── NP[LOCAL 1]: Kim
└── S[INH SLASH {1}, TO-BD SLASH {1}]
    ├── NP: we
    └── VP[INH SLASH {1}]
        ├── V: know
        └── S[INH SLASH {1}]
            ├── NP: Sandy
            └── VP[INH SLASH {1}]
                └── V[INH SLASH {1}]: hates   (no trace; the SLASH value is introduced lexically by the CELR)
6. Tough Constructions

(27) I_1 (nom) am easy to please __1 (acc)

This is an example of a weak UDC: there is no filler corresponding to the trace, but a constituent in an argument position which is coindexed with the trace.
The trace and the coindexed subject need not have the same case.
The bottom and middle of the dependency are treated as in strong UDCs, but the top differs.

(28) easy:
LOC|CAT   HEAD   adjective
          SUBJ   < NP_1 >
          COMPS  < (PP[for]), VP[inf, INHER|SLASH {2 NP[acc]:ppro_1}, ...] >
NONLOCAL  TO-BIND|SLASH {2}

The sign for the tough adjective specifies the link between the subject and the INHER|SLASH value. The structure sharing between the tough adjective's TO-BIND|SLASH value and the INHER|SLASH value on the VP complement prevents the propagation of the SLASH value once it has been bound.
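A sketch of the entry in (28), in the same hypothetical encoding as the earlier examples: the subject NP shares only its index with the accusative pronominal NP inside the VP complement's INHER|SLASH, so index, but not case, is shared.

```python
def easy_entry():
    idx = "i"                                   # shared index, tag [1]
    slashed_np = {"CAT": "NP", "CASE": "acc",
                  "CONT": ("ppro", idx)}        # tag [2] in (28)
    return {
        "HEAD": "adjective",
        "SUBJ": [{"CAT": "NP", "INDEX": idx}],  # NP_[1]; case unconstrained
        "COMPS": ["(PP[for])",                  # optional for-PP
                  {"CAT": "VP", "VFORM": "inf",
                   "INHER_SLASH": [slashed_np]}],
        "TO_BIND_SLASH": [slashed_np],          # bound off by "easy" itself
    }

e = easy_entry()
slashed = e["TO_BIND_SLASH"][0]
assert slashed is e["COMPS"][1]["INHER_SLASH"][0]   # structure sharing of [2]
assert slashed["CONT"][1] == e["SUBJ"][0]["INDEX"]  # coindexed with subject
assert slashed["CASE"] == "acc"                     # case of the gap, not of the subject
```

This captures why (27) is well-formed: the nominative subject and the accusative gap are linked by coindexation alone, while TO-BIND|SLASH on the adjective stops the SLASH value from percolating any higher.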
The analysis of "I am easy to please":

S
├── 3 NP_1: I
└── VP[SUBJ <3>]
    ├── V: am
    └── AP[SUBJ <3 NP_1>, INH SLASH { }]
        ├── A[SUBJ <3 NP_1>, TO-BD SLASH {2}]: easy
        └── VP[inf, INH SLASH {2}]: to please
More informationSemantics and Generative Grammar. Quantificational DPs, Part 3: Covert Movement vs. Type Shifting 1
Quantificational DPs, Part 3: Covert Movement vs. Type Shifting 1 1. Introduction Thus far, we ve considered two competing analyses of sentences like those in (1). (1) Sentences Where a Quantificational
More informationA DOP Model for LFG. Rens Bod and Ronald Kaplan. Kathrin Spreyer Data-Oriented Parsing, 14 June 2005
A DOP Model for LFG Rens Bod and Ronald Kaplan Kathrin Spreyer Data-Oriented Parsing, 14 June 2005 Lexical-Functional Grammar (LFG) Levels of linguistic knowledge represented formally differently (non-monostratal):
More informationCMPT-825 Natural Language Processing. Why are parsing algorithms important?
CMPT-825 Natural Language Processing Anoop Sarkar http://www.cs.sfu.ca/ anoop October 26, 2010 1/34 Why are parsing algorithms important? A linguistic theory is implemented in a formal system to generate
More informationAssociation with traces & the copy theory of movement 1
Association with traces & copy ory of movement 1 mitcho (Michael Yoshitaka ERLEWINE), MIT, Sinn und Bedeutung 18, 13 September 2013 1 Introduction Today I will discuss Association with Focus: (1) a John
More informationA Computational Approach to Minimalism
A Computational Approach to Minimalism Alain LECOMTE CLIPS-IMAG BP-53 38041 Grenoble, cedex 9, France Email: Alain.Lecomte@upmf-grenoble.fr Abstract The aim of this paper is to recast minimalist principles
More informationSharpening the empirical claims of generative syntax through formalization
Sharpening the empirical claims of generative syntax through formalization Tim Hunter University of Minnesota, Twin Cities ESSLLI, August 2015 Part 1: Grammars and cognitive hypotheses What is a grammar?
More informationAnother look at PSRs: Intermediate Structure. Starting X-bar theory
Another look at PSRs: Intermediate Structure Starting X-bar theory Andrew Carnie, 2006 Substitution Andrew Carnie, 2006 Substitution If a group of words can be replaced by a single word, they are a constituent.
More informationThe SUBTLE NL Parsing Pipeline: A Complete Parser for English Mitch Marcus University of Pennsylvania
The SUBTLE NL Parsing Pipeline: A Complete Parser for English Mitch Marcus University of Pennsylvania 1 PICTURE OF ANALYSIS PIPELINE Tokenize Maximum Entropy POS tagger MXPOST Ratnaparkhi Core Parser Collins
More informationThe same definition may henceforth be expressed as follows:
34 Executing the Fregean Program The extension of "lsit under this scheme of abbreviation is the following set X of ordered triples: X := { E D x D x D : x introduces y to z}. How is this extension
More informationSharpening the empirical claims of generative syntax through formalization
Sharpening the empirical claims of generative syntax through formalization Tim Hunter University of Minnesota, Twin Cities NASSLLI, June 2014 Part 1: Grammars and cognitive hypotheses What is a grammar?
More informationIntroduction to Semantics. Common Nouns and Adjectives in Predicate Position 1
Common Nouns and Adjectives in Predicate Position 1 (1) The Lexicon of Our System at Present a. Proper Names: [[ Barack ]] = Barack b. Intransitive Verbs: [[ smokes ]] = [ λx : x D e. IF x smokes THEN
More informationThai Wh-expressions at the Left Edge of the Clause: Contrastive and Identificational Wh-clefts
Concentric: Studies in Linguistics 33.2 (July 2007): 121-157 Thai Wh-expressions at the Left Edge of the Clause: Contrastive and Identificational Wh-clefts Sugunya Ruangjaroon Srinakharinwirot University
More informationParsing with CFGs L445 / L545 / B659. Dept. of Linguistics, Indiana University Spring Parsing with CFGs. Direction of processing
L445 / L545 / B659 Dept. of Linguistics, Indiana University Spring 2016 1 / 46 : Overview Input: a string Output: a (single) parse tree A useful step in the process of obtaining meaning We can view the
More informationParsing with CFGs. Direction of processing. Top-down. Bottom-up. Left-corner parsing. Chart parsing CYK. Earley 1 / 46.
: Overview L545 Dept. of Linguistics, Indiana University Spring 2013 Input: a string Output: a (single) parse tree A useful step in the process of obtaining meaning We can view the problem as searching
More informationStepanov 2007: The End of CED? Minimalism and Extraction Domains
Stepanov (2007) Stepanov 2007: The End of CED? Minimalism and Extraction Domains 1 Introduction In English (and other languages), overt wh-extraction out of subjects or adjuncts (as opposed to objects)
More informationSEMANTICS OF POSSESSIVE DETERMINERS STANLEY PETERS DAG WESTERSTÅHL
SEMANTICS OF POSSESSIVE DETERMINERS STANLEY PETERS DAG WESTERSTÅHL Linguistics Department, Stanford University Department of Philosophy, Göteborg University peters csli.stanford.edu, dag.westerstahl phil.gu.se
More informationUnterspezifikation in der Semantik Scope Semantics in Lexicalized Tree Adjoining Grammars
in der emantik cope emantics in Lexicalized Tree Adjoining Grammars Laura Heinrich-Heine-Universität Düsseldorf Wintersemester 2011/2012 LTAG: The Formalism (1) Tree Adjoining Grammars (TAG): Tree-rewriting
More informationMulti-Component Word Sense Disambiguation
Multi-Component Word Sense Disambiguation Massimiliano Ciaramita and Mark Johnson Brown University BLLIP: http://www.cog.brown.edu/research/nlp Ciaramita and Johnson 1 Outline Pattern classification for
More informationLogical Translations Jean Mark Gawron San Diego State University. 1 Introduction 2
Logical Translations Jean Mark Gawron San Diego State University Contents 1 Introduction 2 2 Truth-Functional Connectives 2 2.1 And................................ 2 2.2 Or.................................
More informationMinimal Recursion Semantics: an Introduction
Minimal Recursion Semantics: an Introduction (Mexico City 2000) Ann Copestake Daniel Flickinger Carl Pollard Ivan A. Sag 1/ 31 Desiderata for Computational Semantics: Expressive Adequacy: The framework
More informationNatural Language Processing CS Lecture 06. Razvan C. Bunescu School of Electrical Engineering and Computer Science
Natural Language Processing CS 6840 Lecture 06 Razvan C. Bunescu School of Electrical Engineering and Computer Science bunescu@ohio.edu Statistical Parsing Define a probabilistic model of syntax P(T S):
More information0A B 0C 0D 0E 0F A 0B 0C 0D 0E 0F
0A 0 03 06 0 0 0 01 02 03 04 05 06 07 08 09 0B 0C 0D 0E 0F 0 0 0 0 0 0 0 01 02 03 04 05 06 07 08 09 0A 0B 0C 0D 0E 0F 0 0 0 0 0 0 1 kracht.tex; 9/02/2005; 16:01; no v.; p.2 Syntactic Codes and Grammar
More informationModels of Adjunction in Minimalist Grammars
Models of Adjunction in Minimalist Grammars Thomas Graf mail@thomasgraf.net http://thomasgraf.net Stony Brook University FG 2014 August 17, 2014 The Theory-Neutral CliffsNotes Insights Several properties
More informationDependency grammar. Recurrent neural networks. Transition-based neural parsing. Word representations. Informs Models
Dependency grammar Morphology Word order Transition-based neural parsing Word representations Recurrent neural networks Informs Models Dependency grammar Morphology Word order Transition-based neural parsing
More informationCHAPTER THREE: RELATIONS AND FUNCTIONS
CHAPTER THREE: RELATIONS AND FUNCTIONS 1 Relations Intuitively, a relation is the sort of thing that either does or does not hold between certain things, e.g. the love relation holds between Kim and Sandy
More informationSemantics and Generative Grammar. A Little Bit on Adverbs and Events
A Little Bit on Adverbs and Events 1. From Adjectives to Adverbs to Events We ve just developed a theory of the semantics of adjectives, under which they denote either functions of type (intersective
More informationSentence Planning 2: Aggregation
Pipelined Microplanning Sentence Planning 2: Aggregation Lecture 10 February 28, 2012 Reading: Chapter 5, Reiter and Dale Document Plan! Lexical choice! PPSs Aggregation PPSs Referring Expression Gen!
More informationA proof theoretical account of polarity items and monotonic inference.
A proof theoretical account of polarity items and monotonic inference. Raffaella Bernardi UiL OTS, University of Utrecht e-mail: Raffaella.Bernardi@let.uu.nl Url: http://www.let.uu.nl/ Raffaella.Bernardi/personal
More informationMoreno Mitrović. The Saarland Lectures on Formal Semantics
,, 3 Moreno Mitrović The Saarland Lectures on Formal Semantics λ- λ- λ- ( λ- ) Before we move onto this, let's recall our f -notation for intransitive verbs 1/33 λ- ( λ- ) Before we move onto this, let's
More informationParsing. Based on presentations from Chris Manning s course on Statistical Parsing (Stanford)
Parsing Based on presentations from Chris Manning s course on Statistical Parsing (Stanford) S N VP V NP D N John hit the ball Levels of analysis Level Morphology/Lexical POS (morpho-synactic), WSD Elements
More informationAn introduction to German Syntax. 1. Head directionality: A major source of linguistic divergence
An introduction to German Syntax 19 January 2018 1. Head directionality: A major source of linguistic divergence In English, heads uniformly precede their complements: (1) a. [ kiss Mary] a. * [ Mary kiss]
More informationSpring 2017 Ling 620. An Introduction to the Semantics of Tense 1
1. Introducing Evaluation Times An Introduction to the Semantics of Tense 1 (1) Obvious, Fundamental Fact about Sentences of English The truth of some sentences (of English) depends upon the time they
More informationThe relation of surprisal and human processing
The relation of surprisal and human processing difficulty Information Theory Lecture Vera Demberg and Matt Crocker Information Theory Lecture, Universität des Saarlandes April 19th, 2015 Information theory
More informationFeatures of Statistical Parsers
Features of tatistical Parsers Preliminary results Mark Johnson Brown University TTI, October 2003 Joint work with Michael Collins (MIT) upported by NF grants LI 9720368 and II0095940 1 Talk outline tatistical
More informationHardegree, Formal Semantics, Handout of 8
Hardegree, Formal Semantics, Handout 2015-04-07 1 of 8 1. Bound Pronouns Consider the following example. every man's mother respects him In addition to the usual demonstrative reading of he, x { Mx R[m(x),
More informationUnification Grammars and Off-Line Parsability. Efrat Jaeger
Unification Grammars and Off-Line Parsability Efrat Jaeger October 1, 2002 Unification Grammars and Off-Line Parsability Research Thesis Submitted in partial fulfillment of the requirements for the degree
More informationCAS LX 500 Topics in Linguistics: Questions Spring 2006 March 2, b: Prosody and Japanese wh-questions
CAS LX 500 Topics in Linguistics: Questions Spring 2006 March 2, 2006 Paul Hagstrom 7b: Prosody and Japanese wh-questions Prosody by phase Prosody and syntax seem to fit quite tightly together. Ishihara,
More informationGeneralized Quantifiers & Categorial Approaches & Intensionality
LING 147. Semantics of Questions Week 2 Yimei Xiang September 8, 2016 Last week Generalized Quantifiers & Categorial Approaches & Intensionality The semantics of questions is hard to characterize directly.
More informationBringing machine learning & compositional semantics together: central concepts
Bringing machine learning & compositional semantics together: central concepts https://githubcom/cgpotts/annualreview-complearning Chris Potts Stanford Linguistics CS 244U: Natural language understanding
More informationSpring 2018 Ling 620 The Basics of Intensional Semantics, Part 1: The Motivation for Intensions and How to Formalize Them 1
The Basics of Intensional Semantics, Part 1: The Motivation for Intensions and How to Formalize Them 1 1. The Inadequacies of a Purely Extensional Semantics (1) Extensional Semantics a. The interpretation
More informationFrom Kalaallisut to English:! Analysis in CCG+UC 2"
Plan for today" From Kalaallisut to English:! Analysis in CCG+UC 2" Maria Bittner " (W2: Aug 5, 2009)" Introduction: "! syn-sem traits: English SA.SU.S vs. Kalaallisut BA.TO.L"! scope corollary" UC 1 +
More information2 A not-quite-argument for X-bar structure in noun phrases
CAS LX 321 / GRS LX 621 Syntax: Introduction to Sentential Structure ovember 16, 2017 1 and pronouns (1) he linguists yodel. (2) We linguists yodel. (3) hey looked at us linguists. (4) hey looked at linguists.
More informationTwo Reconstruction Puzzles Yael Sharvit University of Connecticut
Workshop on Direct Compositionality June 19-21, 2003 Brown University Two Reconstruction Puzzles Yael Sharvit University of Connecticut yael.sharvit@uconn.edu Some constructions exhibit what is known as
More informationSharpening the empirical claims of generative syntax through formalization
Sharpening the empirical claims of generative syntax through formalization Tim Hunter University of Minnesota, Twin Cities ESSLLI, August 2015 Part 1: Grammars and cognitive hypotheses What is a grammar?
More informationTwo Types of Remnant Movement. Gereon Müller IDS Mannheim February 10, 2001
Two Types of Remnant Movement Gereon Müller IDS Mannheim February 10, 2001 1. Introduction This article is concerned with stating and accounting for differences between two types of remnant movement that
More informationStatistiek II. John Nerbonne. February 26, Dept of Information Science based also on H.Fitz s reworking
Dept of Information Science j.nerbonne@rug.nl based also on H.Fitz s reworking February 26, 2014 Last week: one-way ANOVA generalized t-test to compare means of more than two groups example: (a) compare
More informationProseminar on Semantic Theory Fall 2015 Ling 720 Adnominal Tenses Redux: Thomas (2014) Nominal Tense and Temporal Implicatures
Adnominal Tenses Redux: Thomas (2014) Nominal Tense and Temporal Implicatures 1. Tense and Nominal Tense in Mbya: A Challenge for Tonhauser (2007) Mbya is a Guarani language of Brazil, Argentina, and Paraguay.
More information