Ch. 2: Phrase Structure

Syntactic Structure (basic concepts)

A tree diagram marks constituents hierarchically.


For example, the sentence "Ali will help the man" has the structure:

[S [NP Ali] [AUX will] [VP [V help] [NP [D the] [N man]]]]

A node is any point in the tree diagram, and it can be:
- a branching node, like S and the lower NP;
- a non-branching node, like AUX and V;
- a terminal node: a lexical item at the end of the tree, like help.

Nodes are related to each other by two relations:

1. Dominance: a node X dominates a node Y if X is higher than Y and X is connected to Y by a branch. For example, NP dominates Ali, and VP dominates D and NP.
Immediate dominance: a node immediately dominates another if there is no intervening node, e.g. S immediately dominates NP, AUX, and VP, but not help.
2. Precedence: a node X precedes Y if X is to the left of Y and neither dominates the other, e.g. Ali precedes will, but the object NP does not precede man, since it dominates man.
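These relations are purely configurational, so they can be checked mechanically on a tree. The following is a minimal Python sketch (an illustration only, not part of the lecture; the Node class and every name in it are my own) that encodes the tree for "Ali will help the man" and tests dominance, immediate dominance, and precedence:

```python
class Node:
    def __init__(self, label, children=()):
        self.label = label
        self.children = list(children)

    def dominates(self, other):
        # X dominates Y: X is higher than Y and connected to Y by branches.
        return any(c is other or c.dominates(other) for c in self.children)

    def immediately_dominates(self, other):
        # Immediate dominance: Y is a direct child of X, no intervening node.
        return any(c is other for c in self.children)

def precedes(root, x, y):
    # X precedes Y: X is to the left of Y and neither dominates the other.
    if x.dominates(y) or y.dominates(x):
        return False
    order = []
    def walk(n):                 # left-to-right (preorder) traversal
        order.append(n)
        for c in n.children:
            walk(c)
    walk(root)
    return order.index(x) < order.index(y)

# The tree for "Ali will help the man"
ali, will = Node("Ali"), Node("will")
help_, the, man = Node("help"), Node("the"), Node("man")
obj_np = Node("NP", [Node("D", [the]), Node("N", [man])])
vp = Node("VP", [Node("V", [help_]), obj_np])
s = Node("S", [Node("NP", [ali]), Node("AUX", [will]), vp])

print(s.dominates(help_))            # True
print(s.immediately_dominates(vp))   # True: VP is a direct child of S
print(precedes(s, ali, will))        # True
print(precedes(s, obj_np, man))      # False: the object NP dominates man
```

Since dominance blocks precedence, the last call returns False even though the object NP's material is adjacent to man in the string.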

Phrase Structure Rules

These rules derive different types of phrases and an unlimited number of sentences:

NP → (D) (AdjP) N

But rules of this kind do not predict the internal structure of phrases in general: we would need to memorize an unlimited number of them, one per phrase type.

The Structure of Phrases

1. VP consists of:
- a lexical category: the head V;
- a phrasal category, or maximal projection: the VP as a whole;
- an intermediate category: V′ (part of VP).
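The optional slots in a rule like NP → (D) (AdjP) N already license several surface shapes from a single rule. A small sketch of that optionality (illustrative only; the toy lexicon and the function name are my own assumptions):

```python
from itertools import product

def expand_np(d_options=("the",), adjp_options=("big",), n_options=("book",)):
    # Enumerate the strings licensed by NP -> (D) (AdjP) N:
    # each parenthesized slot is independently present or absent.
    shapes = []
    for use_d, use_adj in product([False, True], repeat=2):
        for d, a, n in product(d_options, adjp_options, n_options):
            words = ([d] if use_d else []) + ([a] if use_adj else []) + [n]
            shapes.append(" ".join(words))
    return sorted(set(shapes))

print(expand_np())
# ['big book', 'book', 'the big book', 'the book']
```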

V′ is a level that contains the object and the verb's modifiers:
- The head verb (and its object, if there is one) is in the lowest V′ level.
- The verb's modifiers, called adjuncts, are placed in higher V′ levels.
- The subject combines with the highest V′ level, i.e. it is the specifier of VP.

For example, they eat lunch in school tonight:

[VP they [V′ [V′ [V′ [V eat] [NP lunch]] [PP in school]] [NP tonight]]]

Evidence for V′

V′ is a constituent, since it can be replaced by do so:
- They eat lunch in school tonight and Ali does so. (eat lunch in school tonight)
- They eat lunch in school tonight and Ali did so this morning. (eat lunch in school)
- They eat lunch in school tonight and Ali does so at work this morning. (eat lunch)

Adjuncts are recursive, i.e. they can be added repeatedly.

2. NP

N′ is a level that contains the object and the noun's modifiers:
- The head noun (and its object, if there is one) is in the lowest N′ level.
- The noun's modifiers, called adjuncts, are placed in higher N′ levels.
- The determiner combines with the highest N′, i.e. it is the specifier of NP.

For example, the big book of poems with the blue cover:

[NP [Det the] [N′1 [AP big] [N′2 [N′3 [N book] [PP of poems]] [PP with the blue cover]]]]

Evidence for N′

N′ is a constituent, since it can be replaced by one:
- I want this [big book of poems with the blue cover], not that one. (N′1)
- I want this big [book of poems with the blue cover], not that small one. (N′2)
- I want this big [book of poems] with the blue cover, not that small one with the red cover. (N′3)

3. AdjP & PP

We apply the same structure to these phrases:

He is [quite jealous of Ali]. He stood [right across the bridge].

The lowest P′ and Adj′ levels contain these heads (and their complements). The spec combines with the highest P′ or Adj′ and is filled by modifiers (like quite, very, rather, so for the adjective, and straight, right for the preposition).

General structure (X-bar)

We can state general rules that exactly predict the structure of the different types of phrases:

XP → spec X′ (specifier)
X′ → X′ YP (adjunct = modifier)
X′ → X YP (head and complement)
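The X-bar schema can be written down as ordinary context-free rules, one instantiation per head category. A small sketch (illustrative only; `xbar_rules` is my own name, and YP stands in for an arbitrary complement or adjunct phrase):

```python
def xbar_rules(X):
    # Instantiate the X-bar schema for a head category X (e.g. "V", "N").
    # Each rule is a (left-hand side, right-hand side) pair.
    return [
        (f"{X}P", ["spec", f"{X}'"]),   # specifier rule
        (f"{X}'", [f"{X}'", "YP"]),     # adjunct rule (recursive)
        (f"{X}'", [X, "YP"]),           # head and complement rule
    ]

for lhs, rhs in xbar_rules("V"):
    print(lhs, "->", " ".join(rhs))
# VP -> spec V'
# V' -> V' YP
# V' -> V YP
```

The recursion in the adjunct rule (X′ on both sides) is what makes adjuncts repeatable, matching the do so and one evidence above.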

Sentence structure

Aux is the head of the sentence, because it carries tense and agreement: They are working hard. But what about sentences with no auxiliary verb, like They worked hard? Aux is still the head, even though it isn't overt.

Evidence for Aux as head of the sentence:
1. Cleft sentence: Work hard, they did indeed. The tense is on Aux and is not part of the VP.

2. Pseudo-cleft: What they did was work hard.

Tense is part of a node, Infl(ection), which can be filled with an overt auxiliary or left empty:

[S NP [I +tense, +agr] VP]

I is finite because it has (+t, +agr).

Infinitive clause: I ask [Ali to work hard]. To is the head of the infinitival clause, and it is (-t, -agr): a non-finite I.

The structure of IP

[IP NP [I′ [I will / -ed / to] [VP [V finish] [NP the work]]]]

IP → spec I′
I′ → I VP

VP is always the complement of I.

IP is a functional category, not a lexical category, because it serves a grammatical function: tense and agreement.

Complementizer phrase (CP)

CP is another functional category, since its head, C, introduces a subordinate clause:
C = that, for [-wh]; if, whether [+wh]

I believe [that Ali will work hard].
I want [for Ali to work hard].
I wonder [if/whether Ali worked hard].

The above CP clauses have the following structure:

[CP spec [C′ C [IP Ali will work hard]]]

- IP is always the complement of C.
- C is filled by that or for, or by moved will, forming a yes/no question:

[CP spec [C′ [C will, +wh] [IP [NP Ali] [I′ I [VP work hard]]]]]

- C can be filled by either will or that, not both.
- Spec is filled by whether, if, and wh-phrases.

Structure of CP:

CP → Spec C′
C′ → C IP

For example, When will Ali work hard?:

[CP when [C′ [C will] [IP [NP Ali] [I′ I [VP work hard]]]]]

Structural relations

Government: a head governs its phrase, e.g. N heads the NP, I the IP, and C the CP. Agreement between a head and a non-head is established under government:

1. NP agreement

[NP [Det this/these] [N′ [N car/cars] [PP of Ali]]]

There is agreement in number between the head (N) and the spec (Det). But agreement is poor in English, since it is shown morphologically in number but not in gender and person, as it is in French.

2. Subject/verb agreement

There is spec-head agreement between Infl and its spec (the subject) in number and person:

[IP he/they [I′ [I -s / ø] [VP play]]]

3. CP agreement

There is spec-head agreement in CP in the [wh] feature: I wonder if he played.

[CP spec[+wh] [C′ [C if, +wh] [IP he played]]]

C-command

A c-commands B if and only if:
i. A doesn't dominate B and B doesn't dominate A; and
ii. the first branching node dominating A dominates B.

For instance, in the IP tree above, the spec c-commands every node in IP, but VP c-commands only I.

Let us now refine the notion of government:

Government: A governs B if and only if A is a governor (i.e. a head) and A c-commands B.

Strict c-command & m-command

1. Strict c-command

[VP [V′ [V ate] [NP the food]] [PP [P in] [NP the garden]]]

V strictly c-commands NP (the food), because they are dominated by the same first branching node, but V does not strictly c-command PP.

2. M-command

The head V does, however, m-command the adjunct in the garden, since there is a node, VP, that dominates both of them.

A m-commands B if and only if:
i. A doesn't dominate B and B doesn't dominate A; and
ii. every maximal projection dominating A dominates B.

- V c-commands NP (the food); V doesn't c-command PP (dominance by the first branching node).
- V m-commands NP, PP, and the NP inside PP (dominance by maximal projection).
- P doesn't m-command V (PP dominates P but doesn't dominate V).

Government is refined as: A governs B if and only if:
i. A is a governor; and
ii. A m-commands B; and
iii. there is no barrier between A and B.

In the VP above, the verb ate governs the PP, but it doesn't govern the NP inside the PP, since the PP is a barrier. The verb does, however, m-command that NP.
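The chain of definitions (c-command via the first branching node, m-command via maximal projections, government via m-command plus barriers) can be checked mechanically on the ate the food in the garden VP. A minimal sketch, with all class and function names my own, and with "barrier" simplified to "a maximal projection dominating B but not A":

```python
class Node:
    def __init__(self, label, children=(), maximal=False):
        self.label = label
        self.children = list(children)
        self.maximal = maximal          # True for XP-level nodes
        self.parent = None
        for c in self.children:
            c.parent = self

    def dominates(self, other):
        return any(c is other or c.dominates(other) for c in self.children)

def first_branching_ancestor(n):
    a = n.parent
    while a is not None and len(a.children) < 2:
        a = a.parent
    return a

def c_commands(a, b):
    # (i) neither dominates the other; (ii) the first branching node
    # dominating A also dominates B.
    if a.dominates(b) or b.dominates(a):
        return False
    fb = first_branching_ancestor(a)
    return fb is not None and fb.dominates(b)

def m_commands(a, b):
    # Like c-command, but every MAXIMAL PROJECTION dominating A
    # must dominate B.
    if a.dominates(b) or b.dominates(a):
        return False
    anc = a.parent
    while anc is not None:
        if anc.maximal and not anc.dominates(b):
            return False
        anc = anc.parent
    return True

def governs(a, b):
    # A governs B: A m-commands B (A is assumed to be a head) and no
    # barrier -- a maximal projection dominating B but not A -- intervenes.
    if not m_commands(a, b):
        return False
    anc = b.parent
    while anc is not None:
        if anc.maximal and not anc.dominates(a):
            return False                # barrier found
        anc = anc.parent
    return True

# [VP [V' [V ate] [NP the food]] [PP [P in] [NP the garden]]]
v = Node("V:ate")
np_food = Node("NP:the-food", maximal=True)
p = Node("P:in")
np_garden = Node("NP:the-garden", maximal=True)
vbar = Node("V'", [v, np_food])
pp = Node("PP", [p, np_garden], maximal=True)
vp = Node("VP", [vbar, pp], maximal=True)

print(c_commands(v, np_food))   # True: sisters under V'
print(c_commands(v, pp))        # False: V' doesn't dominate PP
print(m_commands(v, pp))        # True: VP dominates both
print(m_commands(p, v))         # False: PP dominates P but not V
print(governs(v, pp))           # True: no barrier intervenes
print(governs(v, np_garden))    # False: PP is a barrier
```

Each printed result matches the corresponding claim in the text: V strictly c-commands only its sister NP, m-commands everything inside VP, and governs the PP but not the NP inside it.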