X-bar theory


X-bar theory is one of the greatest contributions of the generative school to the field of knowledge systems. Besides linguistics, computer science is greatly indebted to Chomsky for having propounded the theory of X-bar. X-bar is a theory because it is testable and because it captures generalizations, yielding results that are systematic in nature. If you recall, we have had instances, for example in the case of c-command, domination, precedence, and even government, where we talked about the intervening nodes between two nodes. If one tries to sum up the essence of X-bar theory in one sentence, one would say that it enables generative grammar to capture, explain, and describe this intervening node by providing an intermediate node for the configurational relationships in a phrase. This explanation may not look convincing at the outset of the discussion, but we will gradually realize that what we have said about X-bar theory is very logical and explanatory in nature.

Carnie (2012: 167) gives a bibliographic note about X-bar theory. He says that the first presentation of X-bar theory appeared in Chomsky (1970); however, Jackendoff's (1977) seminal book X-bar Syntax is the source of many of the ideas surrounding it. If one wants the best account of X-bar theory, one should consult Radford's (1988) Transformational Grammar: A First Course, a textbook that presents one of the most comprehensive arguments for X-bar structure. So, as we mentioned earlier, an intermediate node X' (pronounced "X-bar") is the need of the hour in order to explain how items are conjoined in the internal structure of sentences, and this must be explained with care. In other words, not every node bears one and the same relationship in a tree diagram, and if we want to distinguish these differences, we must avoid the flat structure of a sentence, where all the attachments in a tree diagram seem to bear a similar relationship. The hierarchy in a tree diagram can be explained better if we have binary branching for the bifurcation of the nodes in the tree. The X-bar schema helps in doing so, and it is the original mechanism for indicating intermediate categories in a tree diagram.

The X-bar rules:

1. NP → (D) N'
2. N' → (AdjP) N' or N' → N' (PP)
3. N' → N (PP)
4. VP → V'
5. V' → V' (PP) or V' → V' (AdvP)
6. V' → V (NP)
7. AdvP → Adv'
8. Adv' → (AdvP) Adv'
9. Adv' → Adv (PP)
10. AdjP → Adj'
11. Adj' → (AdvP) Adj'
12. Adj' → Adj (PP)
13. PP → P'
14. P' → P' (PP) or P' → (AdvP) P'
15. P' → P (NP)

It is important to notice that all the rules in this system are formulated in the same way for every category. In every rule, there is ONE element that is a MUST; it can't be optional. For example, in the NP rule (the first rule in our list), the element that isn't optional is N' (i.e. "en-bar" or N-bar). Similarly, the only obligatory element in N' is either another N' or N. This is a very general yet important notion in X-bar structure. We call it headedness, and all phrases appear to have heads. A head is the most prominent element in a phrasal category, and it gives its grammatical category to the whole phrase. Note that we don't have any rule like the one given below:

* NP → V AdjP

This rule not only seems meaningless, it is unattested in the system we are trying to develop here. We have learned that the requirement of endocentricity demands that every phrase must have a head. Thus, the only obligatory element in a phrase is the head.
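The headedness claim can be checked mechanically. Below is a minimal Python sketch, not from the lecture itself: the tuple encoding of the fifteen rules (alternatives split out, so eighteen entries) and the category-stripping helper are my own illustrative choices. It verifies that every rule has exactly one obligatory element, and that this element shares its category with the left-hand side.

```python
# Each rule is (LHS, RHS); a bare symbol is obligatory, "(...)" marks an
# optional one. The rules are the fifteen listed above; the encoding is
# an illustrative assumption.
RULES = [
    ("NP",   ["(D)", "N'"]),
    ("N'",   ["(AdjP)", "N'"]), ("N'", ["N'", "(PP)"]), ("N'", ["N", "(PP)"]),
    ("VP",   ["V'"]),
    ("V'",   ["V'", "(PP)"]), ("V'", ["V'", "(AdvP)"]), ("V'", ["V", "(NP)"]),
    ("AdvP", ["Adv'"]),
    ("Adv'", ["(AdvP)", "Adv'"]), ("Adv'", ["Adv", "(PP)"]),
    ("AdjP", ["Adj'"]),
    ("Adj'", ["(AdvP)", "Adj'"]), ("Adj'", ["Adj", "(PP)"]),
    ("PP",   ["P'"]),
    ("P'",   ["P'", "(PP)"]), ("P'", ["(AdvP)", "P'"]), ("P'", ["P", "(NP)"]),
]

def category(label):
    """(AdjP), Adj', and Adj all reduce to the same category, Adj."""
    label = label.strip("()").rstrip("'")
    if len(label) > 1 and label.endswith("P"):
        label = label[:-1]          # drop the phrase-level P of NP, AdjP, ...
    return label

for lhs, rhs in RULES:
    obligatory = [s for s in rhs if not s.startswith("(")]
    # Endocentricity: exactly one obligatory element, of the LHS's category.
    assert len(obligatory) == 1
    assert category(obligatory[0]) == category(lhs)

print("all", len(RULES), "rules are headed")  # prints: all 18 rules are headed
```

The starred rule * NP → V AdjP would fail this check twice over: it has two obligatory elements, and neither matches the category N.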

We can condense the rules we have proposed into a simple set. To do this, we are going to make use of variables (like variables in algebra) to stand for particular parts of speech. Let X be a variable that can stand for any category: N, V, Adj, Adv, P. XP is then an umbrella term that covers all the phrases, such as NP, VP, AdjP, AdvP, and PP. Similarly, X' stands for an intermediate node that hasn't yet terminated in lexical items; these are the bar-level nodes N', V', Adj', Adv', and P'. Finally, X represents the terminal nodes, which are lexical items such as N, V, Adj, Adv, and P; they are the heads of their respective phrases. So, with this three-tiered mechanism built in, the X-bar schema is an attempt to capture the similarities among the rules (i.e. for each kind of phrase, the same kind of rules should appear).

Please note that the non-head material in all the rules is not only phrasal but also optional (with the exception of the determiner in the NP rule, for the time being). So, we never find a rule like this:

* V' → Adv V

Therefore, anything in an X-bar rule that isn't a head must be a phrase and must be optional (except, for the time being, the determiner in the NP rule). Also notice that for each major category there are three rules:

1. one that introduces the NP, VP, AdvP, AdjP, or PP;
2. one that takes a bar level and repeats it (e.g., N' → N' (PP)); and
3. one that takes a bar level and spells out the head (e.g., N' → N (PP)).

So these rules can be viewed as:

1. XP → (YP) X'
2. X' → X' (ZP) or X' → (ZP) X'
3. X' → X (WP)
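The variable substitution can be made concrete. The Python sketch below is my own illustration: it instantiates the three schematic rules for a chosen category (only the head-initial adjunct order is shown, and treating D as the YP of NP is the text's temporary simplification).

```python
# The three schematic rules; X, (YP), (ZP), (WP) are replaced per category.
SCHEMA = ["XP → (YP) X'", "X' → X' (ZP)", "X' → X (WP)"]

def instantiate(x, yp=None, zp=None, wp=None):
    """Substitute a concrete category for X; unused optional slots vanish."""
    mapping = {"XP": x + "P", "X'": x + "'", "X": x,
               "(YP)": "(%s)" % yp if yp else "",
               "(ZP)": "(%s)" % zp if zp else "",
               "(WP)": "(%s)" % wp if wp else ""}
    rules = []
    for rule in SCHEMA:
        tokens = [mapping.get(t, t) for t in rule.split()]
        rules.append(" ".join(t for t in tokens if t))
    return rules

print(instantiate("N", yp="D", zp="PP", wp="PP"))
# → ["NP → (D) N'", "N' → N' (PP)", "N' → N (PP)"]
print(instantiate("V", zp="PP", wp="NP"))
# → ["VP → V'", "V' → V' (PP)", "V' → V (NP)"]
```

Each call regenerates one column of the category-specific rules, which is exactly the sense in which the schema "captures the similarities among the rules."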

Let's now examine how we should explain the different layers of the structure of a phrase under the new schema.

1.a. The rules that spell out the head:
a) N' → N (PP)
b) V' → V (NP)
c) Adj' → Adj (PP)
d) Adv' → Adv (PP)
e) P' → P (NP)

1.b. By using the variable notation, we can generalize across these rules with the single general statement:
1.c. X' → X (WP) (to be revised)

2.a. The rules that take a bar level and repeat it:
a) N' → (AdjP) N' or N' → N' (PP)
b) V' → V' (PP) or V' → V' (AdvP)
c) Adj' → (AdvP) Adj'
d) Adv' → (AdvP) Adv'
e) P' → P' (PP) or P' → (AdvP) P'

2.b. By using the variable notation, we can generalize across these rules with the single general statement:
2.c. X' → (ZP) X' or X' → X' (ZP) (to be revised)

Finally, let's consider the rules that introduce the topmost layer of the structure of our phrase under the new schema.

3.a. a) NP → (D) N' (cf. the DP analysis: DP → D' (NP))
b) VP → V' (NP)
c) AdjP → Adj' (AdjP)
d) AdvP → Adv' (AdvP)
e) PP → P' (NP)

3.b. By using the variable notation, we can generalize across these rules with the single general statement:
3.c. XP → (YP) X' (to be revised)

Back to the trees: X-bar Theory. Consider our current NP rule: NP → (D) (AdjP+) N (PP+). This yields a flat structure in which all of the components of NP c-command each other:

[NP [D this] [AdjP [Adj big]] [N book] [PP [P of] [NP [N poems]]] [PP [P with] [NP [D the] [AdjP [Adj blue]] [N cover]]]]

X-bar Theory: NP. I bought this big book of poems with the blue cover. You bought this small one.

X-bar Theory: NP. We can substitute one for book of poems with the blue cover, which should mean that book of poems with the blue cover is a constituent; but it isn't one in our flat structure.

X-bar Theory: NP. I bought this small one with the red cover. We can also substitute one for book of poems alone, which should thus also be a constituent.

X-bar Theory: NP. This suggests a more deeply embedded structure, with (as yet unlabeled) intermediate nodes between NP and N:

[NP [D this] [? [AdjP big] [? [? [N book] [PP of poems]] [PP with the blue cover]]]]

X-bar Theory: NP. We'll call these intermediate nodes of NP N' (N-bar). Notice that you can also say I bought this one, so big book of poems with the blue cover must be a constituent (an N') as well.

X-bar Theory: NP. So, our final NP looks like this:

[NP [D this]
    [N' [AdjP [Adj big]]
        [N' [N' [N book]
                [PP [P of] [NP [N' [N poems]]]]]
            [PP [P with]
                [NP [D the]
                    [N' [AdjP [Adj blue]] [N' [N cover]]]]]]]]

X-bar Theory: NP. We need to break up our NP rule. Instead of:

NP → (D) (AdjP+) N (PP+)

we have:

NP → (D) N'
N' → AdjP N'
N' → N' PP
N' → N (PP)

Notice that these yield the same strings on the surface (note the recursion and the optionality) but produce different structures (in terms of constituency). Also notice that under the new schema, all nodes of NP have ONLY two daughters. This is a big deal, as it turns the tree into a binary-branching structure.
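As a check on the binary-branching claim, here is a small Python sketch (the nested-tuple encoding of the tree is my own, not from the lecture) that builds the final NP tree for this big book of poems with the blue cover and confirms that no node has more than two daughters:

```python
# Nodes are (label, children...); leaf words are plain strings.
tree = ("NP",
        ("D", "this"),
        ("N'",
         ("AdjP", ("Adj", "big")),
         ("N'",
          ("N'",
           ("N", "book"),
           ("PP", ("P", "of"), ("NP", ("N'", ("N", "poems"))))),
          ("PP", ("P", "with"),
           ("NP", ("D", "the"),
            ("N'", ("AdjP", ("Adj", "blue")), ("N'", ("N", "cover"))))))))

def max_branching(node):
    """Largest number of daughters at any node of the tree."""
    if isinstance(node, str):
        return 0
    kids = node[1:]
    return max([len(kids)] + [max_branching(k) for k in kids])

def leaves(node):
    """The terminal words, left to right."""
    if isinstance(node, str):
        return [node]
    return [w for k in node[1:] for w in leaves(k)]

assert max_branching(tree) == 2          # every node is at most binary
assert " ".join(leaves(tree)) == "this big book of poems with the blue cover"
print("binary branching holds")
```

The same check run on the old flat structure would report a node with five daughters (D, AdjP, N, PP, PP), which is exactly what the split rules eliminate.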

X-bar Theory: VP. The same kind of thing holds true for VP as well as NP. Instead of using one (which stands for N'), we can try doing replacements using do so, and we'll get a very similar result. Our old rule generated a flat structure for VP as well: it had all the PPs, NPs, CPs, etc. in a VP, and they all c-commanded each other.

VP → (AdvP+) V ({NP/CP}) (PP+) (AdvP+)

X-bar Theory: VP

VP → (AdvP+) V ({NP/CP}) (PP+) (AdvP+)

The chef ate the beans with a fork.
The chef quickly left after Mary did so.
The chef left quickly after Mary did so.
The chef ate the pizza with joy and Mary did so with quiet reserve.
The chef ate the pizza with joy immediately and Mary did so later.

X-bar Theory: VP. Again, it looks like we need to break our rule into parts using V' (for which do so can substitute). From:

VP → (AdvP+) V ({NP/CP}) (PP+) (AdvP+)

to:

VP → V'
V' → AdvP V'
V' → V' PP
V' → V' AdvP
V' → V ({NP/CP})

Again, this produces all the same strings on the surface but yields a different structure, and it again turns the tree into a binary-branching one.
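The do so test can likewise be read off the structure: do so replaces a V'. This Python sketch (tree encoding mine, as before) lists the yield of every V' node in the chef ate the pizza with joy, i.e. the strings that do so can stand in for:

```python
# The VP of "ate the pizza with joy" under the split V' rules.
vp = ("VP",
      ("V'",
       ("V'",
        ("V", "ate"),
        ("NP", ("D", "the"), ("N'", ("N", "pizza")))),
       ("PP", ("P", "with"), ("NP", ("N'", ("N", "joy"))))))

def leaves(node):
    """Terminal words, left to right."""
    return [node] if isinstance(node, str) else [w for k in node[1:] for w in leaves(k)]

def yields(node, label):
    """Word strings of every subtree whose root bears `label` (pre-order)."""
    if isinstance(node, str):
        return []
    here = [" ".join(leaves(node))] if node[0] == label else []
    return here + [y for k in node[1:] for y in yields(k, label)]

print(yields(vp, "V'"))
# → ['ate the pizza with joy', 'ate the pizza']
```

These are exactly the two construals of did so in the examples above: Mary did so with quiet reserve (= ate the pizza) and Mary did so later (= ate the pizza with joy).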

X-bar Theory: AdjP. We should now be growing suspicious of our other rules, now that we have had to split up NP and VP and introduce N' and V' nodes.

The governor was [AdjP very concerned about housing costs].
The director was [AdjP unusually pleased with his actors and confident of success].

This gives us evidence for:

AdjP → (AdvP) Adj'
Adj' → Adj (PP)

X-bar Theory: PP.

The Frisbee landed on the roof.
It landed right on the edge.
John knocked it right off the roof and into the trashcan.

So, this gives us (assuming right is an AdjP):

PP → (AdjP) P'
P' → P' (PP)
P' → P NP

The main idea behind X-bar theory is to explain the similarity between the rules for each category; it is an attempt to generalize over the rules we have:

PP → (AdjP) P'
P' → P' (PP)
P' → P DP

AdjP → (AdvP) Adj'
Adj' → Adj (PP)

NP → (D) N'
N' → AdjP N'
N' → N' PP
N' → N (PP)

VP → V'
V' → AdvP V'
V' → V' PP
V' → V' AdvP
V' → V ({NP/CP})

The X in X-bar is a variable over categories. When we talk of XP, we mean to be describing any kind of phrase (VP, NP, AdjP, AdvP, PP, TP, CP, ...).

All the rules have the following form:

XP → YP X'
X' → (ZP) X' or X' → X' (ZP)
X' → X (WP)

X-bar theory elevates this to a principle of phrase structure: it hypothesizes that all phrases in a syntactic tree conform to this template.

XP → (YP) X'
A phrase (XP) consists of a bar-level projection (X') and, optionally, another phrase.

X' → ZP X' or X' → X' ZP
A bar-level projection (X') can consist of another X' and another phrase (this rule is recursive).

X' → X (WP)
A bar-level projection (X') consists of a head of the same category (X) and, optionally, another phrase.

Structurally, this looks like the template below (of course, there can be any number of X' nodes; here we see three):

[XP YP [X' [X' [X' X WP] ZP] ZP]]

Different parts of this structure are given different names (and they act differently from one another, as we'll see):

The phrase immediately dominated by XP (designated YP here) is the specifier.
A phrase dominated by X' and sister to X' is an adjunct.
The phrase that is sister to the head X is the complement.
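These three structural definitions are purely configurational, so they can be computed from a tree. The Python sketch below is my own illustration: it encodes the template as nested tuples and classifies each non-head daughter by the position it occupies.

```python
# Template: [XP YP [X' [X' [X' X WP] ZP] ZP]], as (label, children...) tuples.
xp = ("XP", "YP", ("X'", ("X'", ("X'", "X", "WP"), "ZP"), "ZP"))

def label(node):
    return node if isinstance(node, str) else node[0]

def classify(tree):
    """Assign specifier/adjunct/complement by structural position."""
    roles = {}
    def walk(node):
        if isinstance(node, str):
            return
        kids = list(node[1:])
        for i, k in enumerate(kids):
            sisters = kids[:i] + kids[i + 1:]
            if label(node) == "XP" and label(k) != "X'":
                roles.setdefault("specifier", []).append(label(k))   # daughter of XP
            elif label(node) == "X'" and label(k) != "X'" and any(label(s) == "X'" for s in sisters):
                roles.setdefault("adjunct", []).append(label(k))     # sister of X'
            elif label(node) == "X'" and label(k) != "X" and any(label(s) == "X" for s in sisters):
                roles.setdefault("complement", []).append(label(k))  # sister of the head X
        for k in kids:
            walk(k)
    walk(tree)
    return roles

print(classify(xp))
# → {'specifier': ['YP'], 'adjunct': ['ZP', 'ZP'], 'complement': ['WP']}
```

Note that the two ZPs both come out as adjuncts (adjunction is recursive, so there can be any number of them), while the specifier and the complement slots each occur once, just as the template predicts.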