Structures mathématiques du langage


Alain Lecomte
February 16, 2014

1 Heim and Kratzer's theory

Montague's grammar was conceived and built during the sixties of the last century, without much attention paid to work in the generative paradigm. Important syntactic notions like projection, movement or control were ignored by the logician and philosopher, and many similar phenomena (like extraposition and questions, or quantifiers in subject and object position) were treated in a non-uniform way. Heim and Kratzer, as generativists, have tried to overcome these difficulties. They have, moreover, tried to give semantic interpretations in terms of truth values independently of any intermediary formal language, aiming to execute the Fregean program, as they declare in the second chapter of their book.

1.1 Interpreting derivation trees

As we know, a first-order language L is interpreted relative to some structure M = ⟨D, I⟩, where D is a non-empty set (the domain, or universe, of the interpretation) and I is an interpretation function defined at the beginning only on the constants of L, and then extended to terms and formulae by means of a recursive procedure. In this framework, variables are interpreted by means of assignments, and the truth value of quantified formulae is obtained by making assignments vary. An assignment may be viewed either as a partial function defined on Var (the denumerable set of variables of L) with values in D, or as a possibly infinite list of elements of D, (a_1, a_2, ..., a_n, ...), where for each i, a_i is the value assigned by the assignment to the i-th variable of L (with some a_i possibly replaced by ⊥, for "undefined", when no value is assigned to the i-th variable). Natural language is supposed to be interpreted similarly by Heim and Kratzer.
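As a concrete, purely illustrative rendering of this first-order setup, a finite structure and its assignments can be coded directly; the domain, the predicate symbol "P", and all function names below are invented for the sketch:

```python
# A toy first-order structure M = <D, I>: D is a finite set and I
# interprets a one-place predicate symbol "P". Assignments are partial
# functions from variable indices to D, coded as Python dicts.

D = {1, 2, 3}
I = {"P": {2, 3}}          # I("P") = the set of elements satisfying P

def sat_P(i, g):
    """Truth value of the atom P(x_i) under assignment g."""
    return g[i] in I["P"]

def sat_exists(i, formula, g):
    """[[exists x_i . formula]]^g: obtained by making the assignment vary at i."""
    return any(formula(i, {**g, i: a}) for a in D)
```

Quantification is evaluated exactly as the text describes: the assignment is varied at the quantified index, and the body is re-evaluated.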
What is required for that is simply a universe D and an interpretation function I which sends constant words to appropriate elements and sets definable from D and the set of truth values {0, 1}. For instance, we shall assume:
- for every proper noun c, I(c) is an element of D,
- for every intransitive verb V, I(V) is a function f_V from D to {0, 1},
- for every common noun N, I(N) is a function f_N from D to {0, 1},
- for every transitive verb V, I(V) is a function f_V from D to the set of functions from D to {0, 1}.
As an example, we may suppose D = {Ann, Mary, Paul, Ibrahim} and I defined as:

I(Ann) = Ann, I(Mary) = Mary, I(Paul) = Paul, I(Ibrahim) = Ibrahim

I(smokes) = the function f : D → {0, 1} such that f(Ann) = 0, f(Mary) = 0, f(Paul) = 1, f(Ibrahim) = 1

I(kisses) = the function f : D → (D → {0, 1}) such that
  f(Ann) = the function f_Ann : D → {0, 1} such that f_Ann(Ann) = 0, f_Ann(Mary) = 0, f_Ann(Paul) = 1, f_Ann(Ibrahim) = 0
  f(Mary) = the function f_Mary : D → {0, 1} such that f_Mary(Ann) = 0, f_Mary(Mary) = 0, f_Mary(Paul) = 0, f_Mary(Ibrahim) = 1
  f(Paul) = the function f_Paul : D → {0, 1} such that f_Paul(Ann) = 1, f_Paul(Mary) = 0, f_Paul(Paul) = 0, f_Paul(Ibrahim) = 0
  f(Ibrahim) = the function f_Ibrahim : D → {0, 1} such that f_Ibrahim(Ann) = 0, f_Ibrahim(Mary) = 1, f_Ibrahim(Paul) = 0, f_Ibrahim(Ibrahim) = 0

Then, rules may be interpreted in order to define the denotation of a sentence in a small language defined on this lexicon. Let us assume the following grammar rules:

rule 1 : S → NP VP
rule 2 : NP → N
rule 3 : VP → VI
rule 4 : VP → VT NP
rule 5 : VI → smokes
rule 6 : VT → kisses
rule 7 : N → Ann | Mary | Paul | Ibrahim

Given a derivation tree τ in such a grammar, the yield of which is a sequence of words σ, we may interpret τ according to the following rules.
- rule 1 : if τ has a root labelled by S, and two branches α, the root of which is labelled by NP, and β, the root of which is labelled by VP, then I(τ) = I(β)(I(α))
- rule 2 : if τ has a root labelled by NP, and one branch α, the root of which is labelled by N, then I(τ) = I(α)
- rule 3 : if τ has a root labelled by VP, and one branch α, the root of which is labelled by VI, then I(τ) = I(α)
- rule 4 : if τ has a root labelled by VP, and two branches α, the root of which is labelled by VT, and β, the root of which is labelled by NP, then I(τ) = I(α)(I(β))
- rule 5 : if τ has a root labelled by VI, and one branch α, simply labelled by smokes, then I(τ) = I(smokes)
- rule 6 : if τ has a root labelled by VT, and one branch α, simply labelled by kisses, then I(τ) = I(kisses)
- rule 7 : if τ has a root labelled by N, and one branch α, simply labelled by Ann, Mary, Paul or Ibrahim, then I(τ) = I(Ann), I(Mary), I(Paul) or I(Ibrahim)
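The lexicon and the seven rules can be sketched as a recursive interpreter in Python. The encoding choices here (derivation trees as nested tuples, "kisses" curried object-first so that I(kisses)(Paul) = f_Paul) are mine, not the text's:

```python
# The toy model and the seven interpretation rules, as a sketch.
D = {"Ann", "Mary", "Paul", "Ibrahim"}
SMOKES = {"Ann": 0, "Mary": 0, "Paul": 1, "Ibrahim": 1}
KISSES = {                       # KISSES[x][y] = f_x(y): 1 iff y kisses x
    "Ann":     {"Ann": 0, "Mary": 0, "Paul": 1, "Ibrahim": 0},
    "Mary":    {"Ann": 0, "Mary": 0, "Paul": 0, "Ibrahim": 1},
    "Paul":    {"Ann": 1, "Mary": 0, "Paul": 0, "Ibrahim": 0},
    "Ibrahim": {"Ann": 0, "Mary": 1, "Paul": 0, "Ibrahim": 0},
}
I = {
    **{name: name for name in D},                    # proper nouns
    "smokes": lambda y: SMOKES[y],                   # D -> {0, 1}
    "kisses": lambda x: (lambda y: KISSES[x][y]),    # D -> (D -> {0, 1})
}

def interpret(tree):
    """Interpret a derivation tree written as (label, daughter, ...)."""
    label, *kids = tree
    if len(kids) == 1 and isinstance(kids[0], str):  # rules 5-7: preterminals
        return I[kids[0]]
    if label == "S":                                 # rule 1: I(VP)(I(NP))
        np, vp = kids
        return interpret(vp)(interpret(np))
    if label == "VP" and len(kids) == 2:             # rule 4: I(VT)(I(NP))
        vt, np = kids
        return interpret(vt)(interpret(np))
    return interpret(kids[0])                        # rules 2-3: unary nodes
```

For the tree of Ann kisses Paul this computes f_Paul(Ann) = 1, matching the worked example below.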

Let us consider the sentence:

Example 1  Ann kisses Paul

Its derivation tree τ is:

[S [NP [N Ann]] [VP [VT kisses] [NP [N Paul]]]]

The interpretation of the tree is given by:

I(τ) = I([VP [VT kisses] [NP [N Paul]]])(I([NP [N Ann]]))

and we have:

I([NP [N Ann]]) = I(Ann) = Ann
I([VP [VT kisses] [NP [N Paul]]]) = I([VT kisses])(I([NP [N Paul]])) = I(kisses)(I(Paul))

Applying the definition of I as defined on the constants of the language, we get I(kisses)(Paul) = f_Paul, and therefore:

I(τ) = [I(kisses)(I(Paul))](Ann) = f_Paul(Ann) = 1

Let us notice here that the interpretation of a transitive verb in this grammar associates to each individual x ∈ D a function f_x from D to {0, 1} such that, for each individual y ∈ D, f_x(y) is equal to 1 if and only if x is the object of the verb, y its subject, and the relation denoted by the transitive verb holds between them. Heim and Kratzer extend these notions and denote by [[.]] the interpretation function that we can compute on each expression of the grammar (but we shall still sometimes use I as a notation for this

function). An easier definition of the procedure amounts to defining it not on trees but on nodes: the interpretation of a node is then the previous interpretation of the (sub)tree of which it is the root. With this change of perspective, we can define the interpretation of any node, starting from the terminal nodes, and require (Interpretability Condition) that all nodes in a phrase structure tree be in the domain of the interpretation function [[.]]:

1. Terminal nodes : If α is a terminal node, then α belongs to the domain of [[.]] if [[α]] is given by the lexicon.
2. Non-branching nodes : If α is a non-branching node and β is its daughter, then α belongs to the domain of [[.]] if β belongs to it, and then [[α]] = [[β]].
3. Branching nodes : If α is a branching node and β and γ are its daughters, then α belongs to the domain of [[.]] if β and γ belong to it and [[β]] is a function defined on [[γ]]. In this case, [[α]] = [[β]]([[γ]]).

In the previous example, for instance, let us take the node α labelled by the symbol VP: it is a branching node. β (labelled by VT) and γ (labelled by NP) are its daughters, and it happens that [[β]] is defined on [[γ]]; therefore [[α]] is defined and [[α]] = [[β]]([[γ]]).

Heim and Kratzer also make use of the notion of type. Thus, a necessary condition for [[β]] to be a function defined on [[γ]] is that [[β]] be of type a → b and [[γ]] of type a, but cases may exist where this condition is not sufficient. Let us call interpretability condition the set of conditions expressed above. Of course, the fact that [[β]] (for instance in the case of a transitive verb) is a function defined on [[γ]] may seem reminiscent of the θ-criterion, that is, the fact that:

(θ-criterion) : every argument receives one and only one θ-role, and each θ-role is assigned to one and only one element.

Let us see nevertheless that the interpretability condition may sometimes be satisfied when the θ-criterion is not.
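The three clauses can be sketched as one recursive function. In this sketch (all names invented), denotations of functional type are Python dicts, so the test "[[β]] is a function defined on [[γ]]" becomes a concrete membership check, and a violation of the Interpretability Condition surfaces as an error:

```python
# Node-based interpretation: terminal / non-branching / branching clauses.
LEXICON = {
    "Ann": "Ann",
    "Paul": "Paul",
    "smokes": {"Ann": 0, "Paul": 1},   # a denotation of type e -> t
}

def denot(node):
    """node: a word (terminal node) or a tuple of daughter nodes."""
    if isinstance(node, str):                         # 1. terminal nodes
        return LEXICON[node]
    if len(node) == 1:                                # 2. non-branching nodes
        return denot(node[0])
    beta, gamma = denot(node[0]), denot(node[1])      # 3. branching nodes
    if isinstance(beta, dict) and gamma in beta:
        return beta[gamma]                            # [[a]] = [[b]]([[g]])
    if isinstance(gamma, dict) and beta in gamma:
        return gamma[beta]                            # the other daughter applies
    raise TypeError("Interpretability Condition violated")
```

Note that the sketch tries application in both directions, since in a tree either daughter may be the functional one.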
For instance, it is often admitted that a common noun assigns a θ-role; this is what happens in a sentence like Paul is a student, where Paul receives the θ-role assigned by student. The sentence also satisfies the interpretability condition, since a common noun is given the type e → t, and Paul is supposed to have the type e. But in the case of an NP like the student, the θ-role is not assigned whereas the condition is still satisfied: this time, it is the definite article the which behaves as a function and absorbs the demand for an argument. In fact, the is of type (e → t) → e, so that the student is again of type e, just as any proper name (for the time being)¹.

1.2 Predicate modification

As their name indicates, modifiers modify the items to which they apply. Nevertheless, application in this context must a priori be understood differently from the standard application of the semantics of one expression (say a verb phrase, for instance) to the semantics of another one (say its subject, for instance). The modifier in blue may modify the predicate man in such a way that man becomes man in blue, but the two expressions have the same type (they are both of type e → t). It seems natural to assign type e → t also to the modifier in blue, since it can also work as a predicate. There are therefore cases where a node α is a branching node, but neither of its daughters can apply to the other in the regular way (satisfying the interpretability condition). This requires a new semantic rule, which Heim and Kratzer call a composition rule (even if it does not correspond to what we usually call composition

1. The function associated with the definite article is called a choice function, since it must select an individual from a set under certain conditions.

for two functions f and g). Such a rule takes two functions as inputs and returns a new function as output. For instance, the two inputs are:

f_man : the function D → {0, 1} such that for all i ∈ D, f_man(i) = 1 if and only if i is a man
f_in blue : the function D → {0, 1} such that for all i ∈ D, f_in blue(i) = 1 if and only if i is in blue

and the output is:

f_man in blue : the function D → {0, 1} such that for all i ∈ D, f_man in blue(i) = 1 if and only if i is a man in blue

The easiest way to express this transformation rests on the use of λ-functions. Let us denote the functions f_man and f_in blue by λ-expressions:

f_man : λx ∈ D. x is a man
f_in blue : λx ∈ D. x is in blue

The resulting function f_man in blue is equal to λx ∈ D. (x is a man) ∧ (x is in blue), or also: λx ∈ D. ((x is a man) = (x is in blue) = true). Finally we get, as a new semantic rule, Predicate Modification:

Predicate Modification : If α is a branching node and β and γ are its daughters, and [[β]] and [[γ]] are both of type e → t, then [[α]] = λx ∈ D_e. ([[β]](x) = [[γ]](x) = 1).

1.3 Variables and binding

In the previous section we had no need for variables, since we only elaborated on phrase structure rules (or merge operations); things change as soon as we introduce displacements of constituents, and therefore traces. A relative clause like whom Mary kisses, for instance, is obtained after a Move operation which displaces the object of kisses and leaves a trace behind. In an NP like the man whom Mary kisses, man and whom Mary kisses are two properties which are combined exactly as in Predicate Modification: whom Mary kisses is a modifier. Its denotation is the function:

[[whom Mary kisses]] = λx ∈ D. Mary kisses x    (1)

Sometimes, following Heim & Kratzer, we shall denote this function by λx ∈ D. (Mary kisses x = 1), but the reader knows that for a boolean variable X there is no distinction between the evaluations of X and of X = 1 (X = 1 is true if and only if X is true!). It remains to know how the variable x is introduced.
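Before turning to traces, the Predicate Modification rule stated above admits a direct sketch; the toy characteristic functions man and in_blue are invented for illustration:

```python
# Predicate Modification: two daughters of type e -> t are conjoined
# into a new characteristic function of the same type.
man     = lambda x: 1 if x in ("Paul", "Ibrahim") else 0   # invented e -> t predicate
in_blue = lambda x: 1 if x in ("Ann", "Paul") else 0       # invented e -> t predicate

def predicate_modification(f, g):
    """[[a]] = lambda x. (f(x) = g(x) = 1)."""
    return lambda x: 1 if f(x) == 1 and g(x) == 1 else 0

man_in_blue = predicate_modification(man, in_blue)
```

The output has type e → t again, which is why a modified predicate can itself be modified further.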
In fact the two questions, the syntactic one concerning Move and the semantic one concerning the introduction of a variable x, amount to the same mechanism. Let us assume the following syntactic analysis of (1):

[CP [COMP whom_1] [S [NP Mary] [VP [VT kisses] t_1]]]

The point here is that we will translate the trace by a variable. It remains to interpret a node marked by a trace. As for the interpretation of formulae of First Order Logic, this requires the use of assignments. From

now on, trees (and nodes) will be interpreted relative not only to a structure M = ⟨D, I⟩, but also to an assignment g : N → D, and we will have, for any trace t_i (indexed by an integer i):

[[t_i]]^{M,g} = g(i)

It results that the interpretation relative to M, g of the tree representing Mary kisses t_1 (which we will denote τ_{Mary kisses t_1}) is calculated in the following way:

I([S [NP Mary] [VP [VT kisses] t_1]])^{M,g} = I([VP [VT kisses] t_1])^{M,g}(I(Mary)^{M,g})

with

I([VP [VT kisses] t_1])^{M,g} = I(kisses)^{M,g}(I(t_1)^{M,g}) = [λx.λy. y kisses x](g(1)) = λy. y kisses g(1)

so that, finally, I(τ_{Mary kisses t_1})^{M,g} = [λy. y kisses g(1)](Mary) = Mary kisses g(1), which clearly depends on the assignment g.

The next step consists in defining the role of whom. Heim and Kratzer introduce a new semantic rule, Predicate Abstraction:

Predicate Abstraction : If α is a branching node whose daughters are a relative pronoun and β(t_i), then [[α]]^M = λx ∈ D. [[β(t_i)]]^{M, g[i:=x]}

In this rule, β(t_i) designates a tree β containing a trace t_i, and the notation [[.]]^{M, g[i:=x]} means an interpretation relative to the structure M and the assignment g modified at i so as to assign the value x to the integer i. By using this rule we can compute the denotation of whom Mary kisses:

I(τ_{whom Mary kisses t_1})^{M,g} = λx ∈ D. I(τ_{Mary kisses t_1})^{M, g[1:=x]} = λx ∈ D. [Mary kisses g(1)]^{M, g[1:=x]} = λx ∈ D. Mary kisses x

At the last line, we get rid of g because the expression obtained no longer depends on it, and we get rid of M since all the constants which have to be interpreted with regard to I are already interpreted (Mary by the individual Mary, and kisses by the function λx ∈ D. λy ∈ D. y kisses x). As we can see, the resulting form is independent of any assignment: we can say that it is closed. In the last step, we can interpret the expression man whom Mary kisses by Predicate Modification, thus obtaining:

λx ∈ D. (x is a man) & (Mary kisses x)    (2)

If moreover we interpret the definite article the, in the standard way, as a function from sets of individuals to individuals:

I(the) = the function which associates to any subset E of D the unique element of E if E is a singleton,

and is undefined for other subsets. Assuming that a function like (2) is the characteristic function of some set, so that we may also interpret man whom Mary kisses as a set (the subset of D whose elements are precisely the men whom Mary kisses, and which is a mere singleton), by Functional Application we obtain, for the expression the man whom Mary kisses, the denotation:

I(the man whom Mary kisses) =
- the unique element m such that m ∈ D, m is a man and Mary kisses m, if there is one and only one such individual,
- undefined otherwise.

Let us recall that, in a Montagovian setting, the translation of this nominal phrase, using Russell's operator ι, is:

ιx. man(x) ∧ kisses(Mary, x)    (3)

Let us now summarize Heim and Kratzer's rules for interpretation:

1. Terminal nodes : If α is a terminal node, then α belongs to the domain of [[.]] if [[α]] is given by the lexicon.
2. Non-branching nodes : If α is a non-branching node and β is its daughter, then α belongs to the domain of [[.]] if β belongs to it, and then [[α]] = [[β]].
3. Functional Application : If α is a branching node and β and γ are its daughters, then α belongs to the domain of [[.]] if β and γ belong to it and [[β]] is a function defined on [[γ]]. In this case, [[α]] = [[β]]([[γ]]).
4. Predicate Modification : If α is a branching node and β and γ are its daughters, and [[β]] and [[γ]] are both of type e → t, then [[α]] = λx ∈ D_e. ([[β]](x) = [[γ]](x) = 1).
5. Predicate Abstraction : If α is a branching node whose daughters are a relative pronoun and β(t_i), then [[α]]^M = λx ∈ D. [[β(t_i)]]^{M, g[i:=x]}.

This allows some precise semantic definitions concerning Binding Theory. Of course, treating traces as variables is coherent with the following sense we give to the notion of variable in linguistic theory:

Definition 1 (Variables and binders)
1. A terminal symbol α is a variable if and only if there are at least two assignments a and a′ such that [[α]]^a ≠ [[α]]^{a′}.
2.
If α is a pronoun or a trace, and a is an assignment defined on i, then [[α_i]]^a = a(i).
3. A terminal node α is a constant if and only if for any two assignments a and a′, [[α]]^a = [[α]]^{a′}.
4. An expression α is a variable binder in a language L if and only if there are trees β and assignments a such that:
(a) β is not in the domain of [[.]]^a,
(b) there is a tree whose immediate constituents are α and β, and this tree belongs to the domain of [[.]]^a.
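Clause 2 above (a trace simply denotes what the assignment provides) and the Predicate Abstraction rule can be sketched concretely. The encoding of assignments as dicts and the toy KISS relation are my own; the kissing facts follow the model given earlier (Ann kisses Paul, Mary kisses Ibrahim):

```python
# Traces, assignments, and Predicate Abstraction over the toy model.
KISS = {("Ann", "Paul"), ("Mary", "Ibrahim")}     # (subject, object) pairs
kisses = lambda x: (lambda y: 1 if (y, x) in KISS else 0)

def trace(i):
    """[[t_i]]^{M,g} = g(i)."""
    return lambda g: g[i]

def mary_kisses_t1(g):
    """[[Mary kisses t_1]]^{M,g}: its value depends on the assignment g."""
    return kisses(trace(1)(g))("Mary")

def predicate_abstraction(body, i):
    """[[whom_i b(t_i)]]^{M,g} = lambda x. [[b(t_i)]]^{M, g[i:=x]}."""
    return lambda g: (lambda x: body({**g, i: x}))

# The result is closed: the initial assignment (here empty) is irrelevant.
whom_mary_kisses = predicate_abstraction(mary_kisses_t1, 1)({})
```

Passing an assignment that is undefined on 1 to mary_kisses_t1 raises a KeyError, mirroring the undefinedness used in Definition 1 to diagnose variables and binders.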

FIG. 1 : ε binds α_n. (Schematically: inside β, a subtree γ has two daughters, ε and δ, with the occurrence α_n inside δ.)

For instance, if we go back to our previous example, it is obvious that there is an assignment f for which [[τ_{Mary kisses t_1}]] is not defined: it suffices to take the empty function as an assignment (or any partial function not defined on 1). But since [[τ_{whom Mary kisses t_1}]] no longer depends on an assignment, it is defined even for the empty assignment (or any assignment not defined on 1); therefore whom is a variable binder. Now, defining the notions of free and bound variable poses the problem of cases where, in the same expression, some occurrences of a variable are bound and others free. We therefore define freeness and boundness for occurrences (HK98, p. 118):

Definition 2 (Freeness)
Let α_n be an occurrence of a variable α in a tree β.
(a) α_n is free in β if no subtree γ of β meets the following two conditions:
i. γ contains α_n;
ii. there is an assignment a such that α is not in the domain of [[.]]^a, but γ is.
(b) α_n is bound in β if and only if α_n is not free in β.

Of course, if there were some subtree γ of β (including β itself) containing α_n and such that γ were defined for some assignment a which fails to give a value to α_n, that would amount to saying that γ contains a variable binder for α_n, and α_n would be bound. Thanks to this definition, it may be shown that:

Theorem 1  α_n is bound in β if and only if β contains a variable binder which c-commands α_n.

Proof (cf. fig. 1): suppose α_n is bound in β; then there is a subtree γ of β such that γ contains α_n, and there is an assignment a such that α is not in the domain of [[.]]^a but γ is. Let γ in fact be the smallest subtree with these properties. γ has two daughters: one, δ, which contains α_n, and another one, ε, which does not. Let a be an assignment such that γ, but not α, is in the domain of [[.]]^a. Because γ is the smallest subtree satisfying (i) and (ii), δ cannot satisfy them; and because (i) is true for δ, (ii) must therefore be false for it.
Therefore δ is not in the domain of [[.]]^a; this entails that ε is a variable binder, and it is precisely in a position to c-command α_n.

Heim & Kratzer can then define the semantic notion of binding:

Definition 3 (Binding)  Let β_n be a variable binder occurrence in a tree γ, and let α_m be a variable occurrence in the same tree γ which is bound in γ; then β_n binds α_m if and only if the sister node of β_n is the largest subtree of γ in which α_m is free.

We now have a good notion of a binder. But is that enough to deal with the numerous questions Montague coped with, like the interpretation of sentences with quantifiers? For a sentence like every boy kisses Mary, the translation of which was ∀x (boy(x) → kisses(x, Mary)), a binding relation is assumed. Where does it come from? What is the binder? As we have seen above, the translations of, respectively, the boy whom Mary kisses and Mary kisses a boy are rather similar: ιx. boy(x) ∧ kisses(Mary, x) and ∃x. boy(x) ∧ kisses(Mary, x). For the nominal phrase, this

FIG. 2 : a boy kisses Mary, after Quantifier Raising : [S [NP a boy]_1 [S t_1 [VP [V kisses] [NP Mary]]]]

FIG. 3 : the boy whom Mary kisses : [NP [Det the] [N boy [CP whom_1 [S [NP Mary] [VP [V kisses] t_1]]]]]

is due to the use of the Predicate Modification rule, a use which is made possible by the fact that boy(x) and kisses(Mary, x) are two properties which are applied to the same individual x. In kisses(Mary, x), the occurrence of the variable x comes from the trace in the object position of the verb. From what trace does the same variable x come in the case of the quantificational sentence? Necessarily, it must come from a trace, but we do not see what move could have left such a trace behind in this kind of sentence! From here comes the solution proposed very early by R. May and R. Fiengo (later endorsed by Chomsky), according to which quantificational sentences have a logical form derived by the quantifier raising (QR) transformation, which amounts to moving the quantificational phrase higher in the tree, so that it adjoins to the S (or IP) node: see fig. 2. Nevertheless, if we compare the tree of fig. 2 with the one associated with the NP, on figure 3, we may see that in the latter it is whom, and not boy, which is the binder coindexed with the trace. It is the presence of whom, as a variable binder, which transforms Mary kisses t_1 into the closed term λx. kisses(Mary, x). If we wish to have a similar construct for the quantificational sentence, we must have a similar element in its tree, in such a way that we could have two λ-terms, λx. boy(x) and λx. kisses(Mary, x), with the quantifier a applying to both, thus getting ∃x. boy(x) ∧ kisses(Mary, x); but what can play this role in this case? Heim & Kratzer propose to insert a binder coindexed with the trace for every move of a quantificational phrase, thus producing for instance the tree of fig. 4. We must of course remember that we keep the same analysis of quantifiers as the one performed in Montague's framework, that is, a quantifier like a or every has the type (e → t) → ((e → t) → t), which compels us to have two predicates of type (e → t) as arguments

FIG. 4 : The binder 1 : [S [NP a boy] [1 [S t_1 [VP [V kisses] [NP Mary]]]]]

of the determiner. By adopting this solution, we easily see how the meaning of such a quantificational S is obtained:

I([VP [V kisses] [NP Mary]]) = λy. ((y kisses Mary) = 1)
I([S t_1 [VP [V kisses] [NP Mary]]]) = ((x_1 kisses Mary) = 1)

By Predicate Abstraction:

I([1 [S t_1 [VP [V kisses] [NP Mary]]]]) = λx_1. ((x_1 kisses Mary) = 1)
I([NP [Det a] [N boy]]) = λg_{⟨e,t⟩}. ∃x. (boy(x) = 1) & (g(x) = 1)

and finally, by Functional Application:

I(τ_{a boy kisses Mary}) = [λg_{⟨e,t⟩}. ∃x. (boy(x) = 1) & (g(x) = 1)](λx_1. ((x_1 kisses Mary) = 1))
  = ∃x. (boy(x) = 1) & ([λx_1. ((x_1 kisses Mary) = 1)](x) = 1)
  = ∃x. (boy(x) = 1) & ((x kisses Mary) = 1)

Of course, Predicate Abstraction has been generalized:

Definition 4 (Predicate Abstraction, generalized)  If α is a branching node whose daughters are a binder bearing the index i and β(t_i), then [[α]]^M = λx ∈ D. [[β(t_i)]]^{M, g[i:=x]}.

It is worth noticing here that such a solution has strong advantages. For instance, let us take a case where the quantificational phrase is in the object position. As we know, in this case we have in principle a type mismatch:

[S [NP Mary] [VP [V kisses] [NP someone]]]

where someone is of type (e → t) → t and kisses of type e → (e → t), thus preventing any application of the Functional Application rule². Quantifier raising mixed with this introduction of an artificial binder gives a correct solution, since we now have:

[S [NP someone] [1 [S [NP Mary] [VP [V kisses] t_1]]]]

which yields the expected meaning (provided that, of course, we keep considering a trace t as belonging to the type e³). There is one point which still deserves attention: when, exactly, may or must such moves, which have become apparent in the case of quantificational phrases, occur? Heim and Kratzer call such a transformation adjunction, and they say that adjunctions may attach not only to S (or IP) nodes, but also (why not?) to VPs, CPs and even NPs. It is worthwhile here to notice that Heim & Kratzer's viewpoint is based on a strict distinction between two syntactic levels: the level of Surface Structure (SS), which may be seen as the syntactic level properly speaking, and the level of Logical Form (LF), which seems to be legitimated only by transformations like QR, which introduce a distortion with regard to the (phonetically) observed syntactic forms (since QR is interpreted only at LF and not at PF, the "phonetic form"). They show that binding may be defined at each level (syntactic binding at SS as opposed to semantic binding at LF), and they postulate, in their Binding Principle, that the two notions exactly correspond to each other.
Binding Principle : Let α and β be NPs, where β is not phonetically empty; then α binds β syntactically at SS if and only if α binds β semantically at LF.

2. Of course, Functional Composition could apply here, but this is another solution, which we shall examine later on, in the Categorial Grammar framework.
3. Not all traces are of this type: think of sentences where a constituent other than an NP is extracted, like a PP or even a VP. Let us keep these issues aside for the time being.
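Both quantifier computations above, subject position (a boy kisses Mary) and object position via QR (Mary kisses someone), can be sketched over the toy model. The generalized-quantifier encodings of "a" and "someone" below are mine, and "boy" is an invented predicate; the kissing facts are those of the lexicon given earlier:

```python
# Generalized quantifiers over a finite domain, sketching the QR repairs.
D = ["Ann", "Mary", "Paul", "Ibrahim"]
# (subject, object) pairs, as in the toy model earlier in the text:
KISS = {("Ann", "Paul"), ("Paul", "Ann"), ("Mary", "Ibrahim"), ("Ibrahim", "Mary")}
boy = lambda x: 1 if x in ("Paul", "Ibrahim") else 0       # invented predicate

kisses  = lambda x: (lambda y: 1 if (y, x) in KISS else 0)  # e -> (e -> t), object first
a       = lambda f: (lambda g: 1 if any(f(x) == 1 and g(x) == 1 for x in D) else 0)
someone = lambda g: 1 if any(g(x) == 1 for x in D) else 0   # (e -> t) -> t

# After QR plus an inserted binder, the quantifier's sister denotes a
# plain e -> t predicate, so Functional Application goes through:
a_boy_kisses_mary   = a(boy)(lambda x1: kisses("Mary")(x1))
mary_kisses_someone = someone(lambda x1: kisses(x1)("Mary"))
```

Without QR, someone could not be fed to kisses at all: the verb wants an individual, not a quantifier, which is exactly the type mismatch discussed above.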

Here the notion of semantic binding is slightly different from the one we presented above (which only involves a variable binder and a trace or a pronoun). On the syntactic side, we know that co-indexing comes from the movement of an NP, which may be seen as previously indexed and which leaves a trace with the same index behind it, at its original site; therefore an NP and a trace are co-indexed. That means that, syntactically, when looking at a quantificational sentence like Mary kisses a man, the NP a man and the trace it leaves are co-indexed: there is no variable binder at SS, and the representation is:

[S [NP a man]_1 [S [NP Mary] [VP [V kisses] t_1]]]

When passing to the Logical Form, we may of course keep this co-indexation, even if we add a supplementary node for the binder:

[S [NP a man]_1 [1 [S [NP Mary] [VP [V kisses] t_1]]]]

We may say in this configuration that the NP α (here a man) binds its trace t, even if it is, in fact, the binder 1 which does the job. This abus de langage is more clearly motivated if we now consider sentences with a reflexive, like a man defended himself, where the usual Binding Principles impose that himself be bound by a man. In this case, the Logical Form is:

[S [NP a man]_1 [1 [S t_1 [VP [V defended] [NP himself_1]]]]]

where it appears that a man and himself are bound by the same variable binder, which is the only way to give the expected reading. We can then propose a new definition of semantic binding between two NPs (and not between an NP and its trace, as previously):

Definition 5 (Semantic binder)  An NP α semantically binds an NP β if and only if β and the trace of α are semantically bound by the same variable binder.

This explains why, in the principle above, we excluded traces as values of β. The need for QR would then be a consequence of this principle: the quantifier could bind an NP β only by putting the quantificational phrase α in a position where it c-commands β, therefore by moving it upward. But what if we tend to dispense with such a distinction (as in the most modern formulations of Chomsky's Minimalist Program)? If the LF level is created only for the sake of some special transformations like QR, shouldn't we think it is ad hoc? Could it be possible to avoid the redundancy involved by two notions of binding, which in the end cover the same cases, and to keep only one universal notion, which could be based on the familiar use of binding in Logic? We shall tend in the sequel towards a monostratal view, according to which a single level of syntactic analysis (and not three, as in the former theory of Deep Structure (DS) / Surface Structure (SS) / Logical Form (LF)) is opposed to semantic evaluation. Moreover, we arrive now at a stage where analogies with logical proofs may show their advantages. We will therefore first make this analogy apparent and provide more logical formulations of the theory, before turning to solutions which help us get rid of these distinct levels.

1.4 Towards a proof-theoretic account of binding

Suppose that, instead of inserting a new node 1 at the specifier position of S, we simply create a unary branch:

[S [NP a man] [1 [S [NP Mary] [VP [V kisses] t_1]]]]

and suppose we rotate this tree by 180°, transforming it into a proof tree, with the leaves a man, Mary, kisses and t_1 at the top and S at the bottom. Suppose then that grammatical rules are expressed as deduction rules. Lexical ones:

Mary : NP        kisses : V        a man : NP

Grammar rules:

V  NP ⊢ VP        NP  VP ⊢ S

Traces are seen as hypotheses, and there is a discharge rule which is semantically interpreted as the binding of the variable associated with some hypothesis:

  [A : x]
     ⋮
   B : γ
 ───────────────
 A ⊸ B : λx.γ

(see section ??). Our inverted tree may then be seen as a proof using these rules:

                  kisses : V (Lex)    [NP : x] (Hyp)
                  ────────────────────────────────── R1
 Mary : NP (Lex)                VP
 ───────────────────────────────── R2
                 S
 ───────────────────── [Discharging]    a man : NP (Lex)
       NP ⊸ S
 ──────────────────────────────────────────────────────── R3
                 S

where, in fact, the unary node 1 corresponds to the [Discharging] step. We use ⊸ in that system since, as said in ??, exactly one hypothesis must be discharged. This opens the field to the logical approach to syntax and semantics that we shall study in the next sections of this book. One such approach is provided by Categorial Grammar, a formalism introduced very early by Polish logicians, after Husserl's philosophical work and Leśniewski's proposals with regard to the notion of semantic type. By doing so, we obtain the Binding Principle for free, since the surface structure and the logical form coincide by construction, and binding is simply associated with the use of a discharge rule, that is, what M. Moortgat calls a rule of hypothetical reasoning. Modulo some generalization with regard to the strict consumption discipline, we could arrive at a proposition like:

Definition 6 (Semantic binding, revised)  An NP α semantically binds an NP β semantically represented by a variable x if and only if α applies to a λ-term λx.φ reflecting a proof the last step of which is an introduction of the connective ⊸.
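This hypothetical-reasoning style can be given a minimal executable sketch. The rule names R1/R2, the category strings, and the toy kissing facts below are all invented for illustration; the point is only that the trace enters as a hypothesis and that discharging it builds the λ-term for NP ⊸ S:

```python
# Judgements are (category, meaning) pairs; the trace is a hypothesis
# [NP : x], and discharging it abstracts over x.
KISS = {("Mary", "Paul")}                        # (subject, object): Mary kisses Paul
kisses = lambda x: (lambda y: 1 if (y, x) in KISS else 0)

def r1(v, np):                                    # V, NP  =>  VP
    assert v[0] == "V" and np[0] == "NP"
    return ("VP", v[1](np[1]))

def r2(np, vp):                                   # NP, VP  =>  S
    assert np[0] == "NP" and vp[0] == "VP"
    return ("S", vp[1](np[1]))

def derive_s(hyp):
    """A proof of S : Mary kisses t_1 from the hypothesis t_1 : NP."""
    return r2(("NP", "Mary"), r1(("V", kisses), hyp))

def discharge(proof):
    """[NP : x] ... S : m   =>   NP -o S : lambda x. m."""
    return ("NP -o S", lambda x: proof(("NP", x))[1])

cat, meaning = discharge(derive_s)
```

The discharged result can then be consumed by a quantificational NP such as a man, exactly as in the QR analysis, but here binding is nothing more than the λ-abstraction performed by the discharge rule.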