CS344: Introduction to Artificial Intelligence. Pushpak Bhattacharyya, CSE Dept., IIT Bombay. Lecture 20-21: Natural Language Parsing

Parsing of Sentences

Are sentences flat linear structures? Why trees? Is there a principle governing branching? When should a constituent give rise to children? What is the hierarchy-building principle?

Structure Dependency: A Case Study (Interrogative Inversion)

(1) John will solve the problem. Will John solve the problem?

Declarative → Interrogative
(2) a. Susan must leave. Must Susan leave?
b. Harry can swim. Can Harry swim?
c. Mary has read the book. Has Mary read the book?
d. Bill is sleeping. Is Bill sleeping?

The section "Structure dependency: a case study" is adapted from a talk given by Howard Lasnik (2003) at Delhi University.

Interrogative inversion, structure independent (1st attempt)

(3) Interrogative inversion process: Beginning with a declarative, invert the first and second words to construct an interrogative.

Declarative → Interrogative
(4) a. The woman must leave. *Woman the must leave?
b. A sailor can swim. *Sailor a can swim?
c. No boy has read the book. *Boy no has read the book?
d. My friend is sleeping. *Friend my is sleeping?

Interrogative inversion: correct pairings. Compare the incorrect pairings in (4) with the correct pairings in (5):

Declarative → Interrogative
(5) a. The woman must leave. Must the woman leave?
b. A sailor can swim. Can a sailor swim?
c. No boy has read the book. Has no boy read the book?
d. My friend is sleeping. Is my friend sleeping?

Interrogative inversion, structure independent (2nd attempt)

(6) Interrogative inversion process: Beginning with a declarative, move the auxiliary verb to the front to construct an interrogative.

Declarative → Interrogative
(7) a. Bill could be sleeping. *Be Bill could sleeping? Could Bill be sleeping?
b. Mary has been reading. *Been Mary has reading? Has Mary been reading?
c. Susan should have left. *Have Susan should left? Should Susan have left?

Structure independent (3rd attempt)

(8) Interrogative inversion process: Beginning with a declarative, move the first auxiliary verb to the front to construct an interrogative.

Declarative → Interrogative
(9) a. The man who is here can swim. *Is the man who here can swim?
b. The boy who will play has left. *Will the boy who play has left?

Structure dependent correct pairings. For the above examples, fronting the second auxiliary verb gives the correct form:

Declarative → Interrogative
(10) a. The man who is here can swim. Can the man who is here swim?
b. The boy who will play has left. Has the boy who will play left?

Natural transformations are structure dependent.

(11) Does the child acquiring English learn these properties?
(12) We are not dealing with a peculiarity of English. No known human language has a transformational process that would produce pairings like those in (4), (7) and (9), repeated below:

(4) a. The woman must leave. *Woman the must leave?
(7) a. Bill could be sleeping. *Be Bill could sleeping?
(9) a. The man who is here can swim. *Is the man who here can swim?

Deeper trees are needed for capturing sentence structure. This won't do! Flat structure:

[NP The [AP big] book [PP of poems] [PP with the blue cover]]

[The big book of poems with the blue cover] is on the table.

Other languages:

English: [NP The [AP big] book [PP of poems] [PP with the blue cover]]
Hindi: niil jilda vaalii kavita kii badii kitaab, as in [niil jilda vaalii kavita kii kitaab]

Other languages (contd.):

English: [NP The [AP big] book [PP of poems] [PP with the blue cover]]
Bengali: niil malaat deovaa kavitar motaa bai ti, as in [niil malaat deovaa kavitar bai ti]

The PPs are at the same level: flat with respect to the head word book. There is no distinction in terms of dominance or c-command.

[NP The [AP big] book [PP of poems] [PP with the blue cover]]

[The big book of poems with the blue cover] is on the table.

The constituency test of replacement runs into problems.

One-replacement: I bought the big [book of poems with the blue cover], not the small [one]. Here one-replacement targets "book of poems with the blue cover".

Another one-replacement: I bought the big [book of poems] with the blue cover, not the small [one] with the red cover. Here one-replacement targets "book of poems".

More deeply embedded structure:

[NP The [N1 [AP big] [N2 [N3 [N book] [PP of poems]] [PP with the blue cover]]]]

To target N1: I want [NP this [N′ big book of poems with the red cover]] and not [NP that [N′ one]].

Bar-level projections add intermediate structures:

NP → (D) N′
N′ → (AP) N′
N′ → N′ (PP)
N′ → N (PP)

Parentheses ( ) indicate optionality.

The new rules produce this tree:

[NP The [N1 [AP big] [N2 [N3 [N book] [PP of poems]] [PP with the blue cover]]]]

As opposed to this flat tree:

[NP The [AP big] book [PP of poems] [PP with the blue cover]]

V-bar. What is the element for verbs corresponding to one-replacement for nouns? do-so (or did-so) replacement.


I [eat beans with a fork]. Flat VP: [VP [V eat] [NP beans] [PP with a fork]]

There is no constituent that groups together V and NP and excludes the PP.

Need for intermediate constituents: I [eat beans] with a fork, but Ram [does so] with a spoon.

Rules:
VP → V′
V′ → V′ (PP)
V′ → V (NP)

Tree: [VP [V1 [V2 [V eat] [NP beans]] [PP with a fork]]]

Here does-so targets V2 ([eat beans]).

How to target V1: I [eat beans with a fork], and Ram [does so] too.

Same rules: VP → V′; V′ → V′ (PP); V′ → V (NP)

Tree: [VP [V1 [V2 [V eat] [NP beans]] [PP with a fork]]]

Here does-so targets V1 ([eat beans with a fork]).

Parsing Algorithms

A simplified grammar:

S → NP VP
NP → DT N | N
VP → V ADV | V

A segment of English grammar:

S′ → (C) S
S → {NP/S′} VP
VP → (AP+) (VAUX) V (AP+) ({NP/S′}) (AP+) (PP+) (AP+)
NP → (D) (AP+) N (PP+)
PP → P NP
AP → (AP) A

Example sentence: People laugh
1      2      3 (these are positions)

Lexicon:
people: N, V
laugh: N, V

This indicates that both noun and verb are possible for each of the words "people" and "laugh".

Top-Down Parsing

    State              Backup State     Action
1.  ((S) 1)            -                -
2.  ((NP VP) 1)        -                -
3a. ((DT N VP) 1)      ((N VP) 1)       -
3b. ((N VP) 1)         -                -
4.  ((VP) 2)           -                Consume "people"
5a. ((V ADV) 2)        ((V) 2)          -
6.  ((ADV) 3)          ((V) 2)          Consume "laugh"
5b. ((V) 2)            -                -
6.  ((.) 3)            -                Consume "laugh"

The number in each state is the position of the input pointer. Termination condition: all input consumed and no symbols remaining. Note: input symbols can be pushed back on backtracking.
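The trace above can be sketched as a depth-first top-down recognizer with backtracking. The grammar and lexicon are taken from the slides; the function name and recursive formulation are our own (a sketch, not the slides' exact algorithm).

```python
# Top-down backtracking recognizer for the simplified grammar
# S -> NP VP; NP -> DT N | N; VP -> V ADV | V (from the slides).
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["DT", "N"], ["N"]],
    "VP": [["V", "ADV"], ["V"]],
}
LEXICON = {"people": {"N", "V"}, "laugh": {"N", "V"}}

def parse(symbols, words):
    """Return True if the symbol list can derive exactly the word list."""
    if not symbols:
        return not words          # success only when the input is exhausted
    head, rest = symbols[0], symbols[1:]
    if head in GRAMMAR:           # non-terminal: try expansions in rule order
        return any(parse(exp + rest, words) for exp in GRAMMAR[head])
    # terminal (POS tag): consume one word if the lexicon allows it
    return bool(words) and head in LEXICON[words[0]] and parse(rest, words[1:])

print(parse(["S"], ["people", "laugh"]))   # True
```

Trying expansions in rule order mirrors the "textual precedence" behaviour discussed below: NP → DT N is attempted (and fails) before NP → N succeeds.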

Discussion of top-down parsing: This kind of searching is goal driven. It gives importance to textual precedence (rule precedence). It has no regard for the data a priori (useless expansions are made).

Bottom-Up Parsing. Some conventions:

N_12: the subscripts represent positions; an N spanning positions 1 to 2.
S_1? -> NP_12 VP_2?: the end position is unknown; the work on the left part (NP) is done, while the work on the right part (VP) remains.

Bottom-Up Parsing (pictorial representation)

People   laugh
1        2        3

Lexical level: N_12, V_12 (people); N_23, V_23 (laugh)
NP_12 -> N_12    NP_23 -> N_23
VP_12 -> V_12    VP_23 -> V_23
S_1? -> NP_12 VP_2?, completed as S_13 -> NP_12 VP_23

Problem with top-down parsing: left recursion. Suppose the grammar has a rule A → A B. Then the expansion proceeds as follows:

((A) k) → ((A B) k) → ((A B B) k) → ...

and never terminates, since no input is ever consumed.
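The runaway expansion can be made concrete with a short sketch (the function name and step bound are ours): each rewrite replaces the leftmost A without consuming any input, so the symbol list only grows.

```python
# Why naive top-down parsing loops on a left-recursive rule A -> A B:
# the leftmost A is rewritten again and again, and no word is consumed.
def expand_leftmost(symbols, rule=("A", ["A", "B"]), steps=3):
    lhs, rhs = rule
    history = [symbols]
    for _ in range(steps):
        i = symbols.index(lhs)                       # leftmost occurrence of A
        symbols = symbols[:i] + rhs + symbols[i + 1:]
        history.append(symbols)
    return history

for state in expand_leftmost(["A"]):
    print(state)        # ['A'], ['A','B'], ['A','B','B'], ... A never goes away
```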

Combining top-down and bottom-up strategies

Top-Down Bottom-Up Chart Parsing. Combines the advantages of top-down and bottom-up parsing. Does not work in the case of left recursion.

Example: People laugh
people: noun, verb
laugh: noun, verb

Grammar:
S → NP VP
NP → DT N | N
VP → V ADV | V

Transitive closure: People laugh, positions 1 2 3.

At position 1: S → • NP VP; NP → • DT N; NP → • N
At position 2: NP → N •; S → NP • VP; VP → • V ADV; VP → • V
At position 3: VP → V •; S → NP VP • (success)

Arcs in parsing. Each arc represents a chart entry which records completed work (left of the dot •) and expected work (right of the dot •).

Example: People laugh loudly, positions 1 2 3 4.

At position 1: S → • NP VP; NP → • DT N; NP → • N
At position 2: NP → N •; S → NP • VP; VP → • V ADV; VP → • V
At position 3: VP → V •; VP → V • ADV; S → NP VP •
At position 4: VP → V ADV •; S → NP VP •
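The dotted-rule mechanics above can be sketched as a compact Earley-style chart recognizer: predict expands expectations top-down, scan consumes a word, and complete advances dots bottom-up. The grammar and lexicon follow the slides; the implementation details (item representation, agenda loop) are our own sketch, not the slides' exact algorithm.

```python
# Earley-style chart recognizer: items are (lhs, rhs, dot, origin),
# where everything left of `dot` is completed work and everything
# right of it is expected work, as on the arcs described above.
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["N"], ["DT", "N"]],
    "VP": [["V", "ADV"], ["V"]],
}
LEXICON = {"people": {"N", "V"}, "laugh": {"N", "V"}, "loudly": {"ADV"}}

def earley_recognize(words, start="S"):
    n = len(words)
    chart = [set() for _ in range(n + 1)]
    for rhs in GRAMMAR[start]:
        chart[0].add((start, tuple(rhs), 0, 0))
    for i in range(n + 1):
        agenda = list(chart[i])
        while agenda:
            lhs, rhs, dot, origin = agenda.pop()
            if dot < len(rhs) and rhs[dot] in GRAMMAR:        # predict
                for exp in GRAMMAR[rhs[dot]]:
                    new = (rhs[dot], tuple(exp), 0, i)
                    if new not in chart[i]:
                        chart[i].add(new); agenda.append(new)
            elif dot < len(rhs):                              # scan a POS tag
                if i < n and rhs[dot] in LEXICON.get(words[i], ()):
                    chart[i + 1].add((lhs, rhs, dot + 1, origin))
            else:                                             # complete
                for l2, r2, d2, o2 in list(chart[origin]):
                    if d2 < len(r2) and r2[d2] == lhs:
                        new = (l2, r2, d2 + 1, o2)
                        if new not in chart[i]:
                            chart[i].add(new); agenda.append(new)
    return any(lhs == start and dot == len(rhs) and origin == 0
               for lhs, rhs, dot, origin in chart[n])

print(earley_recognize(["people", "laugh", "loudly"]))   # True
```

Running it on "people laugh loudly" succeeds via S → NP VP • with VP → V ADV •, matching the chart shown above.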

Dealing with structural ambiguity. Multiple parses for a sentence:

The man saw the boy with a telescope.
The man saw the mountain with a telescope.
The man saw the boy with the ponytail.

At the level of syntax, all these sentences are ambiguous, but semantics can disambiguate the 2nd and 3rd sentences.

The Prepositional Phrase (PP) Attachment Problem

V NP1 P NP2 (here P means preposition)

Does [P NP2] attach to NP1, or does it attach to V?

Parse trees for a structurally ambiguous sentence. Let the grammar be:

S → NP VP
NP → DT N | DT N PP | N
PP → P NP
VP → V NP PP | V NP

For the sentence: I saw a boy with a telescope.

Parse Tree 1 (PP attached to the NP):

[S [NP [N I]] [VP [V saw] [NP [Det a] [N boy] [PP [P with] [NP [Det a] [N telescope]]]]]]

Parse Tree 2 (PP attached to the VP):

[S [NP [N I]] [VP [V saw] [NP [Det a] [N boy]] [PP [P with] [NP [Det a] [N telescope]]]]]