CS626: NLP, Speech and the Web. Pushpak Bhattacharyya, CSE Dept., IIT Bombay. Lecture 14: Parsing Algorithms, 30th August 2012

The NLP Trinity (figure): three dimensions of the field. Problem (Morph Analysis, Part of Speech Tagging, Parsing, Semantics), Language (Hindi, Marathi, English, French), and Algorithm (HMM, MEMM, CRF).

A note on Language Modeling
Example sentence: ^ The tortoise beat the hare in the race.

Guided by frequency         Guided by language knowledge           Guided by world knowledge
N-gram (n=3)                CFG              Probabilistic CFG     Dependency Grammar   Prob. DG
^ the tortoise     5*10^-3  S  -> NP VP      S  -> NP VP    1.0    Semantic roles       Semantic roles
the tortoise beat  3*10^-2  NP -> DT N       NP -> DT N     0.5    (agt, obj, sen,      with
tortoise beat the  7*10^-5  VP -> V NP PP    VP -> V NP PP  0.4    etc.); semantic      probabilities
beat the hare      5*10^-6  PP -> P NP       PP -> P NP     1.0    relations always
                                                                   hold between heads

Parse Tree

(S (NP (DT The) (N tortoise))
   (VP (V beat)
       (NP (DT the) (N hare))
       (PP (P in)
           (NP (DT the) (N race)))))
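
To make the probabilistic CFG concrete: the probability of this tree is the product of the probabilities of the rules used in it. A minimal sketch (hypothetical Python, not from the lecture), using the rule probabilities from the Probabilistic CFG column of the table above; lexical rule probabilities are omitted because the slides do not give them:

```python
# Rule probabilities taken from the Probabilistic CFG column above.
RULE_PROB = {
    ("S", ("NP", "VP")): 1.0,
    ("NP", ("DT", "N")): 0.5,
    ("VP", ("V", "NP", "PP")): 0.4,
    ("PP", ("P", "NP")): 1.0,
}

# Internal rules used by the parse tree of
# "The tortoise beat the hare in the race" (lexical rules omitted).
TREE_RULES = [
    ("S", ("NP", "VP")),
    ("NP", ("DT", "N")),        # the tortoise
    ("VP", ("V", "NP", "PP")),
    ("NP", ("DT", "N")),        # the hare
    ("PP", ("P", "NP")),
    ("NP", ("DT", "N")),        # the race
]

p = 1.0
for rule in TREE_RULES:
    p *= RULE_PROB[rule]
print(p)  # 1.0 * 0.5 * 0.4 * 0.5 * 1.0 * 0.5 = 0.05
```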

UNL Expression

agt(beat@past, tortoise@def)
obj(beat@past, hare@def)
scn(beat@past, race@def)     (scn = scene)

Purpose of LM
- Prediction of the next word (speech processing)
- Language identification (for the same script)
- Belongingness check (parsing)
P(NP -> DT N) is the probability that the yield of the non-terminal NP is DT N; in a PCFG it is estimated as Count(NP -> DT N) / Count(NP -> anything).

Computation of Parsing

Parsing
- Essentially a yes/no problem (belongingness); the parse tree is a side effect.
- Two general ways: expand from the grammar, or resolve through the data.
Language Representation
- Intrinsic representation: a grammar for STR: S -> aSb | ε (compact representation; lower Kolmogorov complexity).
- Extrinsic representation (enumerated representation): STR = {ε, ab, aabb, ..., a^n b^n, ...}.
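
A belongingness check really is just a yes/no procedure. A tiny recognizer sketch (hypothetical Python) for STR = { a^n b^n }, following the grammar S -> aSb | ε directly:

```python
def in_str_language(s):
    """Membership check for STR = { a^n b^n : n >= 0 } via S -> aSb | eps."""
    if s == "":
        return True                      # S -> eps
    if s.startswith("a") and s.endswith("b"):
        return in_str_language(s[1:-1])  # S -> a S b
    return False

for w in ["", "ab", "aabb", "aab", "ba"]:
    print(repr(w), in_str_language(w))   # True, True, True, False, False
```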

Points to consider:
- Should POS tagging precede parsing? Is POS tagging necessary for parsing?
- POS tagging increases implementation efficiency.
Data: "People laugh"
Lexicon: People - Noun, Verb; laugh - Noun, Verb
If POS tagging is done before parsing, the parser does not have to go back to the lexicon again and again.

Two issues are at the crux of parsing:
- Ambiguity in data
- Ambiguity in grammar
Parsing algorithms:
- Top-down parsing: also called predictive, expectation-driven, theory-driven, or grammar-driven parsing; suffers from left recursion.
- Bottom-up parsing: data-driven parsing; ambiguity in POS tags can lead to useless steps while parsing.
- Chart parsing

Example sentence: 1 People 2 laugh 3 loudly 4 (the numbers mark positions between words)
- Multiple parse trees are possible for ambiguous sentences, e.g. "The man saw a boy with a telescope."
- Partial parses are also important, e.g. for textual entailment and question answering: given "People laugh loudly", the question "Who laughs?" is answered by the partial parse "People laugh".

Grammar and Parsing Algorithms

A simplified grammar
S  -> NP VP
NP -> DT N | N
VP -> V ADV | V

Example Sentence
People laugh
1      2     3   (these are positions)
Lexicon: People - N, V; laugh - N, V
The lexicon indicates that both noun and verb readings are possible for "People" (and likewise for "laugh").

Top-Down Parsing

     State              Backup State     Action
---------------------------------------------------------------------
1.   ((S) 1)            -                -        (the number is the position of the input pointer)
2.   ((NP VP) 1)        -                -
3a.  ((DT N VP) 1)      ((N VP) 1)       -
3b.  ((N VP) 1)         -                -
4.   ((VP) 2)           -                Consume "People"
5a.  ((V ADV) 2)        ((V) 2)          -
6.   ((ADV) 3)          ((V) 2)          Consume "laugh"
5b.  ((V) 2)            -                -
6.   ((.) 3)            -                Consume "laugh"

Termination condition: all input consumed and no symbols remaining.
Note: input symbols can be pushed back (the first step 6 fails, so "laugh" is pushed back and the parser resumes from backup state 5b).
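
The trace above is exactly what a backtracking recursive-descent (top-down) parser does: expand the leftmost symbol, and on failure restore the backup state. A minimal sketch for the simplified grammar (hypothetical Python, not the lecture's implementation):

```python
GRAMMAR = {
    "S": [["NP", "VP"]],
    "NP": [["DT", "N"], ["N"]],
    "VP": [["V", "ADV"], ["V"]],
}
LEXICON = {"people": {"N", "V"}, "laugh": {"N", "V"}}

def parse(symbols, pos, words):
    """Expand `symbols` to cover words[pos:]; backtrack on failure.

    (symbols, pos) corresponds to a state like ((NP VP) 1) in the trace.
    """
    if not symbols:
        return pos == len(words)            # success iff all input consumed
    first, rest = symbols[0], symbols[1:]
    if first in GRAMMAR:                    # nonterminal: try each expansion
        return any(parse(exp + rest, pos, words) for exp in GRAMMAR[first])
    # preterminal: consume a word if its lexical category matches
    if pos < len(words) and first in LEXICON[words[pos]]:
        return parse(rest, pos + 1, words)
    return False

print(parse(["S"], 0, "people laugh".split()))  # True
```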

Discussion for Top-Down Parsing
- This kind of searching is goal-driven.
- It gives importance to textual precedence (rule precedence).
- It has no regard for the data a priori, so useless expansions are made.

Bottom-Up Parsing
Some conventions:
- Subscripts represent positions: N12 is an N spanning positions 1 to 2.
- S1? -> NP12 • VP2? : the end position is unknown; the work to the left of the dot is done, the work to the right remains.

Bottom-Up Parsing (pictorial representation)

People           laugh
1        2       3
N12              N23
V12              V23
NP12 -> N12      NP23 -> N23
VP12 -> V12      VP23 -> V23
S1?  -> NP12 • VP2?
S13  -> NP12 VP23
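
In code, the same process is: seed each one-word span with its lexical categories, then keep applying rules over adjacent spans until no new (category, span) fact can be derived. A bottom-up sketch (hypothetical Python), with 1-based positions matching the N12/VP23 notation:

```python
GRAMMAR = [
    ("S", ("NP", "VP")),
    ("NP", ("DT", "N")), ("NP", ("N",)),
    ("VP", ("V", "ADV")), ("VP", ("V",)),
]
LEXICON = {"people": {"N", "V"}, "laugh": {"N", "V"}}

def bottom_up(words):
    # spans[(i, j)] = categories covering positions i..j (1-based)
    spans = {(i, i + 1): set(LEXICON[w]) for i, w in enumerate(words, start=1)}
    changed = True
    while changed:                           # iterate to a fixpoint
        changed = False
        for (i, j), cats in list(spans.items()):
            for lhs, rhs in GRAMMAR:
                if len(rhs) == 1 and rhs[0] in cats and lhs not in cats:
                    cats.add(lhs)            # unary rule, e.g. NP -> N
                    changed = True
            for (k, l), right in list(spans.items()):
                if k != j:
                    continue                 # only adjacent spans combine
                for lhs, rhs in GRAMMAR:
                    if len(rhs) == 2 and rhs[0] in cats and rhs[1] in right:
                        cell = spans.setdefault((i, l), set())
                        if lhs not in cell:
                            cell.add(lhs)    # binary rule, e.g. S -> NP VP
                            changed = True
    return spans

spans = bottom_up("people laugh".split())
print(sorted((c, i, j) for (i, j), cs in spans.items() for c in cs))
# ('S', 1, 3) appears, so "people laugh" belongs to the language
```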

Problem with Top-Down Parsing: Left Recursion
Suppose the grammar has the rule A -> AB. Then the expansion proceeds without consuming any input:
((A) K) -> ((A B) K) -> ((A B B) K) -> ... ad infinitum
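
The standard remedy is to transform the grammar: immediate left recursion A -> A B | C can be rewritten as A -> C A', A' -> B A' | ε, which generates the same language. A sketch of that classic transformation (hypothetical Python):

```python
def eliminate_left_recursion(lhs, productions):
    """Rewrite A -> A B | C  as  A -> C A',  A' -> B A' | eps.

    `productions` is a list of right-hand sides, each a list of symbols.
    """
    recursive = [p[1:] for p in productions if p and p[0] == lhs]  # the B parts
    base = [p for p in productions if not p or p[0] != lhs]        # the C parts
    if not recursive:
        return {lhs: productions}
    new = lhs + "'"
    return {
        lhs: [b + [new] for b in base],
        new: [r + [new] for r in recursive] + [[]],  # [] stands for epsilon
    }

print(eliminate_left_recursion("A", [["A", "B"], ["C"]]))
# {'A': [['C', "A'"]], "A'": [['B', "A'"], []]}
```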

Combining top-down and bottom-up strategies

Top-Down Bottom-Up Chart Parsing
- Combines the advantages of top-down and bottom-up parsing.
- Because completed and in-progress constituents are recorded in the chart, it does not loop in the case of left recursion (unlike pure top-down parsing).
e.g. People laugh
People - noun, verb
laugh - noun, verb
Grammar:
S  -> NP VP
NP -> DT N | N
VP -> V ADV | V

Transitive Closure

People          laugh
1               2               3
S  -> • NP VP   NP -> N •       VP -> V •
NP -> • DT N    S  -> NP • VP   S  -> NP VP •  (success)
NP -> • N       VP -> • V ADV
                VP -> • V

Arcs in Parsing
Each arc represents a chart entry (a dotted rule) which records:
- Completed work (left of the dot •)
- Expected work (right of the dot •)

Example

People          laugh           loudly
1               2               3               4
S  -> • NP VP   NP -> N •       VP -> V •       VP -> V ADV •
NP -> • DT N    S  -> NP • VP   VP -> V • ADV   S  -> NP VP •  (success)
NP -> • N       VP -> • V ADV   S  -> NP VP •
                VP -> • V
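
The dotted rules in these charts are exactly the items of an Earley-style parser, maintained by the usual predictor, scanner, and completer steps. A compact recognizer sketch for the same toy grammar (hypothetical Python, not the course's code):

```python
GRAMMAR = {
    "S": [("NP", "VP")],
    "NP": [("DT", "N"), ("N",)],
    "VP": [("V", "ADV"), ("V",)],
}
LEXICON = {"people": {"N", "V"}, "laugh": {"N", "V"}, "loudly": {"ADV"}}

def earley_recognize(words):
    # An item (lhs, rhs, dot, start) is the dotted rule lhs -> rhs with
    # the dot at position `dot`, begun at input position `start`.
    chart = [set() for _ in range(len(words) + 1)]
    chart[0].add(("GAMMA", ("S",), 0, 0))          # augmented start rule
    for i in range(len(words) + 1):
        agenda = list(chart[i])
        while agenda:
            lhs, rhs, dot, start = agenda.pop()
            if dot < len(rhs) and rhs[dot] in GRAMMAR:      # predictor
                for prod in GRAMMAR[rhs[dot]]:
                    item = (rhs[dot], prod, 0, i)
                    if item not in chart[i]:
                        chart[i].add(item); agenda.append(item)
            elif dot < len(rhs):                            # scanner
                if i < len(words) and rhs[dot] in LEXICON[words[i]]:
                    chart[i + 1].add((lhs, rhs, dot + 1, start))
            else:                                           # completer
                for l2, r2, d2, s2 in list(chart[start]):
                    if d2 < len(r2) and r2[d2] == lhs:
                        item = (l2, r2, d2 + 1, s2)
                        if item not in chart[i]:
                            chart[i].add(item); agenda.append(item)
    return ("GAMMA", ("S",), 1, 0) in chart[len(words)]

print(earley_recognize("people laugh loudly".split()))      # True
print(earley_recognize("people laugh".split()))             # True
```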

Dealing With Structural Ambiguity
Multiple parses for a sentence:
1. The man saw the boy with a telescope.
2. The man saw the mountain with a telescope.
3. The man saw the boy with the ponytail.
At the level of syntax, all these sentences are ambiguous, but semantics can disambiguate the 2nd and 3rd sentences.

Prepositional Phrase (PP) Attachment Problem
V NP1 P NP2 (here P is a preposition)
Does NP2 attach to NP1, or does NP2 attach to V?

Parse Trees for a Structurally Ambiguous Sentence
Let the grammar be:
S  -> NP VP
NP -> DT N | DT N PP
PP -> P NP
VP -> V NP PP | V NP
For the sentence "I saw a boy with a telescope":

Parse Tree - 1

(S (NP (N I))
   (VP (V saw)
       (NP (Det a) (N boy)
           (PP (P with)
               (NP (Det a) (N telescope))))))

Parse Tree - 2

(S (NP (N I))
   (VP (V saw)
       (NP (Det a) (N boy))
       (PP (P with)
           (NP (Det a) (N telescope)))))

Xerox Parsers
- XLE Parser (freely available)
- XIP Parser (expensive)
Language resource consortia:
- USA: Linguistic Data Consortium (LDC at UPenn)
- Europe: European Language Resources Association (ELRA)
- India: Linguistic Data Consortium for Indian Languages (LDC-IL)

Parsing Structural Ambiguity

Parsing for Structurally Ambiguous Sentences
Sentence: I saw a boy with a telescope
Grammar:
S    -> NP VP
NP   -> ART N | ART N PP | PRON
VP   -> V NP PP | V NP
PP   -> P NP
ART  -> a | an | the
N    -> boy | telescope
PRON -> I
V    -> saw
P    -> with

Ambiguous Parses
Two possible parses:

PP attached to the verb (i.e., I used a telescope to see):
(S (NP (PRON I))
   (VP (V saw)
       (NP (ART a) (N boy))
       (PP (P with)
           (NP (ART a) (N telescope)))))

PP attached to the noun (i.e., the boy had a telescope):
(S (NP (PRON I))
   (VP (V saw)
       (NP (ART a) (N boy)
           (PP (P with)
               (NP (ART a) (N telescope))))))
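
A chart parser enumerates both attachments in one run. For illustration (not the course's code), NLTK's chart parser with the grammar above produces exactly these two trees, assuming NLTK is installed:

```python
import nltk

# The grammar from the slide, in NLTK's CFG notation.
grammar = nltk.CFG.fromstring("""
S -> NP VP
NP -> ART N | ART N PP | PRON
VP -> V NP PP | V NP
PP -> P NP
ART -> 'a' | 'an' | 'the'
N -> 'boy' | 'telescope'
PRON -> 'I'
V -> 'saw'
P -> 'with'
""")

parser = nltk.ChartParser(grammar)
for tree in parser.parse("I saw a boy with a telescope".split()):
    print(tree)   # one verb-attachment tree and one noun-attachment tree
```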

Top Down Parse

State                   Backup State               Action                  Comments
1.  ((S) 1)                                                                Use S -> NP VP
2.  ((NP VP) 1)                                                            Use NP -> ART N | ART N PP | PRON
3.  ((ART N VP) 1)      (a) ((ART N PP VP) 1)                              ART does not match "I";
                        (b) ((PRON VP) 1)                                  backup state (b) used
3B. ((PRON VP) 1)
4.  ((VP) 2)                                       Consumed "I"
5.  ((V NP PP) 2)       ((V NP) 2)                                         Verb attachment rule used
6.  ((NP PP) 3)                                    Consumed "saw"
7.  ((ART N PP) 3)      (a) ((ART N PP PP) 3)
                        (b) ((PRON PP) 3)
8.  ((N PP) 4)                                     Consumed "a"
9.  ((PP) 5)                                       Consumed "boy"
10. ((P NP) 5)
11. ((NP) 6)                                       Consumed "with"
12. ((ART N) 6)         (a) ((ART N PP) 6)
                        (b) ((PRON) 6)
13. ((N) 7)                                        Consumed "a"
14. (( ) 8)                                        Consumed "telescope"    Finish parsing

Top Down Parsing - Observations
- Top-down parsing gave us the verb-attachment parse tree (i.e., I used a telescope).
- To obtain the alternate parse tree, the backup state in step 5 has to be invoked.
- Is there an efficient way to obtain all parses?

Bottom Up Parse

I     saw   a     boy   with  a     telescope
1     2     3     4     5     6     7         8

Colour scheme (in the original slides): blue for the normal parse, green for the verb-attachment parse, purple for the noun-attachment parse, red for an invalid parse.

Edges added to the chart, position by position:

NP12 -> PRON12                     ("I")
S1?  -> NP12 • VP2?
VP2? -> V23 • NP3?                 ("saw")
VP2? -> V23 • NP3? PP??
NP35 -> ART34 N45                  ("a boy")
NP3? -> ART34 N45 • PP5?
VP25 -> V23 NP35
S15  -> NP12 VP25                  (complete parse of "I saw a boy")
VP2? -> V23 NP35 • PP5?
PP5? -> P56 • NP6?                 ("with")
NP68 -> ART67 N78                  ("a telescope")
NP6? -> ART67 N78 • PP8?
PP58 -> P56 NP68
NP38 -> ART34 N45 PP58             (noun attachment: "a boy with a telescope")
VP28 -> V23 NP35 PP58              (verb attachment)
VP28 -> V23 NP38                   (noun attachment)
S18  -> NP12 VP28                  (both full parses of the sentence)

Bottom Up Parsing - Observations
- Both the noun-attachment and the verb-attachment parses are obtained simply by systematically applying the rules.
- The numbers in the subscripts help in verifying the parse and in extracting chunks from it.

Exercise
For the sentence "The man saw the boy with a telescope" and the grammar given previously, compare the performance of top-down, bottom-up, and top-down chart parsing.