Introduction to Semantic Parsing with CCG


Kilian Evang
Heinrich-Heine-Universität Düsseldorf

Table of contents
1 Introduction to CCG
  Categorial Grammar (CG)
  Combinatory Categorial Grammar (CCG)
2 Zettlemoyer and Collins (2005)
  Candidate lexicon generation
  Features, model, learning
  Results
3 References

Plain old context-free grammars

[Tree diagrams: CFG parses of "Mary sings", "Mary kicks John", and "Mary kicks John happily"]

Rules:
S → NP VP
NP → N
AdvP → Adv
VP → V
VP → V NP
VP → VP AdvP

Categorial grammars

[Tree diagrams: CG derivations of "Mary sings", "Mary kicks John", and "Mary kicks John happily" using the categories NP, S\NP, (S\NP)/NP, and (S\NP)\(S\NP)]

Rules:
X/Y Y ⇒ X
Y X\Y ⇒ X

Notation

Mary sings:
Mary := NP, sings := S\NP
Mary sings ⇒ S   (<0)

Mary kicks John:
kicks := (S\NP)/NP, John := NP
kicks John ⇒ S\NP   (>0)
Mary [kicks John] ⇒ S   (<0)

Mary kicks John happily:
happily := (S\NP)\(S\NP)
[kicks John] happily ⇒ S\NP   (<0)
Mary [kicks John happily] ⇒ S   (<0)

Rules:
X/Y Y ⇒ X   (>0, forward application)
Y X\Y ⇒ X   (<0, backward application)
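The two application rules can be sketched in a few lines of code. The category classes and helper functions below are my own illustration, not part of the slides:

```python
# A minimal sketch of CG categories and the two application rules.
from dataclasses import dataclass
from typing import Union

@dataclass(frozen=True)
class Cat:
    """An atomic category such as NP or S."""
    name: str

@dataclass(frozen=True)
class Slash:
    """A functor category: result/argument ('/') or result\\argument ('\\')."""
    result: Union['Cat', 'Slash']
    direction: str  # '/' seeks its argument to the right, '\\' to the left
    argument: Union['Cat', 'Slash']

def forward_apply(left, right):
    """X/Y  Y  =>  X   (>0)"""
    if isinstance(left, Slash) and left.direction == '/' and left.argument == right:
        return left.result
    return None

def backward_apply(left, right):
    """Y  X\\Y  =>  X   (<0)"""
    if isinstance(right, Slash) and right.direction == '\\' and right.argument == left:
        return right.result
    return None

NP, S = Cat('NP'), Cat('S')
VP = Slash(S, '\\', NP)      # S\NP, e.g. "sings"
TV = Slash(VP, '/', NP)      # (S\NP)/NP, e.g. "kicks"

# "Mary kicks John": kicks + John by >0, then Mary + [kicks John] by <0
assert forward_apply(TV, NP) == VP
assert backward_apply(NP, VP) == S
```

Because the dataclasses are frozen, structural equality of categories comes for free, which is all the two rules need.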

Semantics

Mary := NP : mary
kicks := (S\NP)/NP : λx.λy.kicks(y, x)
John := NP : john

kicks John ⇒ S\NP : λy.kicks(y, john)   (>0)
Mary [kicks John] ⇒ S : kicks(mary, john)   (<0)

Rules:
X/Y : f   Y : g ⇒ X : f(g)   (>0, forward application)
Y : g   X\Y : f ⇒ X : f(g)   (<0, backward application)
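The lambda terms above translate directly into closures; the string encoding of logical forms below is my own illustration:

```python
# The semantic side of the derivation, with lambda terms as Python closures.
mary, john = 'mary', 'john'
kicks = lambda x: lambda y: f'kicks({y}, {x})'   # λx.λy.kicks(y, x)

vp = kicks(john)   # forward application (>0): λy.kicks(y, john)
s = vp(mary)       # backward application (<0): kicks(mary, john)
assert s == 'kicks(mary, john)'
```

Note that the syntactic combination order and the order of function application line up exactly, which is the point of the transparent syntax-semantics interface.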

Coordination of Functors

Mary sings and dances:
Mary := NP : mary
sings := S\NP : λx.sings(x)
dances := S\NP : λx.dances(x)
and := ((S\NP)\(S\NP))/(S\NP) : λf.λg.λx.g(x) ∧ f(x)

and dances ⇒ (S\NP)\(S\NP) : λg.λx.g(x) ∧ dances(x)   (>0)
sings [and dances] ⇒ S\NP : λx.sings(x) ∧ dances(x)   (<0)
Mary [sings and dances] ⇒ S : sings(mary) ∧ dances(mary)   (<0)

Coordination of Arguments

Mary and John sing:
Mary := NP : mary
John := NP : john
sing := S\NP : λx.sings(x)
and := ((S/(S\NP))\(S/(S\NP)))/(S/(S\NP)) : λf.λg.λh.g(h) ∧ f(h)

Mary ⇒ S/(S\NP) : λf.f(mary)   (T>)
John ⇒ S/(S\NP) : λf.f(john)   (T>)
and John ⇒ (S/(S\NP))\(S/(S\NP)) : λg.λh.g(h) ∧ h(john)   (>0)
Mary [and John] ⇒ S/(S\NP) : λh.h(mary) ∧ h(john)   (<0)
[Mary and John] sing ⇒ S : sings(mary) ∧ sings(john)   (>0)

Rules:
Y : g ⇒ T/(T\Y) : λf.f(g)   (T>, forward type raising)
Y : g ⇒ T\(T/Y) : λf.f(g)   (T<, backward type raising)
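Type raising is just the higher-order function λg.λf.f(g). As a sketch (the conjunction helper is my rendering of the 'and' entry above, not from the slides):

```python
# Type raising and argument coordination as Python closures.
type_raise = lambda g: lambda f: f(g)            # Y : g  =>  T/(T\Y) : λf.f(g)
sings = lambda x: f'sings({x})'                  # S\NP : λx.sings(x)
conj = lambda f: lambda g: lambda h: f'{g(h)} ∧ {f(h)}'  # 'and' for raised NPs

raised_mary = type_raise('mary')                 # S/(S\NP) : λf.f(mary)
raised_john = type_raise('john')
subject = conj(raised_john)(raised_mary)         # λh.h(mary) ∧ h(john)
assert subject(sings) == 'sings(mary) ∧ sings(john)'
```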

How CGs differ from CFGs
1 informative labels (categories)
2 general rules (combinators)
3 transparent syntax-semantics interface (syntactic dependency = function application)

Combinatory Categorial Grammar
- a type of CG developed by Mark Steedman [1]
- additional combinators
- elegant treatment of incomplete constituents:
  - object relative clauses
  - non-canonical coordination

Object relative clauses

the car that Sue bought:
the := NP/N, car := N, that := (N\N)/(S/NP), Sue := NP, bought := (S\NP)/NP

Sue ⇒ S/(S\NP)   (T>)
Sue bought ⇒ S/NP   (>1)
that [Sue bought] ⇒ N\N   (>0)
car [that Sue bought] ⇒ N   (<0)
the [car that Sue bought] ⇒ NP   (>0)

Rules:
X/Y Y/Z ⇒ X/Z   (>1, forward harmonic composition)
Y\Z X\Y ⇒ X\Z   (<1, backward harmonic composition)
X/Y Y\Z ⇒ X\Z   (>1×, forward crossing composition)
Y/Z X\Y ⇒ X/Z   (<1×, backward crossing composition)
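On the semantic side, forward harmonic composition is ordinary function composition. A sketch with invented logical-form strings (the slide gives only the syntactic derivation):

```python
# >1 as function composition: if type-raised Sue is λf.f(sue) and bought is
# λx.λy.bought(y, x), then "Sue bought" denotes λx.bought(sue, x).
compose = lambda f, g: lambda x: f(g(x))        # X/Y Y/Z => X/Z (>1)

sue = lambda f: f('sue')                        # type-raised: λf.f(sue)
bought = lambda x: lambda y: f'bought({y}, {x})'

sue_bought = compose(sue, bought)               # S/NP : λx.bought(sue, x)
assert sue_bought('the_car') == 'bought(sue, the_car)'
```

This is why the incomplete constituent "Sue bought" can feed the relative pronoun: it already denotes a one-place predicate over the missing object.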

Non-canonical coordination

John made and Mary ate the marmalade:
John := NP, made := (S\NP)/NP, and := ((S/NP)\(S/NP))/(S/NP), Mary := NP, ate := (S\NP)/NP, the := NP/N, marmalade := N

John ⇒ S/(S\NP)   (T>)
John made ⇒ S/NP   (>1)
Mary ⇒ S/(S\NP)   (T>)
Mary ate ⇒ S/NP   (>1)
and [Mary ate] ⇒ (S/NP)\(S/NP)   (>0)
[John made] [and Mary ate] ⇒ S/NP   (<0)
the marmalade ⇒ NP   (>0)
[John made and Mary ate] [the marmalade] ⇒ S   (>0)

What makes CCG suitable for semantic parsing?
- few labels
- few rules
- flexible constituency: syntax adapts to semantics
- clear syntax-semantics interface
⇒ there is not much to learn about syntax, so we can focus on semantics

Zettlemoyer and Collins (2005)
Luke S. Zettlemoyer and Michael Collins (2005): Learning to Map Sentences to Logical Form: Structured Classification with Probabilistic Categorial Grammars [3]
The first paper to apply CCG to semantic parser learning; many followed.

Characteristics of ZC05's parser
NLU representation: list of words
MRL: GeoQuery and Jobs (in lambda format)
data: sentences (NLUs) + logical forms (MRs)
evaluation metric: exact match (modulo renaming of variables)
lexicon representation: CCG categories with lambda terms

Characteristics of ZC05's parser (cont.)
candidate lexicon generation: using templates (GENLEX)
parsing algorithm: CKY-style (chart)
features: only lexical features
model: log-linear
training algorithm: alternates between GENLEX, pruning, and parameter estimation

Characteristics of ZC05's parser (cont.)
experimental setup: Geo880 and Jobs640 datasets, standard splits

Example parse

What states border Texas:
What := (S/(S\NP))/N : λf.λg.λx.f(x) ∧ g(x)
states := N : λx.state(x)
border := (S\NP)/NP : λx.λy.borders(y, x)
Texas := NP : texas

What states ⇒ S/(S\NP) : λg.λx.state(x) ∧ g(x)   (>0)
border Texas ⇒ S\NP : λy.borders(y, texas)   (>0)
[What states] [border Texas] ⇒ S : λx.state(x) ∧ borders(x, texas)   (>0)
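The example parse's lambda terms can again be evaluated as closures; the string logical forms are my own encoding:

```python
# Composing the lexical lambda terms in the same order as the derivation.
state = lambda x: f'state({x})'                          # states
borders = lambda x: lambda y: f'borders({y}, {x})'       # border
what = lambda f: lambda g: lambda x: f'{f(x)} ∧ {g(x)}'  # What

what_states = what(state)          # S/(S\NP) : λg.λx.state(x) ∧ g(x)
border_texas = borders('texas')    # S\NP : λy.borders(y, texas)
lf = what_states(border_texas)     # λx.state(x) ∧ borders(x, texas)
assert lf('x') == 'state(x) ∧ borders(x, texas)'
```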

Learning lexicon entries

Input (a training example):
(1) a. What states border Texas
    b. λx.state(x) ∧ borders(x, texas)

Desired output:
What := (S/(S\NP))/N : λf.λg.λx.f(x) ∧ g(x)
states := N : λx.state(x)
border := (S\NP)/NP : λx.λy.borders(y, x)
Texas := NP : texas

Hard-coded cross-domain lexicon entries
What := (S/(S\NP))/N : λf.λg.λx.f(x) ∧ g(x)

Lexical entries for names in the domain database
Texas := NP : texas
Michigan := NP : michigan
Seattle := NP : seattle

Candidate lexicon generation with GENLEX

Input (a training example):
(2) a. Utah borders Idaho
    b. borders(utah, idaho)

GENLEX output (among others):
Utah := NP : utah
Idaho := NP : idaho
borders := (S\NP)/NP : λx.λy.borders(y, x)
borders := NP : idaho
borders Utah := (S\NP)/NP : λx.λy.borders(y, x)

Spurious items need to be pruned later.

GENLEX

GENLEX(S, L) = {x := y | x ∈ W(S), y ∈ C(L)}

where
- S is a sentence
- L is its logical form
- W(S) is the set of all subsequences of words in S
- C maps L to a set of categories through rules (see next slide)
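The cross product itself is easy to sketch. Z&C derive C(L) from the logical form with templates; in the sketch below the candidate categories are simply passed in, and word spans are taken to be contiguous subsequences:

```python
# A sketch of GENLEX's cross product of word spans and candidate categories.
def word_spans(sentence):
    """W(S): all contiguous word subsequences of the sentence."""
    words = sentence.split()
    return {' '.join(words[i:j])
            for i in range(len(words))
            for j in range(i + 1, len(words) + 1)}

def genlex(sentence, categories):
    """{x := y | x in W(S), y in C(L)} as (span, category) pairs."""
    return {(span, cat) for span in word_spans(sentence) for cat in categories}

entries = genlex('Utah borders Idaho',
                 {'NP : utah', 'NP : idaho',
                  r'(S\NP)/NP : λx.λy.borders(y, x)'})
# 6 spans x 3 categories = 18 candidate entries, most of them spurious
assert ('borders', 'NP : idaho') in entries
assert len(entries) == 18
```

Even on a three-word sentence the candidate set is mostly noise, which is why the learning loop keeps only the entries used in successful parses.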

GENLEX (cont.)
[Table of GENLEX category-creation rules; not preserved in this transcription]

Probabilistic CCG parsing
- mapping a sentence S to (T, L), where T is a parse tree (derivation) and L is a logical form
- even with a perfect lexicon, one S can have multiple (T, L) pairs (lexical ambiguity, attachment ambiguity, spurious ambiguity, ...)
- solution: define a probability distribution P(L, T | S)

Probabilistic CCG parsing (cont.)
- here: features = lexicon entries used in a parse
- the feature function f maps parses (L, T, S) to a feature vector that indicates for each lexicon entry how often it is used in T
- assume a parameter vector θ that scores individual lexical features, giving the log-linear model

  P(L, T | S; θ) = exp(f(L, T, S) · θ) / Σ_(L',T') exp(f(L', T', S) · θ)

- given S, return the parse (L, T) with the highest P(L, T | S; θ)
- problem: how to find a good lexicon and a good θ?
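A minimal sketch of the log-linear model, with each parse represented as a sparse count vector of lexicon entries; the parses and weights below are invented for illustration:

```python
import math

def parse_probs(parses, theta):
    """P(L,T|S; theta) ∝ exp(f(L,T,S)·theta), normalized over all parses of S."""
    scores = [math.exp(sum(theta.get(entry, 0.0) * count
                           for entry, count in feats.items()))
              for feats in parses]
    z = sum(scores)
    return [s / z for s in scores]

# Two competing parses that differ in which lexicon entries they use
parses = [{'Texas := NP : texas': 1, 'border := (S\\NP)/NP': 1},
          {'border Texas := (S\\NP)/NP': 1}]
theta = {'Texas := NP : texas': 0.1, 'border := (S\\NP)/NP': 0.1,
         'border Texas := (S\\NP)/NP': 0.01}
probs = parse_probs(parses, theta)
assert abs(sum(probs) - 1.0) < 1e-9
assert probs[0] > probs[1]   # the parse with higher-weighted entries wins
```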

Bird's-eye view of the learning algorithm

Input: initial lexicon Λ0, training examples E = {(Si, Li) : i = 1...n}
Initialization: θ0 := vector with 0.1 for lexicon entries in Λ0, 0.01 for all other lexicon entries
for t = 1...T do                                ▷ epochs
    for i = 1...n do                            ▷ lexical generation
        λ := Λ0 ∪ GENLEX(Si, Li)
        π := PARSE(Si, Li, λ, θt−1)
        λi := the set of lexicon entries in π
    end for
    Λt := Λ0 ∪ λ1 ∪ ... ∪ λn                    ▷ update the lexicon
    θt := ESTIMATE(Λt, E, θt−1)                 ▷ parameter estimation
end for
Output: lexicon ΛT, parameters θT
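The loop above can be sketched as a Python skeleton. GENLEX, PARSE, and ESTIMATE are supplied by the caller; PARSE here is assumed to return the set of lexicon entries used in the best correct parse of (Si, Li), and all names are mine:

```python
# Skeleton of the alternating lexical-generation / estimation loop.
def learn(initial_lexicon, examples, epochs, genlex, parse, estimate):
    theta = {entry: 0.1 for entry in initial_lexicon}  # others default to 0.01
    lexicon = set(initial_lexicon)
    for _ in range(epochs):
        used = []
        for sentence, logical_form in examples:        # lexical generation
            candidates = set(initial_lexicon) | genlex(sentence, logical_form)
            used.append(parse(sentence, logical_form, candidates, theta))
        lexicon = set(initial_lexicon).union(*used)    # update the lexicon
        theta = estimate(lexicon, examples, theta)     # parameter estimation
    return lexicon, theta

# Tiny illustration with stub components (all invented):
lex, theta = learn(
    {'Texas := NP : texas'},
    [('Utah borders Idaho', 'borders(utah, idaho)')],
    epochs=2,
    genlex=lambda s, l: {'Utah := NP : utah', 'borders := (S\\NP)/NP'},
    parse=lambda s, l, cands, th: {'Utah := NP : utah', 'borders := (S\\NP)/NP'},
    estimate=lambda lx, E, th: {e: 0.1 for e in lx},
)
assert 'Utah := NP : utah' in lex
```

The pruning happens implicitly: only entries used in best parses survive into Λt, so spurious GENLEX output is dropped each epoch.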

Results
[Precision/recall table on Geo880 and Jobs640, comparing Zettlemoyer & Collins [3] with Tang & Mooney [2]; the numbers did not survive transcription]

Some learned lexicon entries
states := N : λx.state(x)
major := N/N : λf.λx.major(x) ∧ f(x)
population := N : λx.population(x)
cities := N : λx.city(x)
rivers := N : λx.river(x)
run through := (S\NP)/NP : λx.λy.traverse(y, x)
the largest := NP/N : λf.arg max(f, λx.size(x))
river := N : λx.river(x)
the highest := NP/N : λf.arg max(f, λx.elev(x))
the longest := NP/N : λf.arg max(f, λx.len(x))

References

[1] Steedman, Mark. 2000. The Syntactic Process. The MIT Press.
[2] Tang, Lappoon R. & Raymond J. Mooney. 2001. Using multiple clause constructors in inductive logic programming for semantic parsing. In Proceedings of the 12th European Conference on Machine Learning.
[3] Zettlemoyer, Luke S. & Michael Collins. 2005. Learning to map sentences to logical form: structured classification with probabilistic categorial grammars. In Proceedings of the Twenty-First Annual Conference on Uncertainty in Artificial Intelligence (UAI-05). AUAI Press.


More information

Quasi-Synchronous Phrase Dependency Grammars for Machine Translation. lti

Quasi-Synchronous Phrase Dependency Grammars for Machine Translation. lti Quasi-Synchronous Phrase Dependency Grammars for Machine Translation Kevin Gimpel Noah A. Smith 1 Introduction MT using dependency grammars on phrases Phrases capture local reordering and idiomatic translations

More information

Maximum Entropy Models for Natural Language Processing

Maximum Entropy Models for Natural Language Processing Maximum Entropy Models for Natural Language Processing James Curran The University of Sydney james@it.usyd.edu.au 6th December, 2004 Overview a brief probability and statistics refresher statistical modelling

More information

CS460/626 : Natural Language

CS460/626 : Natural Language CS460/626 : Natural Language Processing/Speech, NLP and the Web (Lecture 23, 24 Parsing Algorithms; Parsing in case of Ambiguity; Probabilistic Parsing) Pushpak Bhattacharyya CSE Dept., IIT Bombay 8 th,

More information

Marrying Dynamic Programming with Recurrent Neural Networks

Marrying Dynamic Programming with Recurrent Neural Networks Marrying Dynamic Programming with Recurrent Neural Networks I eat sushi with tuna from Japan Liang Huang Oregon State University Structured Prediction Workshop, EMNLP 2017, Copenhagen, Denmark Marrying

More information

A Polynomial Time Algorithm for Parsing with the Bounded Order Lambek Calculus

A Polynomial Time Algorithm for Parsing with the Bounded Order Lambek Calculus A Polynomial Time Algorithm for Parsing with the Bounded Order Lambek Calculus Timothy A. D. Fowler Department of Computer Science University of Toronto 10 King s College Rd., Toronto, ON, M5S 3G4, Canada

More information

Introduction to Semantics. The Formalization of Meaning 1

Introduction to Semantics. The Formalization of Meaning 1 The Formalization of Meaning 1 1. Obtaining a System That Derives Truth Conditions (1) The Goal of Our Enterprise To develop a system that, for every sentence S of English, derives the truth-conditions

More information

Parsing Beyond Context-Free Grammars: Tree Adjoining Grammars

Parsing Beyond Context-Free Grammars: Tree Adjoining Grammars Parsing Beyond Context-Free Grammars: Tree Adjoining Grammars Laura Kallmeyer & Tatiana Bladier Heinrich-Heine-Universität Düsseldorf Sommersemester 2018 Kallmeyer, Bladier SS 2018 Parsing Beyond CFG:

More information

Penn Treebank Parsing. Advanced Topics in Language Processing Stephen Clark

Penn Treebank Parsing. Advanced Topics in Language Processing Stephen Clark Penn Treebank Parsing Advanced Topics in Language Processing Stephen Clark 1 The Penn Treebank 40,000 sentences of WSJ newspaper text annotated with phrasestructure trees The trees contain some predicate-argument

More information

Collaborative topic models: motivations cont

Collaborative topic models: motivations cont Collaborative topic models: motivations cont Two topics: machine learning social network analysis Two people: " boy Two articles: article A! girl article B Preferences: The boy likes A and B --- no problem.

More information

Harvard CS 121 and CSCI E-207 Lecture 9: Regular Languages Wrap-Up, Context-Free Grammars

Harvard CS 121 and CSCI E-207 Lecture 9: Regular Languages Wrap-Up, Context-Free Grammars Harvard CS 121 and CSCI E-207 Lecture 9: Regular Languages Wrap-Up, Context-Free Grammars Salil Vadhan October 2, 2012 Reading: Sipser, 2.1 (except Chomsky Normal Form). Algorithmic questions about regular

More information

Structured Prediction Models via the Matrix-Tree Theorem

Structured Prediction Models via the Matrix-Tree Theorem Structured Prediction Models via the Matrix-Tree Theorem Terry Koo Amir Globerson Xavier Carreras Michael Collins maestro@csail.mit.edu gamir@csail.mit.edu carreras@csail.mit.edu mcollins@csail.mit.edu

More information

CKY & Earley Parsing. Ling 571 Deep Processing Techniques for NLP January 13, 2016

CKY & Earley Parsing. Ling 571 Deep Processing Techniques for NLP January 13, 2016 CKY & Earley Parsing Ling 571 Deep Processing Techniques for NLP January 13, 2016 No Class Monday: Martin Luther King Jr. Day CKY Parsing: Finish the parse Recognizer à Parser Roadmap Earley parsing Motivation:

More information

Outline. Parsing. Approximations Some tricks Learning agenda-based parsers

Outline. Parsing. Approximations Some tricks Learning agenda-based parsers Outline Parsing CKY Approximations Some tricks Learning agenda-based parsers 0 Parsing We need to compute compute argmax d D(x) p θ (d x) Inference: Find best tree given model 1 Parsing We need to compute

More information

Parts 3-6 are EXAMPLES for cse634

Parts 3-6 are EXAMPLES for cse634 1 Parts 3-6 are EXAMPLES for cse634 FINAL TEST CSE 352 ARTIFICIAL INTELLIGENCE Fall 2008 There are 6 pages in this exam. Please make sure you have all of them INTRODUCTION Philosophical AI Questions Q1.

More information

TALP at GeoQuery 2007: Linguistic and Geographical Analysis for Query Parsing

TALP at GeoQuery 2007: Linguistic and Geographical Analysis for Query Parsing TALP at GeoQuery 2007: Linguistic and Geographical Analysis for Query Parsing Daniel Ferrés and Horacio Rodríguez TALP Research Center Software Department Universitat Politècnica de Catalunya {dferres,horacio}@lsi.upc.edu

More information

Natural Language Processing

Natural Language Processing SFU NatLangLab Natural Language Processing Anoop Sarkar anoopsarkar.github.io/nlp-class Simon Fraser University October 9, 2018 0 Natural Language Processing Anoop Sarkar anoopsarkar.github.io/nlp-class

More information

Machine Learning for natural language processing

Machine Learning for natural language processing Machine Learning for natural language processing Classification: Maximum Entropy Models Laura Kallmeyer Heinrich-Heine-Universität Düsseldorf Summer 2016 1 / 24 Introduction Classification = supervised

More information

Parsing. Left-Corner Parsing. Laura Kallmeyer. Winter 2017/18. Heinrich-Heine-Universität Düsseldorf 1 / 17

Parsing. Left-Corner Parsing. Laura Kallmeyer. Winter 2017/18. Heinrich-Heine-Universität Düsseldorf 1 / 17 Parsing Left-Corner Parsing Laura Kallmeyer Heinrich-Heine-Universität Düsseldorf Winter 2017/18 1 / 17 Table of contents 1 Motivation 2 Algorithm 3 Look-ahead 4 Chart Parsing 2 / 17 Motivation Problems

More information

Parsing. Weighted Deductive Parsing. Laura Kallmeyer. Winter 2017/18. Heinrich-Heine-Universität Düsseldorf 1 / 26

Parsing. Weighted Deductive Parsing. Laura Kallmeyer. Winter 2017/18. Heinrich-Heine-Universität Düsseldorf 1 / 26 Parsing Weighted Deductive Parsing Laura Kallmeyer Heinrich-Heine-Universität Düsseldorf Winter 2017/18 1 / 26 Table of contents 1 Idea 2 Algorithm 3 CYK Example 4 Parsing 5 Left Corner Example 2 / 26

More information

Semantics and Pragmatics of NLP Pronouns

Semantics and Pragmatics of NLP Pronouns Semantics and Pragmatics of NLP Pronouns School of Informatics University of Edinburgh Outline Observations About Data 1 Observations of what factors influence the way pronouns get resolved 2 Some algorithms

More information

Model-Theory of Property Grammars with Features

Model-Theory of Property Grammars with Features Model-Theory of Property Grammars with Features Denys Duchier Thi-Bich-Hanh Dao firstname.lastname@univ-orleans.fr Yannick Parmentier Abstract In this paper, we present a model-theoretic description of

More information

Logic and machine learning review. CS 540 Yingyu Liang

Logic and machine learning review. CS 540 Yingyu Liang Logic and machine learning review CS 540 Yingyu Liang Propositional logic Logic If the rules of the world are presented formally, then a decision maker can use logical reasoning to make rational decisions.

More information

Logical Agents (I) Instructor: Tsung-Che Chiang

Logical Agents (I) Instructor: Tsung-Che Chiang Logical Agents (I) Instructor: Tsung-Che Chiang tcchiang@ieee.org Department of Computer Science and Information Engineering National Taiwan Normal University Artificial Intelligence, Spring, 2010 編譯有誤

More information

Artificial Intelligence

Artificial Intelligence CS344: Introduction to Artificial Intelligence Pushpak Bhattacharyya CSE Dept., IIT Bombay Lecture 20-21 Natural Language Parsing Parsing of Sentences Are sentences flat linear structures? Why tree? Is

More information

Dual Decomposition for Natural Language Processing. Decoding complexity

Dual Decomposition for Natural Language Processing. Decoding complexity Dual Decomposition for atural Language Processing Alexander M. Rush and Michael Collins Decoding complexity focus: decoding problem for natural language tasks motivation: y = arg max y f (y) richer model

More information

Computational Linguistics

Computational Linguistics Computational Linguistics Dependency-based Parsing Clayton Greenberg Stefan Thater FR 4.7 Allgemeine Linguistik (Computerlinguistik) Universität des Saarlandes Summer 2016 Acknowledgements These slides

More information

Models of Adjunction in Minimalist Grammars

Models of Adjunction in Minimalist Grammars Models of Adjunction in Minimalist Grammars Thomas Graf mail@thomasgraf.net http://thomasgraf.net Stony Brook University FG 2014 August 17, 2014 The Theory-Neutral CliffsNotes Insights Several properties

More information

564 Lecture 25 Nov. 23, Continuing note on presuppositional vs. nonpresuppositional dets.

564 Lecture 25 Nov. 23, Continuing note on presuppositional vs. nonpresuppositional dets. 564 Lecture 25 Nov. 23, 1999 1 Continuing note on presuppositional vs. nonpresuppositional dets. Here's the argument about the nonpresupp vs. presupp analysis of "every" that I couldn't reconstruct last

More information

Natural Language Processing. Lecture 13: More on CFG Parsing

Natural Language Processing. Lecture 13: More on CFG Parsing Natural Language Processing Lecture 13: More on CFG Parsing Probabilistc/Weighted Parsing Example: ambiguous parse Probabilistc CFG Ambiguous parse w/probabilites 0.05 0.05 0.20 0.10 0.30 0.20 0.60 0.75

More information

CS : Speech, NLP and the Web/Topics in AI

CS : Speech, NLP and the Web/Topics in AI CS626-449: Speech, NLP and the Web/Topics in AI Pushpak Bhattacharyya CSE Dept., IIT Bombay Lecture-17: Probabilistic parsing; insideoutside probabilities Probability of a parse tree (cont.) S 1,l NP 1,2

More information

Probabilistic Graphical Models: Lagrangian Relaxation Algorithms for Natural Language Processing

Probabilistic Graphical Models: Lagrangian Relaxation Algorithms for Natural Language Processing Probabilistic Graphical Models: Lagrangian Relaxation Algorithms for atural Language Processing Alexander M. Rush (based on joint work with Michael Collins, Tommi Jaakkola, Terry Koo, David Sontag) Uncertainty

More information

LC Graphs for the Lambek calculus with product

LC Graphs for the Lambek calculus with product LC Graphs for the Lambek calculus with product Timothy A D Fowler July 18, 2007 1 Introduction The Lambek calculus, introduced in Lambek (1958), is a categorial grammar having two variants which will be

More information

CS626: NLP, Speech and the Web. Pushpak Bhattacharyya CSE Dept., IIT Bombay Lecture 14: Parsing Algorithms 30 th August, 2012

CS626: NLP, Speech and the Web. Pushpak Bhattacharyya CSE Dept., IIT Bombay Lecture 14: Parsing Algorithms 30 th August, 2012 CS626: NLP, Speech and the Web Pushpak Bhattacharyya CSE Dept., IIT Bombay Lecture 14: Parsing Algorithms 30 th August, 2012 Parsing Problem Semantics Part of Speech Tagging NLP Trinity Morph Analysis

More information

Introduction to Semantics. Common Nouns and Adjectives in Predicate Position 1

Introduction to Semantics. Common Nouns and Adjectives in Predicate Position 1 Common Nouns and Adjectives in Predicate Position 1 (1) The Lexicon of Our System at Present a. Proper Names: [[ Barack ]] = Barack b. Intransitive Verbs: [[ smokes ]] = [ λx : x D e. IF x smokes THEN

More information

Multi-Component Word Sense Disambiguation

Multi-Component Word Sense Disambiguation Multi-Component Word Sense Disambiguation Massimiliano Ciaramita and Mark Johnson Brown University BLLIP: http://www.cog.brown.edu/research/nlp Ciaramita and Johnson 1 Outline Pattern classification for

More information

Computational Linguistics. Acknowledgements. Phrase-Structure Trees. Dependency-based Parsing

Computational Linguistics. Acknowledgements. Phrase-Structure Trees. Dependency-based Parsing Computational Linguistics Dependency-based Parsing Dietrich Klakow & Stefan Thater FR 4.7 Allgemeine Linguistik (Computerlinguistik) Universität des Saarlandes Summer 2013 Acknowledgements These slides

More information

Parsing with Context-Free Grammars

Parsing with Context-Free Grammars Parsing with Context-Free Grammars CS 585, Fall 2017 Introduction to Natural Language Processing http://people.cs.umass.edu/~brenocon/inlp2017 Brendan O Connor College of Information and Computer Sciences

More information

On rigid NL Lambek grammars inference from generalized functor-argument data

On rigid NL Lambek grammars inference from generalized functor-argument data 7 On rigid NL Lambek grammars inference from generalized functor-argument data Denis Béchet and Annie Foret Abstract This paper is concerned with the inference of categorial grammars, a context-free grammar

More information

Algorithms for Syntax-Aware Statistical Machine Translation

Algorithms for Syntax-Aware Statistical Machine Translation Algorithms for Syntax-Aware Statistical Machine Translation I. Dan Melamed, Wei Wang and Ben Wellington ew York University Syntax-Aware Statistical MT Statistical involves machine learning (ML) seems crucial

More information

CMPT-825 Natural Language Processing. Why are parsing algorithms important?

CMPT-825 Natural Language Processing. Why are parsing algorithms important? CMPT-825 Natural Language Processing Anoop Sarkar http://www.cs.sfu.ca/ anoop October 26, 2010 1/34 Why are parsing algorithms important? A linguistic theory is implemented in a formal system to generate

More information

Statistical Methods for NLP

Statistical Methods for NLP Statistical Methods for NLP Stochastic Grammars Joakim Nivre Uppsala University Department of Linguistics and Philology joakim.nivre@lingfil.uu.se Statistical Methods for NLP 1(22) Structured Classification

More information

Predicate Logic. Xinyu Feng 09/26/2011. University of Science and Technology of China (USTC)

Predicate Logic. Xinyu Feng 09/26/2011. University of Science and Technology of China (USTC) University of Science and Technology of China (USTC) 09/26/2011 Overview Predicate logic over integer expressions: a language of logical assertions, for example x. x + 0 = x Why discuss predicate logic?

More information

Natural Language Processing 1. lecture 7: constituent parsing. Ivan Titov. Institute for Logic, Language and Computation

Natural Language Processing 1. lecture 7: constituent parsing. Ivan Titov. Institute for Logic, Language and Computation atural Language Processing 1 lecture 7: constituent parsing Ivan Titov Institute for Logic, Language and Computation Outline Syntax: intro, CFGs, PCFGs PCFGs: Estimation CFGs: Parsing PCFGs: Parsing Parsing

More information

Quasi-Second-Order Parsing for 1-Endpoint-Crossing, Pagenumber-2 Graphs

Quasi-Second-Order Parsing for 1-Endpoint-Crossing, Pagenumber-2 Graphs Quasi-Second-Order Parsing for 1-Endpoint-Crossing, Pagenumber-2 Graphs Junjie Cao, Sheng Huang, Weiwei Sun, Xiaojun Wan Institute of Computer Science and Technology Peking University September 5, 2017

More information

LEARNING COMPOSITIONALITY

LEARNING COMPOSITIONALITY LARNING COMPOSIIONALIY Ignacio Cases with Christopher Potts Stanford University Meaning and Semantic Representation Compositional Semantics Semantic Parsing xecution Message Semantic Representation Denotation

More information