NLU: Semantic parsing


NLU: Semantic parsing. Adam Lopez (slide credits: Chris Dyer, Nathan Schneider). March 30, 2018. School of Informatics, University of Edinburgh. alopez@inf.ed.ac.uk

Recall: meaning representations. "Sam likes Casey" → likes(sam, casey). "Anna's dog Mr. PeanutButter misses her" → misses(mrpb, anna) ∧ dog(mrpb). "Kim likes everyone" → ∀x. likes(kim, x).

Recall: meaning representations. Meaning representations are verifiable, unambiguous, and canonical. Predicate-argument structure is a good match for FOL, as are structures with argument-like elements (e.g. NPs). Determiners, quantifiers (e.g. "everyone", "anyone"), and negation can be expressed in first-order logic.
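For example (my own illustration, reusing the predicates above, not from the slides): "Kim doesn't like anyone" can be rendered as ¬∃x. likes(kim, x), and "Every dog misses Anna" as ∀x. (dog(x) → misses(x, anna)).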

Representing the meaning of arbitrary NL is hard. Much of natural language is unverifiable, ambiguous, non-canonical. What is the finite set of predicates? When do two words map to the same predicate? When are homonyms mapped to different predicates? Can we decompose lexical meanings into a finite set of predicates (possibly composed)? What is the finite set of constants?

Easier: representing a closed domain. Example: GEOQUERY dataset. "What states border Texas?" → λx. state(x) ∧ borders(x, texas). "What is the largest state?" → argmax(λx. state(x), λx. size(x)). Semantic parsing is the problem of returning a logical form for an input natural language sentence.
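To make the closed-domain setting concrete, here is a minimal sketch (my own illustration, not the GEOQUERY tools) of executing such logical forms against a toy database; the facts and helper names are hypothetical, areas approximate:

# Toy database for a GEOQUERY-style domain (hypothetical facts, approximate areas in km^2).
STATES = {"texas", "oklahoma", "new mexico"}
BORDERS = {("oklahoma", "texas"), ("new mexico", "texas")}
SIZE = {"texas": 695662, "oklahoma": 181037, "new mexico": 314917}

def state(x):
    return x in STATES

def borders(x, y):
    return (x, y) in BORDERS

# "What states border Texas?"  ~  lambda x. state(x) AND borders(x, texas)
print([x for x in STATES if state(x) and borders(x, "texas")])   # oklahoma, new mexico (in set order)

# "What is the largest state?"  ~  argmax(lambda x. state(x), lambda x. size(x))
def argmax(domain_pred, score):
    return max((x for x in STATES if domain_pred(x)), key=score)

print(argmax(state, lambda x: SIZE[x]))   # texas

Executing the logical form against the database is what makes the representation verifiable in this closed domain.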

Pairs of NL sentences with structured MR can be collected. Example: IFTTT dataset (Quirk et al. 2015).

WikiTableQuestions

Similar information powers Google's knowledge graph.

Viewing the MR as a string, semantic parsing is just conditional language modeling: p(y_1, …, y_|y| | x_1, …, x_|x|). Model this using standard sequence models, with one additional element (shown in the figure below).
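Spelled out (my notation, not from the slides), the conditional language model factorises by the chain rule:

p(y_1, …, y_|y| | x_1, …, x_|x|) = ∏_{t=1}^{|y|} p(y_t | y_1, …, y_{t-1}, x_1, …, x_|x|)

Each factor is a softmax over output symbols, computed from a decoder state that summarises the prefix y_1, …, y_{t-1} together with an attention-weighted summary of the encoded input (the additional element illustrated in the figure below).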

[Figure, built up across several slides: an encoder-decoder with attention translating "Ich möchte ein Bier" into "I'd like a beer". A bidirectional RNN reads the input words x_1 … x_4; at each position the forward and backward hidden states are concatenated into f_i, and the f_i are collected as the columns of a matrix F ∈ R^{2n × |f|}. At each output step the decoder attends over the columns of F, recording an attention history a_1^T, …, a_5^T, and emits the next word: "I'd", "like", "a", "beer", then STOP.]
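A minimal sketch of the attention step in that figure, assuming dot-product attention and random vectors in place of real learned RNN states (the actual model may use a different scoring function, e.g. a small MLP):

import numpy as np

rng = np.random.default_rng(0)
n, src_len = 4, 4            # hidden size per direction; |f| = 4 source words ("Ich möchte ein Bier")

# Encoder output: each column is f_i = [forward h_i ; backward h_i], so F is in R^(2n x |f|).
F = rng.standard_normal((2 * n, src_len))

# One decoder step: s_t is the current decoder state (random here; really produced by the decoder RNN).
s_t = rng.standard_normal(2 * n)

scores = F.T @ s_t                              # one score per source position
a_t = np.exp(scores) / np.exp(scores).sum()     # softmax: attention weights over the source words
c_t = F @ a_t                                   # context vector: weighted summary of the encoder states

print(a_t.sum(), c_t.shape)                     # ~1.0, (8,)

The context vector c_t is fed into the prediction of the next output word, and a_t is appended to the attention history.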

[Figure: English-French and English-German examples.]

Since logical forms are tree-like, we can use a tree-structured (tree-LSTM) decoder.
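A minimal illustration of that idea (my own sketch; the data structure and traversal are illustrative, not the exact decoder from the slides): a logical form can be stored as a tree and emitted top-down, one node at a time.

# Logical form for "What states border Texas?" as a tree of (label, children) pairs (lambda binder omitted).
lf = ("and", [("state", [("x", [])]),
              ("borders", [("x", []), ("texas", [])])])

def top_down(node):
    """Yield node labels in the order a top-down (pre-order) tree decoder would emit them."""
    label, children = node
    yield label
    for child in children:
        yield from top_down(child)

print(list(top_down(lf)))   # ['and', 'state', 'x', 'borders', 'x', 'texas']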

Model learns to translate words into predicates they invoke

Abstract meaning representation (AMR): PENMAN notation. Each node in the graph has a variable, and nodes are labeled with concepts: d / dog means d is an instance of dog. The edges (ARG0 and ARG1) are relations. "The dog is eating a bone":

(e / eat-01
   :ARG0 (d / dog)
   :ARG1 (b / bone))

Abstract meaning representation (AMR): Reentrancy. What if something is referenced multiple times? To express this in PENMAN format, repeat the variable; we call this a reentrancy. "The dog wants to eat the bone" (notice that dog now has two incoming roles):

(w / want-01
   :ARG0 (d / dog)
   :ARG1 (e / eat-01
            :ARG0 d
            :ARG1 (b / bone)))
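To see what the reentrancy does to the graph, here is a minimal sketch (my own encoding, not an official AMR library) of the same AMR as a list of triples, counting incoming role edges per variable:

from collections import Counter

# "The dog wants to eat the bone" as (source, relation, target) triples.
triples = [
    ("w", "instance", "want-01"),
    ("d", "instance", "dog"),
    ("e", "instance", "eat-01"),
    ("b", "instance", "bone"),
    ("w", "ARG0", "d"),
    ("w", "ARG1", "e"),
    ("e", "ARG0", "d"),    # the repeated variable d is the reentrancy
    ("e", "ARG1", "b"),
]

# Count incoming role edges per variable (instance triples just label nodes with concepts).
incoming = Counter(tgt for _, rel, tgt in triples if rel != "instance")
print(incoming)            # d gets two incoming roles: Counter({'d': 2, 'e': 1, 'b': 1})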

Coreference Bob wants Anna to give him a job. Q: who does him refer to?

Coreference Charles just graduated, and now Bob wants Anna to give him a job. Q: who does him refer to?

Metonymy. "Westminster decided to distribute funds throughout England, Wales, Northern Ireland, and Scotland." A literal reading gives decided(westminster, …); the intended reading is decided(parliament, …).

Implicature. "That cake looks delicious." What Rogelio was really thinking: "I would like a piece of that cake."

Even more phenomena: abbreviations (e.g. National Health Service = NHS), nicknames (JLaw = Jennifer Lawrence), metaphor ("crime is a virus infecting the city"), time expressions and change of state, and many others.

Summary. In many cases, meaning representation can be captured in first-order logic. But wide-coverage meaning representation is hard; closed domains are easier, and can sometimes be harvested automatically. This leads to a proliferation of domain-specific MRs. A trainable alternative to compositional approaches: encoder-decoder neural models. The encoder and decoder can be mixed and matched: RNN, top-down tree RNN, etc. These work well on small, closed domains if we have training data, but there are many unsolved phenomena/problems in semantics.