Efficient Divide-and-Conquer Parsing of Practical Context-Free Languages
1 Efficient Divide-and-Conquer Parsing of Practical Context-Free Languages
Jean-Philippe Bernardy, Koen Claessen. LASP Seminar, Nov 14, 2016
2-10 FP workhorse: lists
    1 : 2 : 3 : 4 : []
Summing from the right produces the running totals 0, 4, 7, 9, 10, one after another.
Built-in sequentiality. Bad!
11-15 Exploiting parallelism: sum over a tree
Picture a little computer at each node. The program flows down and the data flows up. Computers of the future will have such a fractal structure.
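The idea of summing over a tree instead of a list can be sketched in a few lines of Haskell (a minimal sketch; the `Tree` type and `treeSum` are illustrative names, not the talk's code):

```haskell
-- Sum over a tree: the two recursive calls are independent of each
-- other, so they can run in parallel (e.g. with par/pseq or the Par
-- monad), giving logarithmic depth on a balanced tree.
data Tree a = Leaf a | Node (Tree a) (Tree a)

treeSum :: Num a => Tree a -> a
treeSum (Leaf x)   = x
treeSum (Node l r) = treeSum l + treeSum r
```

With 1, 2, 3, 4 at the leaves of a balanced tree, `treeSum` returns 10 after only two levels of combination, instead of the four sequential steps of the list version.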
16-19 Chart parsing
Input: 1 she 2 eats 3 a 4 fish 5 with 6 a 7 fork
Grammar G (CNF):
    S → NP VP    VP → VP PP    VP → VP NP    VP → eats
    PP → P NP    NP → Det N    NP → she      P → with
    N → fish     N → fork      Det → a
R_{ij} = all non-terminals generating the input substring w_{i..j}
    R_{i,i+1} = { A | (A → w_i) ∈ G }
    R_{ij} = { A | ∃ k ∈ [i+1 .. j-1], B ∈ R_{ik}, C ∈ R_{kj}, (A → B C) ∈ G }
The chart is filled bottom-up: first the single-word entries (NP over "she", VP over "eats", Det, N, P, ...), then longer spans (NP over "a fish", PP over "with a fork", then VP and S over larger substrings).
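The two chart equations transcribe almost directly into Haskell. The following naive recognizer (a sketch using the slide's example grammar as assumed rule tables, not the talk's implementation) recomputes sub-charts instead of memoizing them, so it is exponential, but it shows the recurrence exactly:

```haskell
import Data.List (nub)

type NT = String

-- A -> w rules of the example grammar
unaryRules :: [(NT, String)]
unaryRules = [("NP","she"),("VP","eats"),("Det","a")
             ,("N","fish"),("P","with"),("N","fork")]

-- A -> B C rules of the example grammar
binaryRules :: [(NT, (NT, NT))]
binaryRules = [("S",("NP","VP")),("VP",("VP","PP")),("VP",("VP","NP"))
              ,("PP",("P","NP")),("NP",("Det","N"))]

-- chart w i j: the non-terminals generating words i..j-1 (0-based),
-- i.e. the R_{ij} of the slides
chart :: [String] -> Int -> Int -> [NT]
chart w i j
  | j == i + 1 = [a | (a,t) <- unaryRules, t == w !! i]
  | otherwise  = nub [a | k <- [i+1 .. j-1]
                        , b <- chart w i k
                        , c <- chart w k j
                        , (a,(b',c')) <- binaryRules, b == b', c == c']
```

Here `chart (words "she eats a fish with a fork") 0 7` contains `"S"`, so the sentence is recognized.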
20-22 Parsing Specification (1)
Structure on sets of non-terminals:
    x + y = x ∪ y
    x · y = { N | ∃ A ∈ x, B ∈ y, (N → A B) ∈ G }
Structure on matrices:
    (A + B)_{ij} = A_{ij} + B_{ij}
    (A · B)_{ij} = Σ_k A_{ik} · B_{kj}
Is (·) associative? (It is not: grammar combination gives only a non-associative ring-like structure.)
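This structure on sets of non-terminals can be sketched on plain lists (the `plus`/`times` names and the two-rule grammar fragment are my assumptions, chosen to match the example grammar), and the sketch also shows concretely why associativity fails:

```haskell
import Data.List (nub, union)

type NT = String

-- A -> B C rules of an assumed two-rule grammar fragment
rules :: [(NT, (NT, NT))]
rules = [("S",("NP","VP")),("NP",("Det","N"))]

-- x + y = x `union` y
plus :: [NT] -> [NT] -> [NT]
plus = union

-- x . y = { N | A in x, B in y, (N -> A B) in G }
times :: [NT] -> [NT] -> [NT]
times xs ys = nub [n | (n,(a,b)) <- rules, a `elem` xs, b `elem` ys]
```

Here `(["Det"] `times` ["N"]) `times` ["VP"]` yields `["S"]` (Det and N combine to NP, which combines with VP), but `["Det"] `times` (["N"] `times` ["VP"])` yields `[]`: multiplication is not associative, which is exactly why the derivation below has to be careful about the order of combination.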
23 Parsing Specification (2)
Find the smallest R such that
    R = I(w) + R · R                              (1)
    (I(w))_{i,i+1} = { A | (A → w_i) ∈ G }        (2)
24-27 Chart parsing as Divide and Conquer (figure-only slides)
28 Efficiency
Space usage: quadratic in the length of the input string.
Runtime: cubic in the length of the input string; the combination operator alone takes cubic time.
29 A sparse matrix
(The filled chart for "she eats a fish with a fork": most cells are empty.)
Sparsity bound:
    #A ≤ α · Σ_{(i,j) ∈ dom(A)} 1/(j-i)²
30 Cheap combination
The square to fill is sparse, so filling it should be quick: good space usage, good time usage.
31-35 How much work? (Figure-only slides: the recursion tree of combination steps, with nodes labelled S and P, built up frame by frame.)
36-39 Deriving an Efficient Transitive Closure Algorithm
Problem: find R such that R = R·R + W.
Split W and R into blocks (primes mark the unknown closures):
    W = [ A  X ]        R = [ A'  X' ]
        [ 0  B ]            [ 0   B' ]
Expanding R = R·R + W:
    [ A' X' ] = [ A' X' ] · [ A' X' ] + [ A  X ]
    [ 0  B' ]   [ 0  B' ]   [ 0  B' ]   [ 0  B ]
which gives
    A' = A'·A' + A
    X' = A'·X' + X'·B' + X
    B' = B'·B' + B
The diagonal equations are transitive closures of half-size matrices; the off-diagonal equation for X' is a new kind of problem.
40-46 Deriving Efficient Chart Concatenation
Problem: find Y such that Y = A·Y + Y·B + X; write the solution as Y = V(A, X, B).
Split into blocks (A and B are upper triangular):
    Y = [ Y11 Y12 ]   X = [ X11 X12 ]   A = [ A11 A12 ]   B = [ B11 B12 ]
        [ Y21 Y22 ]       [ X21 X22 ]       [ 0   A22 ]       [ 0   B22 ]
Expanding Y = A·Y + Y·B + X block by block:
    Y11 = A11·Y11 + A12·Y21 + Y11·B11 + X11
    Y12 = A11·Y12 + A12·Y22 + Y11·B12 + Y12·B22 + X12
    Y21 =           A22·Y21 + Y21·B11 + X21
    Y22 =           A22·Y22 + Y21·B12 + Y22·B22 + X22
Each equation has the same shape as the original problem, so:
    Y11 = V(A11, X11 + A12·Y21, B11)
    Y12 = V(A11, X12 + A12·Y22 + Y11·B12, B22)
    Y21 = V(A22, X21, B11)
    Y22 = V(A22, X22 + Y21·B12, B22)
Compute Y21 first, then Y11 and Y22, then Y12. No circular dependencies! Done!
47-53 Valiant's algorithm for transitive closure
    Y = V(A, X, B)
Pictured on the block matrix
    [ A11 A12 | X11 X12 ]
    [      A22 | X21 X22 ]
    [          | B11 B12 ]
    [          |      B22 ]
the quadrants of X are replaced by the corresponding quadrants of Y in dependency order: first Y21, then Y11, then Y22, and finally Y12.
54 Haskell Implementation: Sparse Matrix Structure

    import Prelude (Eq (..))

    class RingLike a where
      zero :: a
      (+)  :: a -> a -> a
      (·)  :: a -> a -> a

    data M a = Q (M a) (M a) (M a) (M a)
             | Z
             | One a

    q Z Z Z Z = Z
    q a b c d = Q a b c d

    one x = if x == zero then Z else One x
55 Haskell Implementation: the algorithm

    instance (Eq a, RingLike a) => RingLike (M a) where ...

    v :: (Eq a, RingLike a) => M a -> M a -> M a -> M a
    v a Z b = Z
    v Z (One x) Z = One x
    v (Q a11 a12 Z a22) (Q x11 x12 x21 x22) (Q b11 b12 Z b22) =
        q y11 y12 y21 y22
      where
        y21 = v a22 x21 b11
        y11 = v a11 (x11 + a12 · y21) b11
        y22 = v a22 (x22 + y21 · b12) b22
        y12 = v a11 (x12 + a12 · y22 + y11 · b12) b22
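Filling in the elided matrix instance, the slides' code can be made runnable. The following self-contained sketch is my reconstruction, not the talk's verbatim code: the ring operations are renamed `plus`/`times` to avoid clashing with the Prelude, the quad-tree `RingLike (M a)` instance is supplied, and the scalar ring is instantiated with sets of non-terminals for an assumed toy grammar S → A B, B → C D:

```haskell
import Data.List (nub, sort)

class Eq a => RingLike a where
  zero  :: a
  plus  :: a -> a -> a
  times :: a -> a -> a

data M a = Q (M a) (M a) (M a) (M a) | Z | One a
  deriving (Eq, Show)

-- smart constructors that keep the representation sparse
q :: M a -> M a -> M a -> M a -> M a
q Z Z Z Z = Z
q a b c d = Q a b c d

one :: RingLike a => a -> M a
one x = if x == zero then Z else One x

instance RingLike a => RingLike (M a) where
  zero = Z
  plus Z m = m
  plus m Z = m
  plus (One x) (One y) = one (plus x y)
  plus (Q a b c d) (Q a' b' c' d') =
    q (plus a a') (plus b b') (plus c c') (plus d d')
  plus _ _ = error "shape mismatch"
  times Z _ = Z
  times _ Z = Z
  times (One x) (One y) = one (times x y)
  times (Q a b c d) (Q a' b' c' d') =
    q (plus (times a a') (times b c')) (plus (times a b') (times b d'))
      (plus (times c a') (times d c')) (plus (times c b') (times d d'))
  times _ _ = error "shape mismatch"

-- v a x b solves y = a·y + y·b + x for upper-triangular a, b
v :: RingLike a => M a -> M a -> M a -> M a
v _ Z _ = Z
v Z (One x) Z = One x
v (Q a11 a12 Z a22) (Q x11 x12 x21 x22) (Q b11 b12 Z b22) =
    q y11 y12 y21 y22
  where
    y21 = v a22 x21 b11
    y11 = v a11 (plus x11 (times a12 y21)) b11
    y22 = v a22 (plus x22 (times y21 b12)) b22
    y12 = v a11 (plus (plus x12 (times a12 y22)) (times y11 b12)) b22
v _ _ _ = error "shape mismatch"

-- sets of non-terminals (kept sorted and duplicate-free) as scalars
newtype NTs = NTs [String] deriving (Eq, Show)

grammar :: [(String, (String, String))]   -- assumed A -> B C rules
grammar = [("S",("A","B")), ("B",("C","D"))]

instance RingLike NTs where
  zero = NTs []
  plus (NTs x) (NTs y) = NTs (sort (nub (x ++ y)))
  times (NTs x) (NTs y) =
    NTs (sort (nub [n | (n,(a,b)) <- grammar, a `elem` x, b `elem` y]))

-- a 4-position chart split in two: the left half has A over word 1,
-- the right half has D over word 3, the middle square has C over word 2
leftC, rightC, mid :: M NTs
leftC  = Q Z (One (NTs ["A"])) Z Z
rightC = Q Z (One (NTs ["D"])) Z Z
mid    = Q Z Z (One (NTs ["C"])) Z
```

`v leftC mid rightC` fills the middle square: C·D gives B over words 2-3, then A·B gives S over the whole input, yielding `Q Z (One (NTs ["S"])) (One (NTs ["C"])) (One (NTs ["B"]))`.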
56-60 Recursion in the grammar
A list-like recursive rule (e.g. T → TitlePage T) is bad: the combination step has a lot of work to do (at least linear), because the AST is a list.
Solution: encode the repetition of TitlePage items in binary (next slides).
61-64 Binary encoding of lists: idea
Instead of parsing the list L of Y's as a linear chain, introduce doubling non-terminals Y0, Y1, Y2, Y3, ..., where each Y(i+1) covers two Y(i)'s. The parse tree becomes (roughly) balanced, with logarithmic depth.
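The effect of the encoding can be sketched generically: instead of a right-nested chain, combine list elements as a balanced tree, mirroring the Y0/Y1/Y2/... doubling non-terminals (`balanced` is an illustrative helper, not from the talk):

```haskell
-- Build a logarithmic-depth combination of a non-empty list: each
-- level halves the list, just as Y(i+1) covers two Y(i)'s.
balanced :: (a -> a -> a) -> [a] -> a
balanced _ [x] = x
balanced f xs  = f (balanced f l) (balanced f r)
  where (l, r) = splitAt (length xs `div` 2) xs
```

For an associative operation the result is unchanged (`balanced (+) [1..8]` is still the sum, 36), but the combination tree now has depth O(log n) instead of O(n), which is what keeps the conquer step cheap.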
65 Summary
Valiant (1975) does parsing using matrix multiplication; this yields the most efficient known CF recognition algorithm: O(n^2.376). [1]
The very same algorithm yields parsing of good inputs in O(n): the conquer step costs O(log² n) instead of O(n³).
Simple, effective, flexible algorithm. Implemented in BNFC: push-button technology.
It is fast: incremental parsing of an 8000-line program takes less than 1 millisecond.
[1] The complexity of the Coppersmith-Winograd matrix-multiplication algorithm.
66 Perspective
Works with any non-associative ring-like structure: the same algorithm supports probabilistic parsing, context-sensitive parsing, etc. (To be efficient we need a cutoff heuristic, though.)
Maybe suitable for deep learning? High-level cells are activated (much more) seldom.
67-68 BONUS: Incremental computation (two figure-only slides)
More informationDesign and Analysis of Algorithms
CSE 101, Winter 2018 Design and Analysis of Algorithms Lecture 4: Divide and Conquer (I) Class URL: http://vlsicad.ucsd.edu/courses/cse101-w18/ Divide and Conquer ( DQ ) First paradigm or framework DQ(S)
More informationHarvard CS 121 and CSCI E-207 Lecture 9: Regular Languages Wrap-Up, Context-Free Grammars
Harvard CS 121 and CSCI E-207 Lecture 9: Regular Languages Wrap-Up, Context-Free Grammars Salil Vadhan October 2, 2012 Reading: Sipser, 2.1 (except Chomsky Normal Form). Algorithmic questions about regular
More informationMore Completeness, conp, FNP,etc. CS254 Chris Pollett Oct 30, 2006.
More Completeness, conp, FNP,etc. CS254 Chris Pollett Oct 30, 2006. Outline A last complete problem for NP conp, conp NP Function problems MAX-CUT A cut in a graph G=(V,E) is a set of nodes S used to partition
More informationPCFGs 2 L645 / B659. Dept. of Linguistics, Indiana University Fall PCFGs 2. Questions. Calculating P(w 1m ) Inside Probabilities
1 / 22 Inside L645 / B659 Dept. of Linguistics, Indiana University Fall 2015 Inside- 2 / 22 for PCFGs 3 questions for Probabilistic Context Free Grammars (PCFGs): What is the probability of a sentence
More informationIn this chapter, we explore the parsing problem, which encompasses several questions, including:
Chapter 12 Parsing Algorithms 12.1 Introduction In this chapter, we explore the parsing problem, which encompasses several questions, including: Does L(G) contain w? What is the highest-weight derivation
More informationSTA 414/2104: Machine Learning
STA 414/2104: Machine Learning Russ Salakhutdinov Department of Computer Science! Department of Statistics! rsalakhu@cs.toronto.edu! http://www.cs.toronto.edu/~rsalakhu/ Lecture 9 Sequential Data So far
More informationAlgorithm Design CS 515 Fall 2015 Sample Final Exam Solutions
Algorithm Design CS 515 Fall 2015 Sample Final Exam Solutions Copyright c 2015 Andrew Klapper. All rights reserved. 1. For the functions satisfying the following three recurrences, determine which is the
More informationProbabilistic Context-Free Grammars and beyond
Probabilistic Context-Free Grammars and beyond Mark Johnson Microsoft Research / Brown University July 2007 1 / 87 Outline Introduction Formal languages and Grammars Probabilistic context-free grammars
More informationMAKING A BINARY HEAP
CSE 101 Algorithm Design and Analysis Miles Jones mej016@eng.uc sd.edu Office 4208 CSE Building Lecture 19: Divide and Conquer Design examples/dynamic Programming MAKING A BINARY HEAP Base case. Break
More informationAlgorithms for Scientific Computing
Algorithms for Scientific Computing Parallelization using Space-Filling Curves Michael Bader Technical University of Munich Summer 2016 Parallelisation using Space-filling Curves Problem setting: mesh
More informationComputational Linguistics II: Parsing
Computational Linguistics II: Parsing Left-corner-Parsing Frank Richter & Jan-Philipp Söhn fr@sfs.uni-tuebingen.de, jp.soehn@uni-tuebingen.de January 15th, 2007 Richter/Söhn (WS 2006/07) Computational
More information