Parsing. Based on presentations from Chris Manning's course on Statistical Parsing (Stanford)
1 Parsing Based on presentations from Chris Manning's course on Statistical Parsing (Stanford)
2 A parse tree:
(S (N John)
   (VP (V hit)
       (NP (D the) (N ball))))
3 Levels of analysis
- Morphology/lexical (POS tagging (morpho-syntactic), WSD): operates on words
- Shallow syntax parsing (NER, MWE): operates on words and phrases
- Full syntax parsing: operates on the sentence, producing parsed trees
- SRL: operates on parsed trees
- Full semantic parsing: operates on parsed trees
4 Buffalo
5 Parsing is a difficult task!
6 Parsing is a difficult task! ^ ^ so excited! #Khaleesi #minikhaleesi #GoT
7 Telegramese
A style of writing that leaves out words that are not important (The Telegraph Office: How to Write Telegrams Properly).
"The enemy has not yet been met or even seen on account of the entanglements thrown up during the night." becomes "Enemy unmet unseen account entanglements upthrown night."
"Sell 10,000 bushels of May wheat at $1.45 ¼." becomes "Sell barney stoke" [code]
8 Ambiguities
- POS tags (e.g., "books": a verb or a noun?)
- Compositional expression meanings (e.g., "he spilled the beans about his past")
- Syntactic attachments (V N PP) (e.g., "I ate my spaghetti with a fork")
- Global semantic ambiguities (e.g., "bear left at zoo")
Usually, ambiguities in one layer may be resolved in upper layers.
9 Ambiguities Fed raises interest rates 0.5% in effort to control inflation.
10 Motivation Parsing may help to resolve ambiguities Parsing is a step toward understanding the sentence completely Was shown to improve the results of several NLP applications: MT (Chiang, 2005) Question answering (Hovy et al., 2000)
11 Grammar
Rules:          Lexicon:
S → NP VP       NN → interest
NP → (DT) NN    NNS → rates
NP → NN NNS     NNS → raises
NP → NNP        VBP → interest
VP → V NP       VBZ → rates
Minimal grammar on the "Fed raises" sentence: 36 parses. Simple 10-rule grammar: 592 parses. Real-size broad-coverage grammar: millions of parses.
12 Size of grammar
Fewer rules: limits unlikely parses, but the grammar is not robust.
More rules: parses more sentences, but sentences end up with ever more parses.
13 Statistical parsing Statistical parsing can help select the rules that best fit the input sentence, allowing the grammar to contain more rules.
14 Treebanks
The Penn Treebank Project (PTB): Arabic, English, Chinese, Persian, French, ...
( (S (NP-SBJ (DT The) (NN move))
     (VP (VBD followed)
         (NP (NP (DT a) (NN round))
             (PP (IN of)
                 (NP (NP (JJ similar) (NNS increases))
                     (PP (IN by) (NP (JJ other) (NNS lenders)))
                     (PP (IN against) (NP (NNP Arizona) (JJ real) (NN estate) (NNS loans))))))
     (, ,) ...
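Treebank trees are just bracketed strings, so they are easy to load programmatically. A minimal sketch, assuming the NLTK library is available (the tree here is a shortened version of the PTB example above):

# A minimal sketch (assumes NLTK): load a PTB-style bracketed tree
# and recover the grammar rules it uses.
from nltk import Tree

ptb = """(S (NP-SBJ (DT The) (NN move))
            (VP (VBD followed)
                (NP (NP (DT a) (NN round))
                    (PP (IN of) (NP (JJ similar) (NNS increases)))))
            (. .))"""

t = Tree.fromstring(ptb)
print(t.label())       # S
print(t.leaves())      # ['The', 'move', 'followed', 'a', 'round', ...]
for rule in t.productions():
    print(rule)        # S -> NP-SBJ VP ., NP-SBJ -> DT NN, ...

This is how the "frequencies and distributional information" of the next slide are collected in practice: counting the productions over all trees in the treebank.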
15 Advantages of treebanks Reusability of the labor Broad coverage Frequencies and distributional information A way to evaluate systems
16 Types of parsing Constituency parsing Dependency parsing
17 Constituency parsing Constituents are defined based on linguistic rules (phrases) Constituents are recursive (NP may contain NP as part of its sub-constituents) Different linguists may define constituents differently
18 Dependency parsing Dependency structure shows which words depend on (modify or are arguments of) which other words
19 Parsing We want to run a grammar backwards to find possible structures for a sentence Parsing can be viewed as a search problem We can do this bottom-up or top-down We search by building a search tree which is distinct from the parse tree
20 Context-free (phrase structure) grammar (CFG)
G = (T, N, S, R)
T is a set of terminals
N is a set of variables (nonterminals)
S is the start symbol (one of the variables)
R is a set of rules (productions) of the form X → γ, where X is a variable and γ is a sequence of terminals and variables (possibly empty)
A grammar G generates a language L = L(G)
21 Probabilistic or stochastic context-free grammars (PCFG)
G = (T, N, S, R, P)
R: rules X → γ
P: P(R) gives the probability of each rule X → γ ∈ R, such that ∀X ∈ N: Σ_γ P(X → γ) = 1
A grammar G generates a language L with Σ_{w ∈ T*} P(w) = 1
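A minimal sketch of the first constraint as code: a PCFG is a rule table, and the probabilities of all rules with the same left side must sum to one. The toy grammar and its probabilities here are illustrative, not from the slides:

from collections import defaultdict

# Toy PCFG: (lhs, rhs) -> probability. Numbers are made up for illustration.
pcfg = {
    ("S",  ("NP", "VP")): 1.0,
    ("NP", ("Det", "N")): 0.6,
    ("NP", ("NP", "PP")): 0.1,
    ("NP", ("Papa",)):    0.3,
    ("VP", ("V", "NP")):  0.8,
    ("VP", ("VP", "PP")): 0.2,
}

def is_normalized(rules, tol=1e-9):
    """Check that for every variable X, sum over gamma of P(X -> gamma) = 1."""
    totals = defaultdict(float)
    for (lhs, _rhs), p in rules.items():
        totals[lhs] += p
    return all(abs(total - 1.0) < tol for total in totals.values())

print(is_normalized(pcfg))   # True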
22 Soundness and completeness A parser is sound if every parse it returns is valid/correct A parser terminates if it is guaranteed to not go off into an infinite loop A parser is complete if for any given grammar and sentence, it is sound, produces every valid parse for that sentence, and terminates (For many purposes, we settle for sound but incomplete parsers: e.g., probabilistic parsers that return a k-best list.)
23 Top down parsing Top-down parsing is goal directed A top-down parser starts with a list of constituents to be built. The top-down parser rewrites the goals in the goal list by matching one against the left side of the grammar rules, and expanding it with the right side, attempting to match the sentence to be derived If a goal can be rewritten in several ways, then there is a choice of which rule to apply (search problem) Can use depth-first or breadth-first search, and goal ordering
24 Top down parsing
25 Disadvantages of top down A top-down parser will do badly if there are many different rules for the same variable. Consider if there are 600 rules for S, 599 of which start with NP, but one of which starts with V, and the sentence starts with V Useless work: expands things that are possible top-down but not there Repeated work
26 Repeated work
27 Bottom up chart parsing Bottom-up parsing is data directed The initial goal list of a bottom-up parser is the string to be parsed. If a sequence in the goal list matches the right side of a rule, then this sequence may be replaced by the left side of the rule Parsing is finished when the goal list contains just the start category If the right side of several rules match the goal list, then there is a choice of which rule to apply (search problem) The standard presentation is as shift-reduce parsing
28 Shift-reduce parsing: cats scratch people with claws
(each line shows the configuration after the action on the right)
cats scratch people with claws   SHIFT
N scratch people with claws      REDUCE
NP scratch people with claws     REDUCE
NP scratch people with claws     SHIFT
NP V people with claws           REDUCE
NP V people with claws           SHIFT
NP V N with claws                REDUCE
NP V NP with claws               REDUCE
NP V NP with claws               SHIFT
NP V NP P claws                  REDUCE
NP V NP P claws                  SHIFT
NP V NP P N                      REDUCE
NP V NP P NP                     REDUCE
NP V NP PP                       REDUCE
NP VP                            REDUCE
S                                REDUCE
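The trace above can be reproduced with a deliberately naive parser. A minimal sketch, greedy rather than searching (it always reduces when some rule's right side matches the top of the stack, otherwise shifts; a real shift-reduce parser must search over these choices, which is exactly the search problem named on the previous slide):

# Greedy shift-reduce sketch for the trace above. Words are tagged as
# they are shifted, folding the lexical REDUCE steps into SHIFT.
RULES = [
    ("NP", ("N",)),
    ("NP", ("Det", "N")),
    ("PP", ("P", "NP")),
    ("VP", ("V", "NP", "PP")),
    ("S",  ("NP", "VP")),
]
LEXICON = {"cats": "N", "scratch": "V", "people": "N",
           "with": "P", "claws": "N"}

def shift_reduce(words):
    stack, buffer = [], list(words)
    while buffer or len(stack) > 1:
        for lhs, rhs in RULES:                    # try to REDUCE
            if tuple(stack[-len(rhs):]) == rhs:
                stack[-len(rhs):] = [lhs]
                break
        else:                                     # no rule matched: SHIFT
            if not buffer:
                return stack                      # stuck: no parse
            stack.append(LEXICON[buffer.pop(0)])
        print(stack, buffer)
    return stack

print(shift_reduce("cats scratch people with claws".split()))  # ['S']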
29 Disadvantages of bottom up
Useless work: locally possible, but globally impossible.
Inefficient when there is great lexical ambiguity (grammar-driven control might help here).
Repeated work: anywhere there is common substructure.
30 Parsing as search
Left-recursive structures must be found, not predicted.
Doing these things doesn't fix the repeated-work problem: both TD and BU parsers can (and frequently do) do work exponential in the sentence length on NLP problems.
Grammar transformations can fix both left-recursion and epsilon productions. Then you parse the same language but with different trees (and fix them post hoc).
31 Dynamic programming
Rather than doing parsing-as-search, we do parsing as dynamic programming.
Examples: CYK (bottom-up), Earley (top-down).
It solves the problem of doing repeated work.
32 Notation
w_1n = w_1 … w_n = the word sequence from 1 to n
w_ab = the subsequence w_a … w_b
N^j_ab = the variable N^j dominating w_a … w_b
We'll write P(N^i → ζ^j) to mean P(N^i → ζ^j | N^i)
We'll want to calculate max_t P(t) over parse trees t dominating w_ab
33 Tree and sentence probabilities
P(t): the probability of a tree is the product of the probabilities of the rules used to generate it.
P(w_1n): the probability of the sentence is the sum of the probabilities of the trees which have that sentence as their yield:
P(w_1n) = Σ_t P(w_1n, t) = Σ_t P(t), where t ranges over the parses of w_1n
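A minimal sketch of these two definitions, with trees as nested tuples and a toy rule-probability table of my own (not from the slides):

from math import prod

def rules_used(tree):
    """Yield (lhs, rhs) for every rule application in a tree.
    A tree is (label, child, ...); a leaf child is a plain string."""
    label, *children = tree
    yield (label, tuple(c if isinstance(c, str) else c[0] for c in children))
    for c in children:
        if not isinstance(c, str):
            yield from rules_used(c)

def p_tree(t, P):                  # P(t): product over the rules used
    return prod(P[r] for r in rules_used(t))

def p_sentence(parses, P):         # P(w_1n): sum over trees yielding w_1n
    return sum(p_tree(t, P) for t in parses)

P = {("S", ("NP", "VP")): 1.0, ("NP", ("Papa",)): 1.0, ("VP", ("slept",)): 1.0}
t = ("S", ("NP", "Papa"), ("VP", "slept"))
print(p_tree(t, P))                # 1.0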
34 A treebank tree
(ROOT (S (NP (N cats))
         (VP (V scratch)
             (NP (N people))
             (PP (P with) (NP (N claws))))))
35 CYK (Cocke-Younger-Kasami) algorithm
A bottom-up parser using dynamic programming.
Assume the grammar is in Chomsky normal form: A → B C.
Maintain N n×n tables µ (N = number of variables, n = length of input).
Fill out the table entries by induction.
36 After binarization
[Figure: the treebank tree for "cats scratch people with claws" with the ternary VP rule split into binary rules]
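The CNF assumption means an n-ary rule like the VP → V NP PP above must be split into binary rules. A minimal right-binarization sketch; the intermediate-symbol naming scheme is just illustrative:

# Split A -> B C D ... into binary rules using fresh intermediate
# symbols such as "@A->B" (the naming convention here is made up).
def binarize(lhs, rhs):
    new_rules = []
    while len(rhs) > 2:
        head, rest = rhs[0], rhs[1:]
        intermediate = f"@{lhs}->{head}"
        new_rules.append((lhs, (head, intermediate)))
        lhs, rhs = intermediate, rest
    new_rules.append((lhs, tuple(rhs)))
    return new_rules

print(binarize("VP", ("V", "NP", "PP")))
# [('VP', ('V', '@VP->V')), ('@VP->V', ('NP', 'PP'))]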
37 Can 1 you 2 book 3 EL AL 4 flights 5 ?
w1,1  w1,2  w1,3  w1,4  w1,5
      w2,2  w2,3  w2,4  w2,5
            w3,3  w3,4  w3,5
                  w4,4  w4,5
                        w5,5
38 CYK base case
Consider the input strings of length one (i.e., each individual word w_i) and P(A → w_i).
Since the grammar is in Chomsky normal form: A ⇒* w_i iff A → w_i.
So µ[i, i, A] = P(A → w_i).
39 CYK base case: Can 1 you 2 book 3 EL AL 4 flights 5 ?
[Chart figure: the diagonal cells hold the lexical probabilities, e.g., Aux and Noun analyses of "Can"]
40 CYK recursive case
For strings of words of length > 1: A ⇒* w_ij iff there is at least one rule A → B C where B derives the first k words of the span (w_i through w_{i-1+k}) and C derives the remaining ones (w_{i+k} through w_j).
For each nonterminal A, choose the max among all possibilities:
µ[i, j, A] = max over rules A → B C and split points k of µ[i, i-1+k, B] · µ[i+k, j, C] · P(A → B C)
41 CYK termination
The probability of the max-prob parse is µ[1, n, S]: the cell covering the whole input (w1,5 in the example chart, which holds S).
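Putting the base case, recursive case, and termination together, a minimal sketch of probabilistic CYK; the `unary`/`binary` encoding of the CNF grammar is my own choice, and the toy grammar at the bottom is illustrative:

from collections import defaultdict

def cyk(words, unary, binary, start="S"):
    """unary: word -> {A: P(A -> word)}; binary: list of (A, B, C, P(A -> B C)).
    mu[(i, j)][A] = best probability of A over words i..j (1-based, inclusive)."""
    n = len(words)
    mu = defaultdict(dict)
    for i, w in enumerate(words, start=1):       # base case: mu[i, i, A]
        mu[(i, i)] = dict(unary.get(w, {}))
    for span in range(2, n + 1):                 # recursive case, short spans first
        for i in range(1, n - span + 2):
            j = i + span - 1
            for k in range(1, span):             # B covers w_i .. w_{i-1+k}
                for A, B, C, p in binary:
                    cand = (mu[(i, i - 1 + k)].get(B, 0.0)
                            * mu[(i + k, j)].get(C, 0.0) * p)
                    if cand > mu[(i, j)].get(A, 0.0):
                        mu[(i, j)][A] = cand
    return mu[(1, n)].get(start, 0.0)            # termination: mu[1, n, S]

# Toy CNF grammar, just to exercise the function:
unary = {"cats": {"NP": 0.5}, "sleep": {"VP": 0.4}}
binary = [("S", "NP", "VP", 1.0)]
print(cyk("cats sleep".split(), unary, binary))  # 0.2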
42 Top down: Earley algorithm
Finds constituents and partial constituents in the input. A → B C . D E is partial: only the first half of the A has been found.
Combining: the item A → B C . D E over (i, j) plus a completed D over (j, k) yields A → B C D . E over (i, k).
43 Earley algorithm
Proceeds incrementally, left-to-right. Before it reads word 5, it has already built all hypotheses that are consistent with the first 4 words. It reads word 5 and attaches it to the immediately preceding hypotheses; this might yield new constituents that are then attached to the hypotheses immediately preceding them. Uses dynamic programming.
44 Example (grammar)
ROOT → S
S → NP VP
NP → Papa | Det N | NP PP
VP → V NP | VP PP
PP → P NP
N → caviar | spoon
V → ate
P → with
Det → the | a
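Before walking through the trace, a minimal recognizer sketch of the algorithm for this grammar. Items are (start, lhs, rhs, dot), matching the "i LHS matched . remainder" notation used in the trace below; the function and variable names are mine:

GRAMMAR = {
    "ROOT": [("S",)],
    "S":    [("NP", "VP")],
    "NP":   [("Papa",), ("Det", "N"), ("NP", "PP")],
    "VP":   [("V", "NP"), ("VP", "PP")],
    "PP":   [("P", "NP")],
    "N":    [("caviar",), ("spoon",)],
    "V":    [("ate",)],
    "P":    [("with",)],
    "Det":  [("the",), ("a",)],
}

def earley(words, grammar=GRAMMAR, root="ROOT"):
    chart = [set() for _ in range(len(words) + 1)]
    for prod in grammar[root]:                         # initialize column 0
        chart[0].add((0, root, prod, 0))
    for j in range(len(words) + 1):
        agenda = list(chart[j])
        while agenda:
            i, lhs, rhs, dot = agenda.pop()
            if dot < len(rhs) and rhs[dot] in grammar:       # PREDICT
                for prod in grammar[rhs[dot]]:
                    item = (j, rhs[dot], prod, 0)
                    if item not in chart[j]:                 # no duplicate goals
                        chart[j].add(item); agenda.append(item)
            elif dot < len(rhs):                             # SCAN a terminal
                if j < len(words) and words[j] == rhs[dot]:
                    chart[j + 1].add((i, lhs, rhs, dot + 1))
            else:                                            # ATTACH (complete)
                for i2, lhs2, rhs2, dot2 in list(chart[i]):  # customers end at i
                    if dot2 < len(rhs2) and rhs2[dot2] == lhs:
                        item = (i2, lhs2, rhs2, dot2 + 1)
                        if item not in chart[j]:
                            chart[j].add(item); agenda.append(item)
    return any(lhs == root and dot == len(rhs)
               for _, lhs, rhs, dot in chart[len(words)])

print(earley("Papa ate the caviar with a spoon".split()))   # True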
45-105 Example (trace): 0 Papa 1 ate 2 the 3 caviar 4 with 5 a 6 spoon 7
An entry "i X → α . β" in column j stands for the item (i, j, X → α . β): an X was started at position i, and α has been matched up to position j.

Column 0:
initialize: 0 ROOT → . S
predict the kind of S we are looking for: 0 S → . NP VP
predict the kind of NP we are looking for (actually we'll look for 3 kinds: any of the 3 will do): 0 NP → . Det N, 0 NP → . NP PP, 0 NP → . Papa
predict the kind of Det we are looking for (2 kinds): 0 Det → . the, 0 Det → . a
predict the kind of NP we're looking for, but we were already looking for these, so don't add duplicate goals! Note that this happened when we were processing a left-recursive rule.

scan Papa: the desired word is in the input! 0 NP → Papa . ; the scans for "the" and "a" fail.

Column 1:
attach the newly created NP (which starts at 0) to its customers (incomplete constituents that end at 0 and have NP after the dot): 0 S → NP . VP, 0 NP → NP . PP
predict: 1 VP → . V NP, 1 VP → . VP PP
predict: 1 PP → . P NP
predict: 1 V → . ate
predict: 1 P → . with

scan ate: success! 1 V → ate . ; the scan for "with" fails.

Column 2:
attach: 1 VP → V . NP
predict: 2 NP → . Det N, 2 NP → . NP PP, 2 NP → . Papa
predict (these next few steps should look familiar): 2 Det → . the, 2 Det → . a

scan the: success! 2 Det → the . (the scan for Papa fails, since Papa is not the next word)

Column 3:
attach: 2 NP → Det . N
predict: 3 N → . caviar, 3 N → . spoon

scan caviar: success! 3 N → caviar .

Column 4:
attach: 2 NP → Det N .
attach (again!): 1 VP → V NP . , 2 NP → NP . PP
attach (again!): 0 S → NP VP . , 1 VP → VP . PP
predict: 4 PP → . P NP
attach (again!): 0 ROOT → S .
predict: 4 P → . with

scan with: success! 4 P → with .

Column 5:
attach: 4 PP → P . NP
predict: 5 NP → . Det N, 5 NP → . NP PP, 5 NP → . Papa
predict: 5 Det → . the, 5 Det → . a

scan a: success! 5 Det → a .

Column 6:
attach: 5 NP → Det . N
predict: 6 N → . caviar, 6 N → . spoon

scan spoon: success! 6 N → spoon .

Column 7:
attach: 5 NP → Det N .
attach: 4 PP → P NP . , 5 NP → NP . PP
attach: 2 NP → NP PP . , 1 VP → VP PP .
predict: 7 PP → . P NP
attach: 1 VP → V NP . , 2 NP → NP . PP
attach: 0 S → NP VP . , 1 VP → VP . PP
predict: 7 P → . with
attach: 0 ROOT → S . (the sentence is accepted; note that both VP analyses, VP → VP PP and VP → V NP with the PP inside the NP, are in the chart)