Module Title: Informatics 2A
Exam Diet (Dec/April/Aug): Aug 2015

Brief notes on answers:

1. (a) Lexing: The input is program text [1]. The output is a stream of lexemes paired with their lexical classes [1]. For example, the expression xyz 22 would result in a two-lexeme stream xyz, 22 with lexical classes VAR of variables and INT of integer literals respectively [1].

Parsing: The input is a stream of lexical classes (lexemes are good enough) [1]. The output is a syntax tree [1]. For example, VAR INT might produce a tree such as (EXP (FUN (VAR)) (ARGS (ARG INT) (ARGS ɛ))), expressing the syntax of an expression formed as a (possibly iterated) function application [1].

Type-checking: The input is a syntax tree [1]. The output is a yes/no response as to whether the program type-checks, together with diagnostic error messages if not [1]. In the example above, it would be checked that the variable xyz has a type of the form Integer -> τ, for some τ, allowing the function application to be formed.

[Only minimal details are required. The examples need to be faithful to the idea of lexing/parsing/type-checking, not necessarily faithful to any particular language.]

(b) E.g., compilation (/interpretation/evaluation/execution) [1].

2. (a) Let M1 = (Q1, s1, F1, δ1) and M2 = (Q2, s2, F2, δ2). The desired DFA has

    Q = Q1 × Q2
    s = (s1, s2)
    F = F1 × (Q2 − F2)

and a transition (q1, q2) → (q1', q2') on symbol a whenever q1 → q1' on a in δ1 and q2 → q2' on a in δ2.

[4: 1 mark for each component]

(b) The resulting DFA: [4 marks: deduct 1 per mistake]

(c) No, it is not possible: context-free languages are not closed under complement [1]. An example is Σ* − {xx | x ∈ Σ*}, which is context-free but whose complement is not [1]. [The first mark needs more context than just the answer "no".] [The counterexample is bookwork.]
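The product construction in answer 2(a) can be sketched in code. This is an illustration only; the dictionary-based DFA encoding and the two example machines are my own, not the exam's:

```python
# Sketch of the product construction from answer 2(a): states Q1 x Q2,
# start (s1, s2), accepting F1 x (Q2 - F2), componentwise transitions.
# The tuple/dict encoding of a DFA here is my own, not the exam's.

def product_dfa(M1, M2, alphabet):
    """Each machine is (states, start, accepting, delta) with delta[(q, a)] = q'."""
    Q1, s1, F1, d1 = M1
    Q2, s2, F2, d2 = M2
    Q = {(q1, q2) for q1 in Q1 for q2 in Q2}
    s = (s1, s2)
    F = {(q1, q2) for q1 in F1 for q2 in Q2 - F2}  # accepted by M1 but not M2
    delta = {((q1, q2), a): (d1[q1, a], d2[q2, a])
             for (q1, q2) in Q for a in alphabet}
    return Q, s, F, delta

def accepts(dfa, word):
    Q, s, F, delta = dfa
    q = s
    for a in word:
        q = delta[q, a]
    return q in F

# Example: M1 accepts even-length words, M2 accepts words ending in 'a';
# the product then accepts even-length words NOT ending in 'a'.
M1 = ({0, 1}, 0, {0}, {(0, 'a'): 1, (0, 'b'): 1, (1, 'a'): 0, (1, 'b'): 0})
M2 = ({'e', 'f'}, 'e', {'f'},
      {('e', 'a'): 'f', ('e', 'b'): 'e', ('f', 'a'): 'f', ('f', 'b'): 'e'})
D = product_dfa(M1, M2, {'a', 'b'})
print(accepts(D, "ab"))  # True
print(accepts(D, "aa"))  # False
```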

3. (a) The PDA execution:

    state    stack    unread input
    q1       ɛ        aaab
    q1       a        aab
    q1       aa       ab
    q1       aaa      b
    q2       aaa      b
    q2       aa       ɛ
    q2       a        ɛ
    q2       ɛ        ɛ

[8 marks: in principle 1 per step]

(b) The language is: { a^n b^m | 1 ≤ n, 0 ≤ m ≤ n } [2 marks: award 1 if idea right but some error in detail]

4. (a) Zipf's law is an empirical law stating that in a typical corpus of text, the number of occurrences of a given word type is (approximately) inversely proportional to its frequency rank. [Up to 2 marks]

(b) Suppose k is a constant such that the word with frequency rank r occurs around k/r times. We are given that k + k/2 + k/3 + k/4 = (25/12)k = 2500, whence k = 1200. To a reasonable approximation, 1/21 + ... + 1/30 = 10/25 = 2/5, so the required number of tokens is around 2k/5 = 480. [Up to 3 marks.]

(c) To compute the transition probability from the POS X to the POS Y, look at all occurrences in the corpus of tokens tagged as X, and calculate what proportion of these are followed by a token tagged as Y. To compute the emission probability for POS X as word w, look at all occurrences of tokens tagged as X, and calculate what proportion of these are the word w. [1 mark for each of these; 1 further mark for clarity of explanation.]

(d) We would expect higher proportional accuracy for the infrequent word types. This is because most infrequent words have only one possible POS tag, whereas several of the most frequent words are grammatical function words with several possible tags. [2 marks. Any other logically reasonable point will be accepted here.]

5. The CYK parse chart is:

              boat    man         watches      fish
    boat      NP_1    NP_1        NP_2, S_1    NP_5, S_2
    man               NP_1, V_1   NP_1, VP_1   NP_2, VP_1, S_1
    watches                       NP_1, V_1    NP_1, VP_1
    fish                                       NP_1, V_1

(The subscripts count the derivations of each non-terminal for that span.) [Up to 7 marks for the non-terminal entries; up to 3 marks for the numbers.]

6. (a) The production S → S b is left recursive [1].
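The arithmetic in answer 4(b) can be checked mechanically. A small sketch; the use of exact rationals is my own choice, while the figures 2500, 1200 and 480 come from the model answer:

```python
from fractions import Fraction

# Check of the arithmetic in answer 4(b): ranks 1-4 account for
# k*(1 + 1/2 + 1/3 + 1/4) = (25/12)k = 2500 tokens, so k = 1200.
harmonic_1_to_4 = sum(Fraction(1, r) for r in range(1, 5))
assert harmonic_1_to_4 == Fraction(25, 12)

k = Fraction(2500) / harmonic_1_to_4
assert k == 1200

# Ranks 21..30: the answer approximates the sum of 1/r by 10 * (1/25) = 2/5.
tail = sum(Fraction(1, r) for r in range(21, 31))
print(float(tail))      # about 0.397, close to 2/5
print(float(k * tail))  # about 477, close to the answer's 480
```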

(b) For example:

    S  → S1 T
    S1 → a | ( S S )
    T  → ɛ | b T

[5 marks: in proportion]

(c) First(S)  = { a, ( }
    First(S1) = { a, ( }
    First(T)  = { ɛ, b }

[3: in principle 1 mark each]

(d) Follow(S)  = { a, (, ), $ }
    Follow(S1) = { a, b, (, ), $ }
    Follow(T)  = { a, (, ), $ }

[3: in principle 1 mark each]

(e) Parse table:

          a           b          (              )        $
    S     S → S1 T               S → S1 T
    S1    S1 → a                 S1 → ( S S )
    T     T → ɛ       T → b T    T → ɛ          T → ɛ    T → ɛ

[6 marks: deduct 1 per mistake]

(f) Algorithm execution:

    action          unread input    stack
                    ( a a ) $       S
    S → S1 T        ( a a ) $       S1 T
    S1 → ( S S )    ( a a ) $       ( S S ) T
    match (         a a ) $         S S ) T
    S → S1 T        a a ) $         S1 T S ) T
    S1 → a          a a ) $         a T S ) T
    match a         a ) $           T S ) T
    T → ɛ           a ) $           S ) T
    S → S1 T        a ) $           S1 T ) T
    S1 → a          a ) $           a T ) T
    match a         ) $             T ) T
    T → ɛ           ) $             ) T
    match )         $               T
    T → ɛ           $               ɛ

[7 marks: in proportion]
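The table-driven execution in answer 6(f) can be sketched as code. This is my own illustration of the standard LL(1) algorithm driven by the grammar of part (b) and the parse table of part (e), not part of the model answer:

```python
# My own sketch of the standard table-driven LL(1) algorithm, using the
# grammar of 6(b) and the parse table of 6(e); running it on ( a a ) $
# retraces the execution in 6(f). An empty right-hand side encodes T -> eps.

TABLE = {
    ('S', 'a'): ['S1', 'T'],  ('S', '('): ['S1', 'T'],
    ('S1', 'a'): ['a'],       ('S1', '('): ['(', 'S', 'S', ')'],
    ('T', 'a'): [], ('T', 'b'): ['b', 'T'],
    ('T', '('): [], ('T', ')'): [], ('T', '$'): [],
}
NONTERMINALS = {'S', 'S1', 'T'}

def ll1_parse(tokens):
    """tokens ends with '$'; returns True iff accepted from start symbol S."""
    stack = ['S']                 # top of stack at index 0
    pos = 0
    while stack:
        top = stack.pop(0)
        look = tokens[pos]
        if top in NONTERMINALS:
            if (top, look) not in TABLE:
                return False                   # no table entry: reject
            stack = TABLE[top, look] + stack   # expand by the production
        elif top == look:
            pos += 1                           # match a terminal
        else:
            return False
    return tokens[pos] == '$'     # stack empty: must have consumed all input

print(ll1_parse(list('(aa)') + ['$']))  # True, as in 6(f)
print(ll1_parse(list('(a)') + ['$']))   # False
```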

7. (a) The following transducer does the job. Here V stands for any of the letters a, e, i, o, u, and C stands for any letter except a, e, i, o, u, y. A single symbol (without a colon) denotes a transition with the same input and output.

[6 marks for evidence of understanding, 6 marks for the details. Only deduct 1 or 2 marks if any or all of the following transitions are missing or incorrect: 3 → 0, 3 → 1, 4 → 1, and the transition 4 → 0 labelled with C:ysC.]

(b) he# parses as he#
    plies# parses as plies#, plie s#, ply s#
    trades# parses as trades#, trade s#

[Up to 4 marks; deduct 1 mark per mistake.]

(c) he: PRN
    plies: NPL, V3S
    trades: NPL, V3S

[1 mark for each word.]

(d) The Viterbi matrix is as follows. Cells that are not consistent with the possibilities listed in part (c) are left empty.

            he     plies              trades
    PRN     0.4
    NPL            0.4 x 0.1 = 0.04   0.16 x 0.5 = 0.08
    V3S            0.4 x 0.4 = 0.16   0.04 x 0.1 = 0.004

So the most probable tag sequence is PRN V3S NPL.

[4 marks for the numbers, 1 mark for the backtrace pointers, 1 mark for the correct tagging. Minor clerical errors will not be heavily penalized.]
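The max products in the Viterbi matrix of answer 7(d) can be replayed mechanically. The weights below are the combined transition-times-emission factors implied by the matrix entries; the exam's underlying HMM parameters are not reproduced in the answers, so treat these values as assumptions:

```python
# Replay of the max products in the Viterbi matrix of 7(d). Each weight is
# the combined transition*emission factor read off a matrix entry; the
# underlying HMM parameters themselves are assumed, not given.

def viterbi_step(prev_col, weights):
    """prev_col: {tag: best prob}; weights: {(prev_tag, tag): factor}."""
    col = {}
    for (p, t), w in weights.items():
        cand = prev_col.get(p, 0.0) * w
        if cand > col.get(t, 0.0):
            col[t] = cand
    return col

col_he = {'PRN': 0.4}
col_plies = viterbi_step(col_he, {('PRN', 'NPL'): 0.1, ('PRN', 'V3S'): 0.4})
col_trades = viterbi_step(col_plies, {('V3S', 'NPL'): 0.5, ('NPL', 'V3S'): 0.1})
# col_plies:  NPL = 0.04, V3S = 0.16
# col_trades: NPL = 0.08, V3S = 0.004 -- the best final tag is NPL,
# giving the sequence PRN V3S NPL as in the answer.
```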

8. (a) The complete grammar with semantic attachments is as follows:

    S → Name is NP      {NP.Sem(Name.Sem)}
    NP → Name           {λx. x = Name.Sem}
    NP → NP 's Rel      {λx. ∃y. NP.Sem(y) & Rel.Sem(x, y)}
    Rel → Rel1          {Rel1.Sem}
    Rel1 → father       {λxy. Father(x, y)}
    Rel1 → daughter     {λxy. Female(x) & (Father(y, x) ∨ Mother(y, x))}
    Rel1 → brother      {λxy. Male(x) & ¬(x = y) & (∀z. Father(z, x) ↔ Father(z, y)) & (∀w. Mother(w, x) ↔ Mother(w, y))}
    Rel1 → grandchild   {λxy. ∃z. (Father(y, z) ∨ Mother(y, z)) & (Father(z, x) ∨ Mother(z, x))}
    Name → Alice        {Alice}, etc.

[3 marks each for the clauses for "brother" and "grandchild"; 2 marks for the clause for NP 's Rel; roughly 1 mark each for the remaining clauses. Other solutions are equally acceptable for some of these clauses.]

(b) The root node of the tree will be annotated with

    (λx. ∃y. (λx. x = Alice)(y) & (λxy. Father(x, y))(x, y))(Brian)

and the other nodes will be annotated with subexpressions of this in the evident way. [Roughly 1 mark per node.]

(c) The above λ-expression β-reduces in three steps (which may be done in various orders) to

    ∃y. y = Alice & Father(Brian, y)

[1 mark per reduction step.]

(d) We may define

    R'(x, y) ≡ R(x, y) & ∀w. R(w, y) → w = x

This suggests the semantic clause

    Rel → only Rel1     {λxy. Rel1.Sem(x, y) & ∀w. Rel1.Sem(w, y) → w = x}

[1 mark for the formula for R', 2 marks for the semantic clause.]
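The reduced formula in answer 8(c) can be sanity-checked by evaluating it over a toy model. The three-person model and the single Father fact below are my own invented test data, not part of the exam:

```python
# Toy-model check of the formula from 8(c): "Brian is Alice's father"
# denotes  ∃y. y = Alice & Father(Brian, y).  The three-person model
# and the single Father fact are invented test data.

PEOPLE = {'Alice', 'Brian', 'Carol'}
FATHER_OF = {('Brian', 'Alice')}     # Father(x, y): x is the father of y

def Father(x, y):
    return (x, y) in FATHER_OF

# NP.Sem for "Alice's father":  λx. ∃y. (y = Alice) & Father(x, y)
def alices_father(x):
    return any(y == 'Alice' and Father(x, y) for y in PEOPLE)

# S → Name is NP gives NP.Sem(Name.Sem):
print(alices_father('Brian'))  # True
print(alices_father('Carol'))  # False
```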