Hidden Markov Model Induction by Bayesian Model Merging

To appear in: C. L. Giles, S. J. Hanson, & J. D. Cowan, eds., Advances in Neural Information Processing Systems 5, San Mateo, CA, Morgan Kaufmann, 1993.

Hidden Markov Model Induction by Bayesian Model Merging

Andreas Stolcke
Computer Science Division, University of California
Berkeley, CA
stolcke@icsi.berkeley.edu

Stephen Omohundro
International Computer Science Institute
1947 Center Street, Suite 600, Berkeley, CA
om@icsi.berkeley.edu

Abstract

This paper describes a technique for learning both the number of states and the topology of Hidden Markov Models from examples. The induction process starts with the most specific model consistent with the training data and generalizes by successively merging states. Both the choice of states to merge and the stopping criterion are guided by the Bayesian posterior probability. We compare our algorithm with the Baum-Welch method of estimating fixed-size models, and find that it can induce minimal HMMs from data in cases where fixed estimation does not converge or requires redundant parameters to converge.

1 INTRODUCTION AND OVERVIEW

Hidden Markov Models (HMMs) are a well-studied approach to the modelling of sequence data. HMMs can be viewed as a stochastic generalization of finite-state automata, where both the transitions between states and the generation of output symbols are governed by probability distributions. HMMs have been important in speech recognition (Rabiner & Juang, 1986), cryptography, and more recently in other areas such as protein classification and alignment (Haussler, Krogh, Mian & Sjölander, 1992; Baldi, Chauvin, Hunkapiller & McClure, 1993).

Practitioners have typically chosen the HMM topology by hand, so that learning the HMM from sample data means estimating only a fixed number of model parameters. The standard approach is to find a maximum likelihood (ML) or maximum a posteriori probability (MAP) estimate of the HMM parameters. The Baum-Welch algorithm uses dynamic programming to approximate these estimates (Baum, Petrie, Soules & Weiss, 1970).

A more general problem is to additionally find the best HMM topology. This includes both the number of states and the connectivity (the non-zero transitions and emissions). One could exhaustively search the model space using the Baum-Welch algorithm on fully connected models of varying sizes, picking the model size and topology with the highest posterior probability. (Maximum likelihood estimation is not useful for this comparison since larger models usually fit the data better.) This approach is very costly, and Baum-Welch may get stuck at sub-optimal local maxima. Our comparative results later in the paper show that this often occurs in practice. The problem can be somewhat alleviated by sampling from several initial conditions, but at a further increase in computational cost.

The HMM induction method proposed in this paper tackles the structure learning problem in an incremental way. Rather than estimating a fixed-size model from scratch for various sizes, the model size is adjusted as new evidence arrives. There are two opposing tendencies in adjusting the model size and structure. Initially new data adds to the model size, because the HMM has to be augmented to accommodate the new samples. If enough data of similar structure is available, however, the algorithm collapses the shared structure, decreasing the model size. The merging of structure is also what drives generalization, i.e., creates HMMs that generate data not seen during training.

Beyond being incremental, our algorithm is data-driven, in that the samples themselves completely determine the initial model shape. Baum-Welch estimation, by comparison, uses an initially random set of parameters for a given-sized HMM and iteratively updates them until a point is found at which the sample likelihood is locally maximal. What seems intuitively troublesome with this approach is that the initial model is completely uninformed by the data. The sample data directs the model formation process only in an indirect manner as the model approaches a meaningful shape.

2 HIDDEN MARKOV MODELS

For lack of space we cannot give a full introduction to HMMs here; see Rabiner & Juang (1986) for details. Briefly, an HMM consists of states and transitions like a Markov chain. In the discrete version considered here, it generates strings by performing random walks between an initial and a final state, outputting symbols at every state in between. The probability P(x|M) that model M generates a string x is determined by the conditional probabilities of making a transition from one state to another and the probability of emitting each symbol from each state. Once these are given, the probability of a particular path through the model generating the string can be computed as the product of all transition and emission probabilities along the path. The probability of a string x is the sum of the probabilities of all paths generating x. For example, the model M3 in Figure 1 generates the strings ab, abab, ababab, ..., with probabilities 2/3, 2/9, 2/27, ..., respectively.
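This path-summing computation is the standard forward recursion. The following minimal sketch (our illustration, not the authors' code; the dictionary representation and all names are choices of this sketch) evaluates P(x|M) for the model M3 of Figure 1:

```python
# Moore-style HMM as in the paper: a random walk from initial state "I"
# to final state "F", emitting one symbol at every state in between.
# This is M3 from Figure 1, which generates (ab)+.
trans = {                       # trans[q][r] = P(next state r | state q)
    "I": {"1": 1.0},
    "1": {"2": 1.0},
    "2": {"F": 2.0 / 3.0, "1": 1.0 / 3.0},
}
emit = {                        # emit[q][s] = P(emit symbol s | state q)
    "1": {"a": 1.0},
    "2": {"b": 1.0},
}

def string_prob(x):
    """P(x|M): sum over all paths of the product of the transition and
    emission probabilities along the path (the forward algorithm)."""
    alpha = dict(trans["I"])    # alpha[q] = P(prefix emitted so far, now in q)
    for sym in x:
        nxt = {}
        for q, a in alpha.items():
            e = emit.get(q, {}).get(sym, 0.0)   # "F" emits nothing
            if a == 0.0 or e == 0.0:
                continue
            for r, t in trans[q].items():
                nxt[r] = nxt.get(r, 0.0) + a * e * t
        alpha = nxt
    return alpha.get("F", 0.0)  # all symbols emitted and the walk ended in F

print(string_prob("ab"), string_prob("abab"))   # 2/3 and 2/9, as quoted above
```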

3 HMM INDUCTION BY STATE MERGING

3.1 MODEL MERGING

Omohundro (1992) has proposed an approach to statistical model inference in which initial models simply replicate the data and generalize by similarity. As more data is received, component models are fit from more complex model spaces. This allows the formation of arbitrarily complex models without overfitting along the way. The elementary step used in modifying the overall model is a merging of sub-models, collapsing the sample sets for the corresponding sample regions. The search for sub-models to merge is guided by an attempt to sacrifice as little of the sample likelihood as possible as a result of the merging process. This search can be done very efficiently if (a) a greedy search strategy can be used, and (b) likelihood computations can be done locally for each sub-model and don't require a global recomputation on each model update.

3.2 STATE MERGING IN HMMS

We have applied this general approach to the HMM learning task. We describe the algorithm here mostly by presenting an example. The details are available in Stolcke & Omohundro (1993).

To obtain an initial model from the data, we first construct an HMM which produces exactly the input strings. The start state has as many outgoing transitions as there are strings, and each string is represented by a unique path with one state per sample symbol. The probability of entering these paths from the start state is uniformly distributed. Within each path there is a unique transition arc whose probability is 1. The emission probabilities are 1 for each state to produce the corresponding symbol.

As an example, consider the regular language (ab)+ and two samples drawn from it, the strings ab and abab. The algorithm constructs the initial model M0 depicted in Figure 1. This is the most specific model accounting for the observed data. It assigns each sample a probability equal to its relative frequency, and is therefore a maximum likelihood model for the data.

Learning from the sample data means generalizing from it. This implies trading off model likelihood against some sort of bias towards simpler models, expressed by a prior probability distribution over HMMs. Bayesian analysis provides a formal basis for this tradeoff. Bayes' rule tells us that the posterior model probability P(M|x) is proportional to the product of the model prior P(M) and the likelihood of the data P(x|M). Smaller or simpler models will have a higher prior, and this can outweigh the drop in likelihood as long as the generalization is conservative and keeps the model close to the data. The choice of model priors is discussed in the next section.

The fundamental idea exploited here is that the initial model M0 can be gradually transformed into the generating model by repeatedly merging states. The intuition for this heuristic comes from the fact that if we take the paths that generate the samples in an actual generating HMM M and unroll them to make them completely disjoint, we obtain M0. The iterative merging process, then, is an attempt to undo the unrolling, tracing a search through the model space back to the generating model.

Merging two states q1 and q2 in this context means replacing q1 and q2 by a new state r with a transition distribution that is a weighted mixture of the transition probabilities of q1, q2, and with a similar mixture distribution for the emissions. Transition probabilities into q1 or q2 are added up and redirected to r. The weights used in forming the mixture distributions are the relative frequencies with which q1 and q2 are visited in the current model. Repeatedly performing such merging operations yields a sequence of models M0, M1, M2, ..., along which we can search for the MAP model.
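The two operations just described, constructing the most specific initial model and merging a pair of states, can be sketched as follows (our reconstruction in Python; the paper gives no code, and the visit counts that supply the mixture weights are assumed to be tracked per state):

```python
def initial_model(samples):
    """Most specific model M0: one path per sample string, one state per
    symbol, uniform probability of entering each path from start state I."""
    trans, emit, counts = {"I": {}}, {}, {}
    n = 0
    for x in samples:
        prev, p = "I", 1.0 / len(samples)
        for sym in x:
            n += 1
            q = str(n)
            trans[prev][q] = trans[prev].get(q, 0.0) + p
            trans[q], emit[q] = {}, {sym: 1.0}  # emit the one sample symbol
            counts[q] = 1                       # each path state visited once
            prev, p = q, 1.0                    # unique probability-1 arc
        trans[prev]["F"] = 1.0
    return trans, emit, counts

def merge_states(trans, emit, counts, q1, q2, r):
    """Replace q1 and q2 by r: mix their transition and emission
    distributions, weighted by relative visit frequency, then add up
    and redirect incoming transition probabilities."""
    w1 = counts[q1] / (counts[q1] + counts[q2])

    def mix(d1, d2):
        return {k: w1 * d1.get(k, 0.0) + (1.0 - w1) * d2.get(k, 0.0)
                for k in set(d1) | set(d2)}

    t1, t2 = trans.pop(q1), trans.pop(q2)
    trans[r] = mix(t1, t2)
    emit[r] = mix(emit.pop(q1), emit.pop(q2))
    counts[r] = counts.pop(q1) + counts.pop(q2)
    for out in trans.values():                  # redirect incoming arcs to r
        p = out.pop(q1, 0.0) + out.pop(q2, 0.0)
        if p > 0.0:
            out[r] = out.get(r, 0.0) + p

model = initial_model(["ab", "abab"])           # M0 of Figure 1
```

Merging states 1 and 3 of this M0 reproduces the behavior described below: the .5/.5 entry probabilities add up and are redirected, while the outgoing arcs become a .5/.5 mixture.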

[Figure 1 appears here in the original: state diagrams of the models M0 through M3, each annotated with its log likelihood log L(x|M).]

Figure 1: Sequence of models obtained by merging samples {ab, abab}. All transitions without special annotations have probability 1; output symbols appear above their respective states and also carry an implicit probability of 1. For each model the log likelihood is given.

To make the search for the MAP model efficient, we use a greedy strategy: given Mi, choose a pair of states for merging that maximizes P(Mi+1|X).

Continuing with the previous example, we find that states 1 and 3 in M0 can be merged without penalizing the likelihood. This is because they have identical outputs and the loss due to merging the outgoing transitions is compensated by the merging of the incoming transitions. The .5/.5 split is simply transferred to the outgoing transitions of the merged state. The same situation obtains for states 2 and 4 once 1 and 3 are merged. From these two first merges we get model M1 in Figure 1. By convention we reuse the smaller of two state indices to denote the merged state.

At this point the best merge turns out to be between states 2 and 6, giving model M2. However, there is a penalty in likelihood, which decreases to about .59 of its previous value. Under all the reasonable priors we considered (see below), the posterior model probability still increases due to an increase in the prior. Note that the transition probability ratio at state 2 is now 2/1, since two samples make use of the first transition, whereas only one takes the second transition.

Finally, states 1 and 5 can be merged without penalty to give M3, the minimal model that generates (ab)+. Further merging at this point would reduce the likelihood by three orders of magnitude. The resulting decrease in the posterior probability tells the algorithm to stop at this point.
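One greedy step can be sketched as below, reusing initial_model and merge_states from the previous sketch; log_posterior is a placeholder for whatever scoring function the caller supplies (per the paper, the log of the prior times the likelihood), not something the paper defines as code:

```python
import copy
import itertools

def best_merge(trans, emit, counts, log_posterior):
    """Evaluate every candidate pair of states and return the merged
    model with the highest posterior score, together with that score.
    The smaller state name labels the merged state, as in the paper."""
    best, best_score = None, float("-inf")
    for q1, q2 in itertools.combinations(sorted(counts), 2):
        cand = copy.deepcopy((trans, emit, counts))
        merge_states(*cand, q1, q2, q1)
        score = log_posterior(*cand)
        if score > best_score:
            best, best_score = cand, score
    return best, best_score
```

An outer loop would accept the returned model whenever its score exceeds that of the current model and stop otherwise, mirroring the stopping decision between M3 and any further merge in the example above.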

3.3 MODEL PRIORS

As noted previously, the likelihoods P(X|Mi) along the sequence of models considered by the algorithm are monotonically decreasing. The prior P(M) must account for an overall increase in posterior probability, and is therefore the driving force behind generalization.

As in the work on Bayesian learning of classification trees by Buntine (1992), we can split the prior P(M) into a term accounting for the model structure, P(Ms), and a term for the adjustable parameters in a fixed structure, P(Mp|Ms).

We initially relied on the structural prior only, incorporating an explicit bias towards smaller models. Size here is some function of the number of states and/or transitions, |M|. Such a prior can be obtained by making P(Ms) ∝ exp(−|M|), and can be viewed as a description length prior that penalizes models according to their coding length (Rissanen, 1983; Wallace & Freeman, 1987). The constants in this MDL term had to be adjusted by hand from examples of desirable generalization.

For the parameter prior P(Mp|Ms), it is standard practice to apply some sort of smoothing or regularizing prior to avoid overfitting the model parameters. Since both the transition and the emission probabilities are given by multinomial distributions, it is natural to use a Dirichlet conjugate prior in this case (Berger, 1985). The effect of this prior is equivalent to having a number of virtual samples for each of the possible transitions and emissions which are added to the actual samples when it comes to estimating the most likely parameter settings. In our case, the virtual samples made equal use of all potential transitions and emissions, adding a bias towards uniform transition and emission probabilities.

We found that the Dirichlet priors by themselves produce an implicit bias towards smaller models, a phenomenon that can be explained as follows. The prior alone results in a model with uniform, flat distributions. Adding actual samples has the effect of putting bumps into the posterior distributions, so as to fit the data. The more samples are available, the more peaked the posteriors will get around the maximum likelihood estimates of the parameters, increasing the MAP value. In estimating HMM parameters, what counts is not the total number of samples, but the number of samples per state, since transition and emission distributions are local to each state. As we merge states, the available evidence gets shared by fewer states, thus allowing the remaining states to produce a better fit to the data.

This phenomenon is similar, but not identical, to the Bayesian Occam factors that prefer models with fewer parameters (MacKay, 1992). Occam factors are a result of integrating the posterior over the parameter space, something which we do not do because of the computational complications it introduces in HMMs (see below).
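Read concretely (our sketch, using the 0.1 virtual-sample value reported in the experiments section), the estimate of a single transition or emission distribution simply adds the virtual samples to the observed counts before normalizing:

```python
def map_multinomial(obs, outcomes, virtual=0.1):
    """Estimate one multinomial (a state's transition or emission
    distribution) under a symmetric Dirichlet prior: virtual samples
    are spread equally over all potential outcomes."""
    total = sum(obs.get(o, 0) for o in outcomes) + virtual * len(outcomes)
    return {o: (obs.get(o, 0) + virtual) / total for o in outcomes}

# State 2 of M2: two samples take the first transition, one the second.
print(map_multinomial({"first": 2, "second": 1}, ["first", "second"]))
# -> {'first': 0.65625, 'second': 0.34375}, a smoothed version of 2/3 vs 1/3
```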

3.4 APPROXIMATIONS

At each iteration step, our algorithm evaluates the posterior resulting from every possible merge in the current HMM. To keep this procedure feasible, a number of approximations are incorporated in the implementation that don't seem to affect its qualitative properties.

For the purpose of likelihood computation, we consider only the most likely path through the model for a given sample string (the Viterbi path). This allows us to express the likelihood in product form, computable from sufficient statistics for each transition and emission.

We assume the Viterbi paths are preserved by the merging operation, that is, the paths previously passing through the merged states now go through the resulting new state. This allows us to update the sufficient statistics incrementally, and means only O(number of states) likelihood terms need to be recomputed.

The posterior probability of the model structure is approximated by the posterior of the MAP estimates for the model parameters. Rigorously integrating over all parameter values is not feasible since varying even a single parameter could change the paths of all samples through the HMM.

Finally, it has to be kept in mind that our search procedure along the sequence of merged models finds only local optima, since we stop as soon as the posterior starts to decrease. A full search of the space would be much more costly. However, we found a best-first look-ahead strategy to be sufficient in the rare cases where a local maximum caused a problem. In those cases we continue merging along the best-first path for a fixed number of steps (typically one) to check whether the posterior has undergone just a temporary decrease.

4 EXPERIMENTS

We have used various artificial finite-state languages to test our algorithm and compare its performance to the standard Baum-Welch algorithm. Table 1 summarizes the results on the two sample languages ac*a ∪ bc*b and a+b+a+b+. The first of these contains a contingency between initial and final symbols that can be hard for learning algorithms to uncover. We used no explicit model size prior in our experiments after we found that the Dirichlet prior was very robust in giving just the right amount of bias toward smaller models.¹

Summarizing the results, we found that merging very reliably found the generating model structure from a very small number of samples. The parameter values are determined by the sample set statistics. The Baum-Welch algorithm, much like a backpropagation network, may be sensitive to its random initial parameter settings. We therefore sampled from a number of initial conditions. Interestingly, we found that Baum-Welch has a good chance of settling into a sub-optimal HMM structure, especially if the number of states is the minimal number required for the target language. It proved much easier to estimate correct language models when extra states were provided. Also, increasing the sample size helped it converge to the target model.

¹The number of virtual samples per transition/emission was held constant at 0.1 throughout.
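The cross-entropy figures reported in Table 1 can be computed along the following lines (a sketch under our assumptions: the generating model is a (trans, emit) pair in the representation of the earlier sketches, string_prob is the forward computation from the first sketch generalized to take the model as an argument, and the induced model is assumed to assign every sampled string nonzero probability, e.g. via the smoothing above):

```python
import math
import random

def sample_from(trans, emit, rng):
    """Draw one string: random walk from I to F, one symbol per state."""
    q, out = "I", []
    while True:
        states, weights = zip(*trans[q].items())
        q = rng.choices(states, weights)[0]
        if q == "F":
            return "".join(out)
        syms, ps = zip(*emit[q].items())
        out.append(rng.choices(syms, ps)[0])

def empirical_cross_entropy(generator, induced, string_prob, n=1000, seed=0):
    """Average -log2 P(x|induced) over strings x drawn from the
    generating model; lower values mean better generalization."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = sample_from(*generator, rng)
        total += -math.log2(string_prob(induced, x))
    return total / n
```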

(a)
Method        Sample      Entropy  Cross-entropy   Language        n
Merging       8 m.p.               ±.020           ac*a ∪ bc*b     6
Merging       20 random            ±.033           ac*a ∪ bc*b     6
Baum-Welch    8 m.p.               ±.023 (best)    (a|b)c*(a|b)    6
(10 trials)                        ±.228 (worst)   (a|b)c*(a|b)    6
Baum-Welch    20 random            ±.031 (best)    ac*a ∪ bc*b     6
(10 trials)                        ±.031 (worst)   (a|b)c*(a|b)    6
Baum-Welch    8 m.p.               ±.271           ac*a ∪ bc*b     10
Baum-Welch    20 random            ±.032           ac*a ∪ bc*b     10

(b)
Method        Sample      Entropy  Cross-entropy   Language        n
Merging       5 m.p.               ±
Baum-Welch    5 m.p.               ±.161 (best)    (a+b+)+         4
(3 trials)                         ±.007 (worst)   (a+b+)+         4
Merging       10 random            ±
Baum-Welch    10 random            ±.076 (best)
(3 trials)                         ±.137 (worst)   (a+b+)+         4

Table 1: Results for merging and Baum-Welch on two regular languages: (a) ac*a ∪ bc*b and (b) a+b+a+b+. Samples were either the top most probable (m.p.) ones from the target language, or a set of randomly generated ones. Entropy is the average negative log probability on the training set, whereas cross-entropy refers to the empirical cross-entropy between the induced model and the generating model (the lower, the better the generalization). n denotes the final number of model states for merging, or the fixed model size for Baum-Welch. For Baum-Welch, both best and worst performance over several initial conditions is listed.

5 RELATED WORK

Our approach is related to several other approaches in the literature. The concept of state merging is implicit in the notion of state equivalence classes, which is fundamental to much of automata theory (Hopcroft & Ullman, 1979) and has been applied to automata learning as well (Angluin & Smith, 1983). Tomita (1982) is an example of finite-state model space search guided by a (non-probabilistic) goodness measure.

Horning (1969) describes a Bayesian grammar induction procedure that searches the model space exhaustively for the MAP model. The procedure provably finds the globally optimal grammar in finite time, but is infeasible in practice because of its enumerative character.

The incremental augmentation of the HMM by merging in new samples has some of the flavor of the algorithm used by Porat & Feldman (1991) to induce a finite-state model from positive-only, ordered examples. Haussler et al. (1992) use limited HMM surgery (insertions and deletions in a linear HMM) to adjust the model size to the data, while keeping the topology unchanged.

6 FURTHER RESEARCH

We are investigating several real-world applications for our method. One task is the construction of unified multiple-pronunciation word models for speech recognition. This is currently being carried out in collaboration with Chuck Wooters at ICSI, and it appears that our merging algorithm is able to produce linguistically adequate phonetic models.

Another direction involves an extension of the model space to stochastic context-free grammars, for which a standard estimation method analogous to Baum-Welch exists (Lari & Young, 1990). The notions of sample incorporation and merging carry over to this domain (with merging now involving the non-terminals of the CFG), but need to be complemented with a mechanism that adds new non-terminals to create hierarchical structure (which we call chunking).

Acknowledgements

We would like to thank Peter Cheeseman, Wray Buntine, David Stoutamire, and Jerry Feldman for helpful discussions of the issues in this paper.

References

Angluin, D. & Smith, C. H. (1983), Inductive inference: Theory and methods, ACM Computing Surveys 15(3).

Baldi, P., Chauvin, Y., Hunkapiller, T. & McClure, M. A. (1993), Hidden Markov Models in molecular biology: New algorithms and applications, this volume.

Baum, L. E., Petrie, T., Soules, G. & Weiss, N. (1970), A maximization technique occurring in the statistical analysis of probabilistic functions of Markov chains, The Annals of Mathematical Statistics 41(1).

Berger, J. O. (1985), Statistical Decision Theory and Bayesian Analysis, Springer Verlag, New York.

Buntine, W. (1992), Learning classification trees, in D. J. Hand, ed., Artificial Intelligence Frontiers in Statistics: AI and Statistics III, Chapman & Hall.

Haussler, D., Krogh, A., Mian, I. S. & Sjölander, K. (1992), Protein modeling using hidden Markov models: Analysis of globins, Technical Report UCSC-CRL-92-23, Computer and Information Sciences, University of California, Santa Cruz, CA. Revised Sept. 1992.

Hopcroft, J. E. & Ullman, J. D. (1979), Introduction to Automata Theory, Languages, and Computation, Addison-Wesley, Reading, Mass.

Horning, J. J. (1969), A study of grammatical inference, Technical Report CS 139, Computer Science Department, Stanford University, Stanford, CA.

Lari, K. & Young, S. J. (1990), The estimation of stochastic context-free grammars using the Inside-Outside algorithm, Computer Speech and Language 4.

MacKay, D. J. C. (1992), Bayesian interpolation, Neural Computation 4.

Omohundro, S. M. (1992), Best-first model merging for dynamic learning and recognition, Technical Report TR-92-004, International Computer Science Institute, Berkeley, CA.

Porat, S. & Feldman, J. A. (1991), Learning automata from ordered examples, Machine Learning 7.

Rabiner, L. R. & Juang, B. H. (1986), An introduction to Hidden Markov Models, IEEE ASSP Magazine 3(1).

Rissanen, J. (1983), A universal prior for integers and estimation by minimum description length, The Annals of Statistics 11(2).

Stolcke, A. & Omohundro, S. (1993), Best-first model merging for Hidden Markov Model induction, Technical Report TR-93-003, International Computer Science Institute, Berkeley, CA.

Tomita, M. (1982), Dynamic construction of finite automata from examples using hill-climbing, in Proceedings of the 4th Annual Conference of the Cognitive Science Society, Ann Arbor, Mich.

Wallace, C. S. & Freeman, P. R. (1987), Estimation and inference by compact coding, Journal of the Royal Statistical Society, Series B 49(3).
