Context model automata for text compression (published in The Computer Journal, 41 (7), 1998)

Pasi Fränti
Department of Computer Science, University of Joensuu, Box 111, FIN Joensuu, FINLAND

Timo Hatakka
Department of Information Technology, Lappeenranta University of Technology, Box 20, FIN Lappeenranta, FINLAND

Abstract: Finite-state automata offer an alternative approach to implement context models, where the states in the automata cannot in general be assigned by a single context. Despite the potential of this approach, it makes the design of the modelling more problematic because the exact behaviour of the model is not known. Here we propose a simple formalism, context model automata (CMA), that gives an exact interpretation for the minimum context belonging to each state in the automaton. The formalism is general enough to simulate context models such as PPM and GDMC. Using the CMA formalism as our tool, we study the behaviour of the above two context models.

Key words: text compression, context modelling, PPM models, Markov models
Number of words: 8100 (including figures and tables)
Number of lines: 1000 (including figures and tables)
Number of figures: 14
Number of tables: 3

Correspondence: Pasi Fränti, Department of Computer Science, University of Joensuu, P.O. Box 111, FIN Joensuu, FINLAND. Phone: Fax: Email: franti@cs.joensuu.fi

1. Introduction

Text compression is typically performed by statistical or dictionary-based methods [1, 2]. The input file is processed in a single pass starting with an initial model or dictionary. The method then adapts to the source data during the compression. Only the data that has already been compressed (and is thus also known by the decoder) can be used in the model. In this way the decompression can be done synchronously.

Dictionary-based methods decompose the source data into substrings which are compressed separately, either by a pointer to an earlier occurrence of the string in the source data (LZ77 variants) [3], or by a pointer to a dynamically constructed dictionary (LZ78 variants) [4]. The pointers are stored using fixed length coding, or an entropy coding method such as Huffman coding [5]. Better compression performance is achieved by statistical compression at the cost of lower throughput and higher memory consumption.

In symbol-based context models each symbol is predicted by a context, which is a combination of a few preceding symbols. A different statistical model is constructed for each context on the basis of the previously coded data. Each model consists of a table of successor symbols and their frequencies. The actual coding can be performed by arithmetic coding using the implementation given by Witten et al. [6].

Variable-order contexts have been widely used in PPM models (prediction by partial match), originally proposed in [7]. The highest order context is applied first. If the symbol is not predicted by the context, an escape symbol is coded and another, lower level context is used. If no other context can be applied, the symbol is eventually coded using a zero-order model. The size of the highest order context is kept relatively small (three in [1]) to reduce the memory requirements and the learning cost.

Data compression can conceptually be divided into two separate components: modelling and coding. Modelling is the key question in the design of a compression algorithm, whereas the coding can be done optimally using arithmetic coding. From this point of view, it is easy to see that the statistical models are more general than the dictionary-based models since they explicitly separate the two components from each other. In fact, commonly used dictionary-based methods can be simulated by symbol-wise statistical models. For example, Langdon has given a statistical counterpart for LZ78 in [8].

A third approach to text compression is the use of finite state automata. An automaton consists of a set of states and transitions between the states. Each transition corresponds to a single coding event: coding of a symbol or a part of it. All states must include outgoing transitions for every possible coding incident that might appear. An implementation known as Dynamic Markov Compression (DMC) uses a binary alphabet in the automaton [9]. Symbols of larger alphabets must be handled bitwise. It has been shown that DMC actually implements a finite-context model [10].
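To make the variable-order scheme described above concrete, the following small Python sketch (our own illustration, not the algorithm of [7] or the implementation studied in this paper; all names are invented) keeps one frequency table per context and, for each symbol, lists the contexts that would be tried, longest first, before the symbol is predicted or the zero-order model is reached:

from collections import defaultdict

class ToyContextModel:
    def __init__(self, max_order=2):
        self.max_order = max_order
        # one frequency table per context string: context -> symbol -> count
        self.freq = defaultdict(lambda: defaultdict(int))

    def lookup(self, history, symbol):
        # Contexts tried from the highest order downwards; an escape would be
        # coded for every context that does not predict the symbol.
        tried = []
        for order in range(self.max_order, -1, -1):
            context = history[max(0, len(history) - order):]
            tried.append(context)
            if self.freq[context][symbol] > 0:
                break          # predicted here, no further escapes
        return tried

    def update(self, history, symbol):
        # Simplified: every order is updated (the models studied in this paper
        # use update exclusion instead, see Section 3.4).
        for order in range(self.max_order + 1):
            context = history[max(0, len(history) - order):]
            self.freq[context][symbol] += 1

model = ToyContextModel(max_order=2)
history = ""
for ch in "abracadabra":
    print(ch, model.lookup(history, ch))
    model.update(history, ch)
    history += ch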

Generalized Dynamic Markov Compression (GDMC), proposed by Teuhola and Raita [11], generalizes the idea of DMC to multisymbol alphabets. The states in the automaton correspond to the contexts, and the transitions to the selection of the next state. An escape strategy is taken to reduce the number of outgoing transitions in each state. GDMC implements a context model with potentially stronger capabilities than the PPM model, but its exact behaviour is not understood very well.

We propose a simple formalism, context model automata (CMA), for modelling the contexts in a finite-state automaton. It gives an exact interpretation for the minimum context belonging to each state in the automaton. We will show that both PPM and GDMC follow the CMA formalism. We thus obtain a context interpretation for the states of a GDMC automaton. CMA can also be used as a practical tool for analyzing the behaviour of the GDMC and PPM models, and for designing new ones.

A more theoretical approach has been taken by Bunton [12] for studying the contexts in a full-table model where each state includes a complete set of transitions. The contexts are defined as a set of strings described by regular expressions. This gives an exact, but not necessarily an easy, interpretation for the contexts of the states. Our approach does not take into account the partial contexts included in the states, but it gives a simpler interpretation for the contexts, which can be easily utilized in practical applications. We focus here on models that use the escape strategy, in which only a subset of the entire alphabet may appear in each state. This is more faithful to the traditional context models such as PPM.

Using the CMA formalism as our tool, we study the behaviour of the GDMC and PPM context models. CMA gives useful information about the structure of the automaton by giving a context interpretation for the states. CMA, however, models only an existing automaton and it does not define the way the automaton is constructed. The dynamic construction of CMA is therefore the key question when designing a new automaton-based compression scheme. In this study, we limit the discussion to the existing models underlying PPM and GDMC. In PPM, the creation of new states depends on the context length, and in GDMC, on the statistics of the state. PPM is greedier in the model expansion and it adapts more quickly to the source data. The model growth is controlled by limiting the context length to a predefined maximum. GDMC, on the other hand, has no upper limit for the context length, but it uses a frequency based heuristic to control the model growth. Due to its lazier expansion, the resulting contexts are not necessarily unique. Although CMA defines a minimum context length for each state, the automaton has longer partial contexts.

It is noted that the CMA model itself says nothing about the compression performance. It is therefore equally important to consider also the statistical model integrated with the context model. For example, both PPM and GDMC use greedy state selection (the first predictive state is always applied). Better results could be obtained using information theoretic state selection, based on the expected compression performance of the states. However, it is not clear how the expected compression performance should be measured. In the rest of the paper, we will focus on context modelling only, and settle for the existing PPM and GDMC methods in the statistical modelling.

2. Context model automata

The main components of a CMA-based compression algorithm are:
- context model automaton (CMA)
- statistical model
- entropy coder

The object of CMA is to model contexts only. For each input symbol the current state of the automaton determines the context predicting the symbol to be coded. The statistical model then calculates the probability distribution using the symbol frequencies included in the chosen context. The actual coding is performed by an entropy coder (usually arithmetic coding) on the basis of the determined probability distribution. After the coding, the statistics are updated.

CMA implements the context model as follows. Each state in the automaton corresponds to a single context having its own statistics. The states also include transitions for all possible coding events that might appear. A transition leads to the state to be used next. The automaton is deterministic: its behaviour depends only on the input (source data and parameters), i.e., the automaton always works in the same way for the same input.

The states do not necessarily include transitions for all possible symbols; the escape strategy is applied instead. Only transitions for already seen symbols can appear in the states. The rest are coded by an escape transition (esc) leading to another state. If the symbol does not exist in the following state, an escape path is followed until a proper state is reached and the symbol can be coded. The escape strategy also eliminates the zero-frequency problem [1]. An example of a CMA and its application for coding a sample source string are illustrated in Figure 1.

The main phases of a CMA-based compression algorithm are described in Figure 2. The input data is processed symbol by symbol and the automaton is constructed dynamically during the coding process. The compression starts with an initial automaton from its initial state. The initial state typically includes transitions for all possible symbols and it carries no context. After a symbol has been coded and the statistics updated, the automaton can be updated by applying two basic cloning operations:
- create a new state
- create a new transition

The modification of the automaton thus corresponds to the dynamic modelling in PPM-based compression schemes. The initial automaton does not need to be very complicated. It is more important that the automaton adapts to the input data. The key questions of CMA are when and how to clone. Solutions for these questions will be discussed in Section 3.

Figure 1: Example of a context model automaton for the alphabet {a, b, c}.

Compress(File):
{
   C = C0;   /* initial state */
   REPEAT
      Symbol = GetNextSymbol(File);
      C = CompressSymbol(C, Symbol);
   UNTIL Symbol = EOF;
}

CompressSymbol(C, Symbol): returns next state.
{
   IF EXISTS( C.Transition(Symbol) ) THEN
      EncodeSymbol(C, Symbol);
      UpdateStatistics(C, Symbol);
      nextC = C.Transition(Symbol);
      <update automaton>
   ELSE
      EncodeSymbol(C, esc);
      UpdateStatistics(C, esc);
      nextC = CompressSymbol(C.Transition(esc), Symbol);
      <update automaton>
   RETURN nextC;
}

Figure 2: Pseudo code for the CMA coding process.

2.1. Formal description of CMA

The following symbols and definitions will be used to describe CMA:

   Alphabet of all symbols               Σ
   State in the automaton                C
   Initial state                         C0
   Order of the context in C             |C|
   Any transition (S → D)                T
   Source state of a transition          S
   Destination state of a transition     D
   Escape transition                     esc
   Label of transition T                 Label(T)
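The Figure 2 pseudocode translates almost line for line into executable form. The following Python sketch is our transliteration (not code from the paper): the entropy coder is a stub, the escape label is written "esc" as in the notation above, and the automaton-update steps are indicated by comments only.

class State:
    def __init__(self, context=""):
        self.context = context      # minimum context assigned to the state
        self.transitions = {}       # symbol -> State (actual transitions)
        self.escape = None          # State reached by the esc-transition
        self.freq = {}              # symbol or "esc" -> frequency count

def encode_symbol(state, symbol):
    pass                            # placeholder for the arithmetic coder

def update_statistics(state, symbol):
    state.freq[symbol] = state.freq.get(symbol, 0) + 1

def compress_symbol(state, symbol):
    # Returns the state from which the next symbol is coded (cf. CompressSymbol).
    if symbol in state.transitions:
        encode_symbol(state, symbol)
        update_statistics(state, symbol)
        next_state = state.transitions[symbol]
        # <update automaton: a new state may be cloned here>
    else:
        encode_symbol(state, "esc")
        update_statistics(state, "esc")
        next_state = compress_symbol(state.escape, symbol)
        # <update automaton: a shortcut transition may be created here>
    return next_state

def compress(data, initial_state):
    state = initial_state
    for symbol in data:
        state = compress_symbol(state, symbol)

# Minimal usage: an order-0 initial state that predicts every alphabet symbol,
# which guarantees that every escape path terminates.
root = State()
for ch in "abc":
    root.transitions[ch] = root
root.escape = root
compress("abcabc", root)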

There are two kinds of transitions in the automaton: actual transitions and escape transitions. Label(esc) is defined to be an empty string. Note that the esc-symbol still needs to be coded. A path describes a sequence of transitions corresponding to a traversal in the automaton. The label of a path denotes the concatenation of all non-esc labels of the transitions along the path. The label of a traversed path thus corresponds to a fixed number of previously coded symbols.

Using the above symbols and definitions, an automaton is defined to be a context model automaton (CMA) if it meets the following criteria:

Definition 1 (determinism property): There is only one outgoing transition from each state with the same label. Furthermore, only one outgoing esc-transition exists in each state.

Definition 2 (termination property): The automaton must terminate in a finite number of steps for any source string.

Definition 3 (context property): The context of each state C is the longest common suffix of the labels of all possible paths from C0 to C. The order of the context in C is the length of that suffix string, and is denoted by |C|.

Definition 4 (suffix property): Consider a transition T of the form S → D. The context of D is Suffix(S)+Label(T). In other words, the next context after an actual transition is always a suffix of the context of the previous state, concatenated with the symbol just coded.

The termination property states that the automaton must be able to encode any source string and terminate within a finite time, i.e., it recognizes all the languages over the alphabet. This means that all escape paths from any state must eventually lead to a state that is capable of encoding the input symbol. In practice, this is usually solved so that all escape paths end up in the initial state of the automaton, which is capable of coding all alphabet symbols.

The context property defines the context of a state C as the longest uniquely determined context. For example, consider a state of Figure 1 that is reached by incoming paths of length 3 with two different labels. The longest common suffix of those labels is, due to the context property, the context of that state. The actual context, however, may still be longer than |C|: there exists a longer (but only partially unique) context, because one of the alphabet symbols cannot appear as the preceding symbol on any incoming path.

The following corollaries are deduced directly from the CMA definitions:

Corollary 1: If there is an actual transition from S to D, then |D| ≤ |S| + 1.

Corollary 2: If there is an esc-transition from S to D, then |D| ≤ |S|.

Corollary 3: All actual transitions incoming to D have the same label. Furthermore, if there is at least one actual transition incoming to D, then the last non-esc transition along all paths leading to D must have the same label.
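Definition 3 can be checked mechanically on small hand-built automata: enumerate the labels of all paths reaching a state and take their longest common suffix. The helper below is our own sketch; the path labels in the example are invented.

def longest_common_suffix(labels):
    # Longest string that every label ends with; "" if there is none.
    if not labels:
        return ""
    shortest = min(labels, key=len)
    for k in range(len(shortest), -1, -1):
        suffix = shortest[len(shortest) - k:]
        if all(s.endswith(suffix) for s in labels):
            return suffix
    return ""

# If every length-3 path into a state carries the label "aab" or "bab",
# the minimum context of that state is "ab" and its order |C| is 2.
print(longest_common_suffix(["aab", "bab"]))    # -> "ab"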

Corollary 1 states that the order of the context cannot be increased by more than one due to a single transition. This can be verified by both the context and suffix properties. The suffix property also imposes that the context cannot expand due to an escape transition (Corollary 2). The suffix property is rather strict. Consider an esc-transition between two states with contexts of order 1 or greater. The suffix property implies that these contexts must have the same last symbol (Corollary 3). A weaker definition would have stated that D = Suffix(S+Label(T)), but we adopt here the stronger definition.

2.2. Dynamic construction of CMA

CMA gives a context interpretation for the states of an automaton. However, CMA models only an existing automaton and does not say anything about how the automaton is constructed. The dynamic construction of a CMA must therefore be defined by an initial automaton and by a set of cloning operations. We define a CMA-based compression method recursively by the following three criteria:

Requirement 1: The initial automaton is a CMA.
Requirement 2: The new automaton due to a cloning operation is a CMA.
Requirement 3: Cloning does not change the contexts of existing states.

The first two requirements are rather loose and they give great freedom in the design of the context model. The third requirement, however, defines that cloning cannot change the contexts of an existing automaton. Thus, the context of a state is determined at the time of its creation, and cloning operations may generate only local changes in the automaton. In this way, we can easily keep track of the way the automaton develops.

3. CMA implementations of GDMC and PPM models

In the following section we will show how the GDMC and PPM models can be implemented under the CMA concept. We will first discuss the cloning operations introduced in GDMC and show that they directly implement CMA. The PPM model is then simulated using the same cloning operations, and by making only minor modifications to the CompressSymbol function.

3.1. GDMC model

The initial model of GDMC is given in Figure 3. It is clearly a deterministic automaton and it can encode all symbols in a finite number of steps because all states have an escape path leading to the initial state, which contains transitions for all alphabet symbols. If the states are assigned the contexts shown in Figure 3, the automaton will also meet the context and suffix properties. Thus, the first requirement for a dynamic CMA is fulfilled.

Cloning operations in GDMC: GDMC involves two cloning operations: (1) creating new states, and (2) creating new transitions (shortcuts). A new state can be created after a symbol has been coded in the THEN-branch of the CompressSymbol function (see Figure 2). The objects of the operation are the current state and the next state in the automaton. Denote the last coded symbol by a and the corresponding transition by S → D. The cloning is performed by creating a new context Sa and redirecting the original transition S → D to S → Sa. An esc-transition Sa → D is also created, see Figure 4. The decision when to clone depends on the frequency distribution in the context, and is discussed in Section 3.3. Note that the cloning in GDMC is performed only if the source state directly predicts the next symbol. It is thus never applied after an escape path.

GDMC also involves the so-called shortcut operation. A shortcut is a new transition that bypasses the combination of an esc-transition followed by an actual transition, see Figure 5. Note that it is possible to traverse a long esc-path until a correctly labelled transition is found. In this case, only the last esc-transition is involved in the shortcut operation, see Figure 6. Alternative shortcut strategies have been discussed in [11], for example: (1) creating a direct shortcut S1 → D, or (2) creating all possible shortcuts S1 → D, S2 → D, and so on. It is also noted that in GDMC, it is allowed to create a new state directly after a shortcut has been created.

Figure 3: The initial model of GDMC.
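Read this way, both GDMC operations amount to a few pointer updates on the state records. The sketch below is our rendering of the operations of Figures 4 and 5, using the minimal State class of the earlier sketch; the initialisation of the new frequency counters (equation (5) in Section 3.4) is omitted.

class State:
    # same minimal state record as in the earlier sketch
    def __init__(self, context=""):
        self.context = context
        self.transitions = {}
        self.escape = None
        self.freq = {}

def clone_state(S, a):
    # Figure 4: after traversing S -> D on symbol a, create the new state Sa,
    # redirect the transition to it, and add the esc-transition Sa -> D.
    D = S.transitions[a]
    Sa = State(S.context + a)
    S.transitions[a] = Sa
    Sa.escape = D
    return Sa

def create_shortcut(S, a):
    # Figure 5: bypass the esc-transition S -> C followed by the actual
    # transition C -> D by adding the direct transition S -> D.
    C = S.escape
    D = C.transitions[a]
    S.transitions[a] = D
    return D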

Figure 4: Cloning of GDMC.
Figure 5: Shortcut of GDMC.
Figure 6: Long path of esc-transitions.

CMA properties after context cloning: The new automaton after the context cloning is deterministic, and the inclusion of the escape transition does not break the termination property. The context of the source state S is not affected by the cloning, so we can limit the examination of the context and suffix properties to Sa and D' only. D' denotes here the state D after the cloning.

The context property states that the context of a state is the longest common suffix of the labels of all paths from C0 to the state. As the only incoming transition to Sa is S → Sa, the context of the new state can be defined as Sa (the catenation of S with a). This fulfills both the context and suffix properties. The label of the longest common path incoming to D' is either the original context D, or Sa due to the cloning. Thus, due to the context property, D' = Min(D, Sa), where Min is defined as the shorter of the two strings. Considering that the automaton was a CMA before the cloning, we know that D = Suffix(S)+a (suffix property). It follows that D' = Min(Suffix(S)+a, Sa) = Suffix(S)+a = D. In other words, the context of D does not change due to the cloning and therefore D' fulfills the context property. Furthermore, the cloning has no effect on any successor of D. The suffix property of D' can also be verified easily: D' = Suffix(Sa)+Label(esc) = Suffix(Sa), which is true because D = Suffix(S)+a. The modified automaton therefore meets the CMA properties also after the cloning of a new state.

In the previous examinations we have assumed that no transition is directed to the state itself. It is possible that actual transitions in GDMC are sometimes directed to the state itself. Even so, all the above conclusions are still valid (the proof is omitted here).

CMA properties after a shortcut: An esc-transition is traversed in GDMC only when there is no outgoing transition labelled by the current symbol to be coded. Duplicate labels are thus never created and the automaton remains deterministic. The termination property is also preserved since new escape transitions are never created by shortcuts. The creation of a shortcut does not affect the states S and C of Figure 5 in any way, so only the state D needs to be examined. The label of the longest common path incoming to D is either D, or Suffix(S)+a due to the new transition. Considering that the automaton was a CMA also before the modification, we know that D = Suffix(C)+a and C = Suffix(S). It thus follows that

   D' = Min(D, Suffix(S)+a) = Min(Suffix(C)+a, Suffix(S)+a)   (1)

The fact that C = Suffix(S) implies that Suffix(C)+a is no longer than Suffix(S)+a. The above equation therefore simplifies to:

   D' = Suffix(C)+a = D.   (2)

This proves the context and suffix properties for D. Moreover, since D' = D, the shortcut does not affect the succeeding states of D in any way. Thus, the automaton modified by the creation of a shortcut meets the CMA properties. Note that the alternative shortcut strategies discussed in [11] would meet all the CMA criteria, too.

3.2. PPM model

In PPM, the coding of a symbol always starts at the highest level Nmax. If the symbol cannot be coded by the current context, lower level contexts are reached using escape transitions. After a proper context has been found and the symbol has been coded, the process should then continue at the highest level again. Corollary 1 of the CMA properties, however, states that the order of a context cannot increase by more than one due to a single transition. For example, suppose that the last three coded symbols were abc (assuming Nmax = 3), but the next symbol d is predicted only by the context c. After d has been coded, the next context according to the PPM model should be bcd. But there cannot be a direct transition c → bcd in a CMA.

In order to perform the PPM simulation, the meaning of a transition is more relaxed than in GDMC. Originally a transition corresponds not only to the coding of a symbol but it also indicates the state to be used for the next symbol. The cloning and the creation of transitions are defined in the same way as in GDMC, but the coding is not necessarily continued from the state pointed to by the transition. Instead, the next state is the new state resulting from the cloning.

The CompressSymbolPPM function simulating the PPM model is defined in Figure 7. The function takes the symbol to be coded and an N-level context as input, and outputs the next context to be used, which is located at level N+1 (or at level Nmax in the case of N = Nmax).

If a transition for the input symbol exists, the symbol is compressed and the transition is followed to obtain the context for the next symbol. The resulting state is then returned as the output of the function. This is the trivial case where no cloning is needed. When the correct transition does not exist, the routine follows an escape path recursively through the lower levels, and when a proper state is found and the symbol has been coded, the function begins backtracking along the traversed escape path. New states are cloned during the backtracking, and the result of the recursion is a state at the same level as the current one. If the current state is at level Nmax, a shortcut is created pointing to the newly obtained state. Otherwise (which happens at the lower levels during the recursion) a new state is cloned at level N+1. This process is illustrated in Figure 8, where d is the input symbol and abc the context. Since a transition for d does not exist, the function is recursively called and the state bcd is eventually obtained as a result.

The simulation is now almost complete, but we still need to define the initial automaton for PPM. In principle, the automaton of Figure 3 could be applied, but the original algorithm in [7] uses a slightly different initialisation. This is implemented by the automaton shown in Figure 9. The first state ("static") has transitions for all possible symbols in the alphabet, and the statistical model assigned to the state is assumed to be static. This is sometimes referred to as a level -1 context in the literature. The second state ("adaptive") has only an escape transition to the static state. In this way the model may adapt better to the input data because it has no transitions for unused symbols as a burden. In our implementation, we also let the static state adapt to the input data.

The initial model of Figure 9 is theoretically not clean because the adaptive state contradicts the suffix property of CMA (incoming transitions should not have different labels). This has no practical consequences because the exception remains local and the rest of the dynamically constructed automaton will still meet all the CMA criteria. If we had adopted the weaker version of the suffix property (stating that D = Suffix(S+Label(T))), the initial model would have been a proper CMA.

CompressSymbolPPM(C, Symbol): returns next state.
{
   IF EXISTS( C.Transition(Symbol) ) THEN
      EncodeSymbol(C, Symbol);
      UpdateStatistics(C, Symbol);
      nextC = C.Transition(Symbol);
   ELSE
      EncodeSymbol(C, esc);
      nextC = CompressSymbolPPM(C.Transition(esc), Symbol);
      CreateTransition(C, nextC, Symbol);
      IF |C| < Nmax THEN
         nextC = CloneContext(C, nextC, Symbol);
   RETURN nextC;
}

Figure 7: Compression function for CMA in order to simulate PPM. The operations modifying the automaton are enhanced.

<Figure 8 is not included in this file due to technical problems. Only a paper copy exists!!!>
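For concreteness, the Figure 7 function can also be written out in Python. The sketch below is ours: the coder is stubbed, Nmax = 4 is the value used later in Section 4, and the transition and cloning helpers are simplified stand-ins for the GDMC operations of Section 3.1.

N_MAX = 4

class State:
    # minimal state record, as in the earlier sketches
    def __init__(self, context=""):
        self.context = context
        self.transitions = {}
        self.escape = None
        self.freq = {}

def encode_symbol(state, symbol):
    pass                                        # placeholder for the arithmetic coder

def update_statistics(state, symbol):
    state.freq[symbol] = state.freq.get(symbol, 0) + 1

def create_transition(C, target, symbol):
    C.transitions[symbol] = target              # new actual transition (shortcut at Nmax)

def clone_context(C, lower, symbol):
    new = State(C.context + symbol)             # context one symbol longer than C
    new.escape = lower                          # escape towards the lower levels
    C.transitions[symbol] = new                 # redirect the just created transition
    return new

def compress_symbol_ppm(C, symbol):
    if symbol in C.transitions:
        encode_symbol(C, symbol)
        update_statistics(C, symbol)
        next_c = C.transitions[symbol]
    else:
        encode_symbol(C, "esc")
        next_c = compress_symbol_ppm(C.escape, symbol)
        create_transition(C, next_c, symbol)
        if len(C.context) < N_MAX:
            next_c = clone_context(C, next_c, symbol)
    return next_c

# Initial automaton in the spirit of Figure 9: a "static" state predicting every
# symbol and an "adaptive" zero-order state that only escapes to it.
static = State()
for ch in "abcd":
    static.transitions[ch] = static
adaptive = State()
adaptive.escape = static
state = adaptive
for ch in "abcabc":
    state = compress_symbol_ppm(state, ch)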

Figure 8: Example of the use of the CompressSymbolPPM function.

Figure 9: Initial model for PPM.

3.3. Properties of the GDMC and PPM models

In the following, we will study the behaviour of the GDMC and PPM context models using the CMA formalism as our tool. The contexts in a CMA can easily be determined during the coding process by assigning the states their level number. Both in GDMC and in PPM, the level number |C| of a new state C is already known at the time it is cloned. Due to the suffix property, it is always one higher than that of the source state involved in the cloning, see Figure 4. The actual context is then the |C| last coded symbols.

Cloning states: The key questions for creating new states are: when to clone and how to clone. PPM always clones after a symbol has been coded. Greedy context expansion is usually advantageous, see [1, p. 149] and [2], but the growth of the automaton must be limited in some way to keep the memory consumption reasonable. The straightforward solution of PPM limits the context lengths to a predefined maximum Nmax. GDMC, on the other hand, has no upper limit for the context length but uses the following heuristic to control the growth.

Consider that the transition S → D was traversed. A new state is created if the difference

   Σi f(D → DDi) − f(S → D)   (3)

exceeds a predefined cloning constant (CC). Here DDi refers to a successor state of D, and f to a transition frequency. This gives a feasible solution for limiting the growth of the automaton. The best compression results were reported with very small CC-values [11]. Note that unbounded context lengths have also been proposed for PPM in [13]. That method limits the model growth by cloning only when a context is used a second time.

Another significant difference between the GDMC and PPM models lies in the way the automaton is traversed. When coding a long sequence of escape transitions, the PPM model allows backtracking along the traversed escape path. New states are also cloned during the backtracking. In this way, PPM creates all possible contexts from the current level up to the level Nmax, whereas GDMC expands the automaton by only one context at a time. PPM is therefore greedier in the context cloning.

After a new state has been cloned, there is the question whether the coding process should continue from the current state D (as in GDMC), or from the new state that was created in the cloning (as in PPM). At first sight there seems to be no point in continuing from the newly created state because it has not yet learned any transitions. The context cannot therefore predict the next symbol and an escape transition must be followed back to the state D. The traversed escape transition, on the other hand, does not cost any bits since its probability is 1. Besides, if the coding continues from the new state it can directly learn the next symbol to be coded, and in this way the state adapts faster to the source data.

Figure 10 demonstrates the behaviour of the GDMC and PPM models when a sample string is coded. PPM learns all existing contexts up to the upper limit, whereas GDMC expands only up to the second level. None of the second level contexts has learned any transitions yet. The faster model growth of PPM can be summarised in the following three properties: (1) greedy cloning criterion, (2) multiple cloning operations in a single coding step, (3) coding is continued directly from the new state after cloning.

Figure 10: GDMC (left) and PPM (right) automata after the coding of a sample string. In GDMC, cloning was performed when a context was used a second time. In PPM, Nmax was set to 3.
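Under this reading of equation (3) (our reconstruction from the surrounding text rather than a formula quoted from [11]), the cloning decision reduces to comparing two counts, as in the following sketch; the value of the cloning constant is illustrative only.

def should_clone(f_transition, successor_counts, cloning_constant):
    # f_transition: frequency of the traversed transition S -> D.
    # successor_counts: frequencies f(D -> DD_i) of the outgoing transitions of D.
    return sum(successor_counts) - f_transition > cloning_constant

print(should_clone(3, [5, 4, 2], cloning_constant=2))   # 11 - 3 > 2 -> True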

Transitions: Consider a transition S → D (assuming |S| < Nmax) and denote by |D| − |S| the increase of the context length due to the transition. In PPM, all such transitions increase the context by exactly one. This is a consequence of the greedy cloning criterion, in which the creation of a new transition is directly followed by a cloning operation. The only exceptions are the states located at the level Nmax. Escape transitions always decrease the context by one.

In GDMC, transitions can also be directed to the same, or to a lower, level in the automaton. For example, when coding a symbol in one of the states of Figure 10, a transition to another existing state of the same level can be created; if the state is not immediately cloned, that transition will remain. This is possible due to the lazy cloning criterion. Transitions can also be directed to the states themselves. This may happen when the same symbol is repeated in the source data. Escape transitions in GDMC can be directed to any lower level. For example, if a state is cloned after traversing a transition that leads one level down (the new state being one level above the source state), the resulting escape transition from the new state to the old destination decreases the context by two. These kinds of arbitrary jumps between different levels are possible because the cloning criterion does not take the contexts of the states into account but depends on the frequencies only.

Partial contexts: The context property defines the minimum order of the context and implies that there might be a higher order partial context in a state. Consider a state C with an N-order context. Denote all possible incoming (N+1)-length paths to C by xC, and the set of all possible symbols x by X. The state C is defined to include a partial context of order N+1 if the set X is a true subset of the full symbol set (X ⊂ Σ). In other words, there exists at least one symbol in the alphabet that cannot precede the context.

In the case of the full table strategy (where all states include transitions for all alphabet symbols), each longer context implies a partial context in all of its suffix contexts [12]. For example, if the state ab exists in the automaton, the partial context in b is (Σ\a)+b, because the state b will never be used for an input string ending in ab. Similarly, all other longer contexts remove certain string combinations from the context of b.

In the case of the escape strategy, the question of partial contexts becomes more complicated because of the escape transitions. For example, a state in Figure 11 (left) does not include a partial context despite a longer context with the same last symbol existing in the automaton; the shorter state is indeed not reached when the input ends with the longer context. However, the context of a state is defined merely on the basis of the history. The exclusion of certain string combinations depending on the future symbols is known as the exclusion principle. Nevertheless, it is not considered a partial context because the state can still be reached by inputs ending with any possible preceding symbol x (x ∈ Σ).

Variable order context models using the escape strategy are not so strict about what the exact context of a state is. When a longer context adapts to more and more different transitions, the effect of the corresponding string is reduced in the shorter context. The shorter context is therefore gradually transferring towards a partial context, but it may never reach that status.

Despite the above discussion, CMA automata can still include partial contexts in the strict sense. Figure 11 illustrates how this kind of context is created by the cloning operation of GDMC. The new state includes a partial context because it cannot be reached when coding a particular input string: an already existing state and a shortcut transition capture that string instead. This originates from the lazy cloning criterion of GDMC, which allows states to be cloned after they have already been bypassed by shortcuts. The direct consequence is that the longest possible context is not always utilized. In the PPM model, this kind of partial context cannot appear because of the greedy cloning criterion.

Figure 11: Example of a partial context in GDMC.

Duplicate contexts: Consider the automaton of Figure 11 (left). Assume that a transition from the NULL state was traversed and the destination state is then cloned. This results in two states with identical contexts, see Figure 12. The original state will be reached using its other existing incoming transitions, and the new one using the redirected transition from the NULL state. Due to the suffix property, the cloning operation creates a new context whose order is one higher than that of the source state of the transition. Duplicate contexts are therefore generated always when the traversed transition already increases the context by one: |D| = |S| + 1. The main effect of the duplicates is that the new state will include a partial context. For example, the new context in Figure 12 includes a partial context. Duplicates may also be generated for |D| < |S| + 1 if the source state S is already a duplicate context. In this case, there exist two branches in the automaton with similar contexts.

It is unclear what the advantage/disadvantage of the duplicate contexts is. Presumably they slow down the learning process because the same set of transitions must first be learned in the original context, and only then can they be learned in the new (duplicate) context. If so desired, the existence of duplicates could easily be avoided by allowing cloning only when |D| < |S| + 1. Note that duplicate contexts cannot appear in the PPM model due to the greedy cloning criterion.
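The guard mentioned above is a one-line test on the context orders. The sketch below (our illustration, with contexts stored as plain strings as in the earlier sketches) shows the case in which cloning after a traversed transition would create a duplicate context.

def cloning_would_duplicate(source_context, destination_context):
    # Cloning creates a state whose context is source_context + coded symbol,
    # i.e. of order |S| + 1; a duplicate arises exactly when |D| = |S| + 1.
    return len(destination_context) == len(source_context) + 1

print(cloning_would_duplicate("ab", "abc"))   # True: the clone would repeat the context "abc"
print(cloning_would_duplicate("ab", "bc"))    # False: the clone introduces the new context "abc"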

Figure 12: Example of duplicate contexts.

3.4. Probability estimation

Both PPM and GDMC use a frequency-based probability model. The probability for a transition is calculated by dividing its frequency count by the sum of all frequency counts of the outgoing transitions. The only exception is the escape transition. In GDMC, escape transitions are treated like any other transitions and they thus have their own frequency counters. In PPM, the probability for an escape transition is calculated by a predefined function known as escape method C [14]:

   p(esc) = q / (q + c)   (4)

Here q is the number of outgoing transitions in the state, and c is the sum of their frequency counts, i.e., the total number of times the state has been visited. It is noted that q equals the number of times the escape transition has been used. Both methods thus have the same intuition, but the actual probability values may differ because of different initialisation and update strategies.

In PPM, the statistics for a new context start from scratch and the frequency of a new transition is always set to 1. In GDMC, the frequency of a new transition (referring to Figure 5) is estimated by:

   f(S → D) = f(S → C) · f(C → D) / Effsum(C \ S)   (5)

where Effsum(C \ S) is the sum of all transition counts in C excluding the ones that also exist in S. This corresponds to the exclusion principle (see below). Due to this initialisation the frequencies are stored as real numbers instead of in the integer domain. See [11] for the details of this probability modelling method.

Both PPM and GDMC apply the so-called exclusion principle when calculating the effective probability for a symbol. If an escape transition was just traversed, it is known that the current symbol cannot be any of the symbols predicted by the previous state. All these symbols can therefore be excluded from the probability calculations. The exclusion principle always improves the compression because otherwise a proportion of the probability space would be wasted on impossible coding events.
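As an illustration of escape method C combined with the exclusion principle (our sketch, not the coder used in the experiments), the coding distribution of a single state can be computed as follows:

def distribution_method_c(freq, excluded=frozenset()):
    # freq: symbol -> count for the current state.
    # excluded: symbols already predicted by the state we escaped from
    # (exclusion principle): they are removed from the calculation.
    counts = {s: f for s, f in freq.items() if s not in excluded}
    q = len(counts)                       # number of distinct outgoing transitions
    c = sum(counts.values())              # sum of their frequency counts
    if c == 0:
        return {}, 1.0                    # nothing predictable: escape with probability 1
    p_escape = q / (q + c)                # escape method C, equation (4)
    probs = {s: f / (q + c) for s, f in counts.items()}
    return probs, p_escape

probs, p_esc = distribution_method_c({"a": 3, "b": 1, "c": 1}, excluded={"c"})
print(probs, p_esc)                       # a: 3/6, b: 1/6, escape: 2/6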

After the symbol has been coded, its frequency count is incremented by one. The update is restricted to the predictive state only; the counters in the lower order states are not altered. This principle, known as update exclusion, is commonly used in PPM and also in GDMC. An alternative strategy would be to follow the escape path to the initial state and update the frequencies in all states along the path. For example, if the symbol was predicted by a higher order context, the update would then also consider the lower order suffix contexts along the escape path.

4. Empirical results

The files of the Calgary corpus [1] were compressed using the GDMC and PPM models with the parameter setup of Table 1. We want to find out what kind of contexts (and how long) the models create, how fast they adapt to the source, and what kind of transitions exist in the automata. Additional tests were performed by varying the main parameters of the models. The aim is to find out how the parameter setup affects the performance of the compression, and how well the context model can be controlled by the parameters. The throughput of the CMA-based implementation does not significantly differ from the previous implementations of PPM and GDMC.

In the literature, the maximum context length of PPM is often set to 3, but we use Nmax = 4 because it gives a lower bit rate for all files in the corpus except the file geo. The average bit rates for the various Nmax values (1 to 7) are (3.62, 2.85, 2.46, 2.34, 2.32, 2.33, 2.34). In addition, we also tested GDMC with Nmax = 4 (referred to as GDMC-4). The results are summarised in Table 2. The average bit rate of GDMC is 2.52, which is 7 % higher than that of PPM on average. Slightly better results were obtained only for the three longest files (book1, book2 and pic). The benefit of the unlimited context length remains marginal: 2 % on average, and the maximum improvement is 6 % for the file pic.

PPM learns all contexts, up to the maximum length, that there are to be learned. GDMC is more picky in cloning and learns only a part of the same contexts. It also learns a large number of longer contexts. The total number of contexts in GDMC is typically about three times that in PPM, but only about 60 % of the contexts are unique; the rest are duplicates.

Table 1: Summary of the context and probability model parameters.

                                           PPM                     GDMC
   Context      Context length limit:      Nmax = 4                No upper limit
   model        Cloning criterion:         Clone always            Depends on statistics
                Multiple cloning:          Yes                     No
   Statistical  Probability estimation:    Frequency based         Frequency based
   model        Initial statistics:        Starting from scratch   Derived from successor
                Escape probability:        PPM method C            Own frequency counter for esc

Table 2: Performance comparison (bit rates) of the various methods. (Columns: File, Size, PPM-4, GDMC, GDMC-4; rows: bib, book1, book2, geo, news, obj1, obj2, paper1, paper2, pic, progc, progl, progp, trans, and Average.)

In PPM, the input symbols were predicted by a context of length 3.27 on average. In GDMC and GDMC-4, the corresponding numbers were 4.38 and 2.94 (excluding obj1 and pic). In pic, the average length of the predicting context was as large as 1823 due to the high repetition of zero bytes (large homogenous white areas in the binary image). We also observed that GDMC almost always uses the longest available context in the automaton. This indicates that, even if there are partial contexts in the automaton (besides the duplicates), they are rarely used.

It was shown in Section 3.3 that duplicates in GDMC can be avoided by modifying the cloning criterion accordingly. Our experiments show that this modification does not have any significant effect on the compression performance. A small improvement (0-3 %) was achieved in the case of 9 files out of 14, but the bit rate increased in the case of geo (4 %) and pic (16 %). This is most likely explained by the nature of their data, which is not taken into account in the compression. The file pic, for example, is a binary image, but the context models process the data one byte at a time. GDMC utilizes the bitwise dependencies across the byte boundaries somewhat better since it is capable of modelling partial contexts (involved also in the duplicate contexts).

Figure 13 illustrates the source of the code bits. In PPM, 55 % of the code bits originate from the highest level (N = 4). The code bits in GDMC originate from a wider variety of levels, but overall, the highest levels (N ≥ 4) contribute a smaller proportion (38 %) than the highest level in PPM. Even though there are a great number of longer contexts in GDMC, they are only suffix extensions of those appearing also at the levels N ≤ 4. GDMC thus relies more on the lower level contexts than PPM.

Figure 13 also demonstrates the growth of the models. The lower level contexts are used more often in the early stage of compression. As the compression proceeds and the models adapt to the data, the higher levels become more commonly used. The overall bit rate decreases over the course of time. PPM expands the contexts faster than GDMC, which shows as better compression performance in the early stages, see Figure 14.

Figure 13: Source level of the code bits as a function of time (all files).

Figure 14: Bit rates of GDMC relative to PPM as a function of time (average values).

In GDMC, 32 % of the code bits originate from escape symbols, whereas the corresponding number in PPM is 14 %. By comparing the transition statistics, it was shown that GDMC learns fewer transitions than PPM. The numbers of transitions (excluding escape transitions) per context are 2.00 (PPM) and 1.04 (GDMC) on average. This is partly explained by the longer contexts of GDMC. They can predict certain symbols more accurately than the lower level contexts, but the total number of transitions (predicted symbols) remains smaller. Escape transitions are therefore used more often. The destination level of the transitions, however, did not significantly differ between the models. Only very few transitions lead to the lower levels in GDMC, see Table 3.

Table 3: Destination level of transitions with respect to the source level.

              +1      same     -1      lower
   PPM        34 %    66 %     0.0 %   0.0 %
   GDMC       95 %     4 %     0.7 %   0.0 %
   GDMC-4     42 %    56 %     2.4 %   0.0 %

PPM and GDMC also have their differences in the probability model. For example, they use different initial automata (see Figures 3 and 9). The initial automaton of PPM gives a 1 % lower bit rate on average. A more significant difference is the initialisation of the frequency counter when a new transition is created. In PPM, the frequencies start from scratch, whereas GDMC uses a more sophisticated frequency estimation technique (see Section 3.4). According to our experiments, the results of GDMC are about 10 % worse if the PPM-like probability model is applied. This implies that the model might be utilized also in PPM, but unfortunately this is not the case. The GDMC probability model has been designed and optimized merely within the GDMC context model, and we were not able to adopt the method in PPM.

5. Conclusions

A simple formalism was introduced to describe context model automata (CMA). It provides a strong tool to simulate existing context modelling methods such as PPM and GDMC. The behaviour of the above two methods was studied both analytically and experimentally from the CMA point of view. The main advantages of the PPM and GDMC models can be summarised as follows:

PPM:
- fast adaptation to the source
- better initial model
- accuracy of statistics

GDMC:
- no upper limit for the context size
- partial contexts
- sophisticated probability model

The main advantage of PPM over GDMC is its faster adaptation to the source data. This originates from the following properties: (1) greedy cloning criterion, (2) multiple cloning operations in a single coding step, (3) coding is continued directly from the new state after cloning. PPM can learn up to Nmax new contexts per coded symbol, whereas GDMC can learn only one. The PPM model expands quicker in the early part of the compression, but the growth becomes smoother over the course of time. The GDMC model, on the other hand, keeps on growing at the same rate throughout the compression because there is no upper limit for the context length.

The initial model of PPM is slightly better (1 %) than that of GDMC because the zero-order context has no transitions for unused symbols as a burden. Furthermore, the statistics of the PPM contexts are more accurate than those of GDMC. The highest level contexts in PPM are created immediately when they first appear in the source data. The statistics thus concentrate on the relevant contexts only. In GDMC it takes longer to build the same contexts. There are also a great number of higher level and duplicate contexts in the GDMC automaton. The learning is therefore inefficiently distributed over a wider number of contexts, resulting in less accurate statistics.

The unbounded context length of GDMC is both an advantage and a disadvantage. It allows more flexibility in the context length, but it also results in unnecessarily long contexts because the context expansion is not properly controlled. This defeats the learning of the statistics, and a better cloning criterion should therefore be designed. The other benefits of GDMC are also questionable. The potential of the partial/duplicate contexts is their ability to model more efficiently contexts beyond the byte boundary (such as the binary data of the file pic). Unfortunately the extra contexts break up the statistics into too many contexts, resulting in less accurate prediction in the case of normal data files. There is also no direct control of the conditions under which the duplicates are created, and they are created more or less in a random fashion. Moreover, we were not able to transfer the sophisticated estimation of transition frequencies into the PPM model. Its main benefit is most likely to compensate for the slower learning efficiency of GDMC, i.e., the method tries to predict the frequency that the PPM model would already have learned due to its faster adaptation.

To sum up, CMA provides a formalism to describe context model automata. It gives useful information about the structure of the automaton.

Potential weaknesses, such as duplicates and arbitrarily long contexts, can therefore be easily controlled if so desired. CMA, on the other hand, models only the structure of the automaton without giving any direct information about the compression performance. It is therefore important to consider also the statistical model when designing a CMA-based compression method. In particular, information theoretic state selection and the mixing of the probability estimates from different states have been shown to improve the compression performance of a PPM model by 5-12 % [15].

Acknowledgements

The work of Pasi Fränti was supported by a grant from the Academy of Finland, and the work of Timo Hatakka by a grant from the Finnish Cultural Foundation.

References

1. Bell, T.C., Cleary, J.G. and Witten, I.H. (1990) Text Compression. Prentice Hall, Englewood Cliffs, NJ.
2. Williams, R.N. (1991) Adaptive Data Compression. Kluwer Academic Publishers, Norwell, Massachusetts.
3. Ziv, J. and Lempel, A. (1977) A universal algorithm for sequential data compression. IEEE Transactions on Information Theory, 23.
4. Ziv, J. and Lempel, A. (1978) Compression of individual sequences via variable-rate coding. IEEE Transactions on Information Theory, 24.
5. Huffman, D. (1952) A method for the construction of minimum redundancy codes. Proc. of the IRE, 40.
6. Witten, I.H., Neal, R.M. and Cleary, J.G. (1987) Arithmetic coding for data compression. Communications of the ACM, 30.
7. Cleary, J.G. and Witten, I.H. (1984) Data compression using adaptive coding and partial string matching. IEEE Transactions on Communications, 32.
8. Langdon, G.G. (1983) A note on the Ziv-Lempel model for compressing individual sequences. IEEE Transactions on Information Theory, 29.
9. Cormack, G.V. and Horspool, R.N.S. (1987) Data compression using dynamic Markov modelling. The Computer Journal, 30.
10. Bell, T.C. and Moffat, A. (1989) A note on the DMC data compression scheme. The Computer Journal, 32.
11. Teuhola, J. and Raita, T. (1993) Application of a finite-state model to text compression. The Computer Journal, 36.
12. Bunton, S. (1995) The structure of DMC. Proceedings Data Compression Conference, Snowbird, Utah.
13. Cleary, J.G. and Teahan, W.J. (1997) Unbounded length contexts for PPM. The Computer Journal, 40.
14. Moffat, A. (1990) Implementing the PPM data compression scheme. IEEE Transactions on Communications, 38.
15. Bunton, S. (1997) Semantically motivated improvements for PPM variants. The Computer Journal, 40.


More information

Engr354: Digital Logic Circuits

Engr354: Digital Logic Circuits Engr354: Digitl Logi Ciruits Chpter 4: Logi Optimiztion Curtis Nelson Logi Optimiztion In hpter 4 you will lern out: Synthesis of logi funtions; Anlysis of logi iruits; Tehniques for deriving minimum-ost

More information

(a) A partition P of [a, b] is a finite subset of [a, b] containing a and b. If Q is another partition and P Q, then Q is a refinement of P.

(a) A partition P of [a, b] is a finite subset of [a, b] containing a and b. If Q is another partition and P Q, then Q is a refinement of P. Chpter 7: The Riemnn Integrl When the derivtive is introdued, it is not hrd to see tht the it of the differene quotient should be equl to the slope of the tngent line, or when the horizontl xis is time

More information

Computational Biology Lecture 18: Genome rearrangements, finding maximal matches Saad Mneimneh

Computational Biology Lecture 18: Genome rearrangements, finding maximal matches Saad Mneimneh Computtionl Biology Leture 8: Genome rerrngements, finding miml mthes Sd Mneimneh We hve seen how to rerrnge genome to otin nother one sed on reversls nd the knowledge of the preserved loks or genes. Now

More information

Learning Partially Observable Markov Models from First Passage Times

Learning Partially Observable Markov Models from First Passage Times Lerning Prtilly Oservle Mrkov s from First Pssge s Jérôme Cllut nd Pierre Dupont Europen Conferene on Mhine Lerning (ECML) 8 Septemer 7 Outline. FPT in models nd sequenes. Prtilly Oservle Mrkov s (POMMs).

More information

Symmetrical Components 1

Symmetrical Components 1 Symmetril Components. Introdution These notes should e red together with Setion. of your text. When performing stedy-stte nlysis of high voltge trnsmission systems, we mke use of the per-phse equivlent

More information

Linear Algebra Introduction

Linear Algebra Introduction Introdution Wht is Liner Alger out? Liner Alger is rnh of mthemtis whih emerged yers k nd ws one of the pioneer rnhes of mthemtis Though, initilly it strted with solving of the simple liner eqution x +

More information

CS 2204 DIGITAL LOGIC & STATE MACHINE DESIGN SPRING 2014

CS 2204 DIGITAL LOGIC & STATE MACHINE DESIGN SPRING 2014 S 224 DIGITAL LOGI & STATE MAHINE DESIGN SPRING 214 DUE : Mrh 27, 214 HOMEWORK III READ : Relte portions of hpters VII n VIII ASSIGNMENT : There re three questions. Solve ll homework n exm prolems s shown

More information

Chapter 2 Finite Automata

Chapter 2 Finite Automata Chpter 2 Finite Automt 28 2.1 Introduction Finite utomt: first model of the notion of effective procedure. (They lso hve mny other pplictions). The concept of finite utomton cn e derived y exmining wht

More information

INTEGRATION. 1 Integrals of Complex Valued functions of a REAL variable

INTEGRATION. 1 Integrals of Complex Valued functions of a REAL variable INTEGRATION NOTE: These notes re supposed to supplement Chpter 4 of the online textbook. 1 Integrls of Complex Vlued funtions of REAL vrible If I is n intervl in R (for exmple I = [, b] or I = (, b)) nd

More information

Prefix-Free Regular-Expression Matching

Prefix-Free Regular-Expression Matching Prefix-Free Regulr-Expression Mthing Yo-Su Hn, Yjun Wng nd Derik Wood Deprtment of Computer Siene HKUST Prefix-Free Regulr-Expression Mthing p.1/15 Pttern Mthing Given pttern P nd text T, find ll sustrings

More information

Unit 4. Combinational Circuits

Unit 4. Combinational Circuits Unit 4. Comintionl Ciruits Digitl Eletroni Ciruits (Ciruitos Eletrónios Digitles) E.T.S.I. Informáti Universidd de Sevill 5/10/2012 Jorge Jun 2010, 2011, 2012 You re free to opy, distriute

More information

University of Sioux Falls. MAT204/205 Calculus I/II

University of Sioux Falls. MAT204/205 Calculus I/II University of Sioux Flls MAT204/205 Clulus I/II Conepts ddressed: Clulus Textook: Thoms Clulus, 11 th ed., Weir, Hss, Giordno 1. Use stndrd differentition nd integrtion tehniques. Differentition tehniques

More information

CMSC 330: Organization of Programming Languages

CMSC 330: Organization of Programming Languages CMSC 330: Orgniztion of Progrmming Lnguges Finite Automt 2 CMSC 330 1 Types of Finite Automt Deterministic Finite Automt (DFA) Exctly one sequence of steps for ech string All exmples so fr Nondeterministic

More information

System Validation (IN4387) November 2, 2012, 14:00-17:00

System Validation (IN4387) November 2, 2012, 14:00-17:00 System Vlidtion (IN4387) Novemer 2, 2012, 14:00-17:00 Importnt Notes. The exmintion omprises 5 question in 4 pges. Give omplete explntion nd do not onfine yourself to giving the finl nswer. Good luk! Exerise

More information

Activities. 4.1 Pythagoras' Theorem 4.2 Spirals 4.3 Clinometers 4.4 Radar 4.5 Posting Parcels 4.6 Interlocking Pipes 4.7 Sine Rule Notes and Solutions

Activities. 4.1 Pythagoras' Theorem 4.2 Spirals 4.3 Clinometers 4.4 Radar 4.5 Posting Parcels 4.6 Interlocking Pipes 4.7 Sine Rule Notes and Solutions MEP: Demonstrtion Projet UNIT 4: Trigonometry UNIT 4 Trigonometry tivities tivities 4. Pythgors' Theorem 4.2 Spirls 4.3 linometers 4.4 Rdr 4.5 Posting Prels 4.6 Interloking Pipes 4.7 Sine Rule Notes nd

More information

Lecture 08: Feb. 08, 2019

Lecture 08: Feb. 08, 2019 4CS4-6:Theory of Computtion(Closure on Reg. Lngs., regex to NDFA, DFA to regex) Prof. K.R. Chowdhry Lecture 08: Fe. 08, 2019 : Professor of CS Disclimer: These notes hve not een sujected to the usul scrutiny

More information

Bases for Vector Spaces

Bases for Vector Spaces Bses for Vector Spces 2-26-25 A set is independent if, roughly speking, there is no redundncy in the set: You cn t uild ny vector in the set s liner comintion of the others A set spns if you cn uild everything

More information

NFA DFA Example 3 CMSC 330: Organization of Programming Languages. Equivalence of DFAs and NFAs. Equivalence of DFAs and NFAs (cont.

NFA DFA Example 3 CMSC 330: Organization of Programming Languages. Equivalence of DFAs and NFAs. Equivalence of DFAs and NFAs (cont. NFA DFA Exmple 3 CMSC 330: Orgniztion of Progrmming Lnguges NFA {B,D,E {A,E {C,D {E Finite Automt, con't. R = { {A,E, {B,D,E, {C,D, {E 2 Equivlence of DFAs nd NFAs Any string from {A to either {D or {CD

More information

Table of Content. c 1 / 5

Table of Content. c 1 / 5 Tehnil Informtion - t nd t Temperture for Controlger 03-2018 en Tble of Content Introdution....................................................................... 2 Definitions for t nd t..............................................................

More information

@#? Text Search ] { "!" Nondeterministic Finite Automata. Transformation NFA to DFA and Simulation of NFA. Text Search Using Automata

@#? Text Search ] { ! Nondeterministic Finite Automata. Transformation NFA to DFA and Simulation of NFA. Text Search Using Automata g Text Serh @#? ~ Mrko Berezovský Rdek Mřík PAL 0 Nondeterministi Finite Automt n Trnsformtion NFA to DFA nd Simultion of NFA f Text Serh Using Automt A B R Power of Nondeterministi Approh u j Regulr Expression

More information

Lesson 2: The Pythagorean Theorem and Similar Triangles. A Brief Review of the Pythagorean Theorem.

Lesson 2: The Pythagorean Theorem and Similar Triangles. A Brief Review of the Pythagorean Theorem. 27 Lesson 2: The Pythgoren Theorem nd Similr Tringles A Brief Review of the Pythgoren Theorem. Rell tht n ngle whih mesures 90º is lled right ngle. If one of the ngles of tringle is right ngle, then we

More information

Alpha Algorithm: Limitations

Alpha Algorithm: Limitations Proess Mining: Dt Siene in Ation Alph Algorithm: Limittions prof.dr.ir. Wil vn der Alst www.proessmining.org Let L e n event log over T. α(l) is defined s follows. 1. T L = { t T σ L t σ}, 2. T I = { t

More information

Types of Finite Automata. CMSC 330: Organization of Programming Languages. Comparing DFAs and NFAs. NFA for (a b)*abb.

Types of Finite Automata. CMSC 330: Organization of Programming Languages. Comparing DFAs and NFAs. NFA for (a b)*abb. CMSC 330: Orgniztion of Progrmming Lnguges Finite Automt 2 Types of Finite Automt Deterministic Finite Automt () Exctly one sequence of steps for ech string All exmples so fr Nondeterministic Finite Automt

More information

Convert the NFA into DFA

Convert the NFA into DFA Convert the NF into F For ech NF we cn find F ccepting the sme lnguge. The numer of sttes of the F could e exponentil in the numer of sttes of the NF, ut in prctice this worst cse occurs rrely. lgorithm:

More information

Generalization of 2-Corner Frequency Source Models Used in SMSIM

Generalization of 2-Corner Frequency Source Models Used in SMSIM Generliztion o 2-Corner Frequeny Soure Models Used in SMSIM Dvid M. Boore 26 Mrh 213, orreted Figure 1 nd 2 legends on 5 April 213, dditionl smll orretions on 29 My 213 Mny o the soure spetr models ville

More information

Types of Finite Automata. CMSC 330: Organization of Programming Languages. Comparing DFAs and NFAs. Comparing DFAs and NFAs (cont.) Finite Automata 2

Types of Finite Automata. CMSC 330: Organization of Programming Languages. Comparing DFAs and NFAs. Comparing DFAs and NFAs (cont.) Finite Automata 2 CMSC 330: Orgniztion of Progrmming Lnguges Finite Automt 2 Types of Finite Automt Deterministic Finite Automt () Exctly one sequence of steps for ech string All exmples so fr Nondeterministic Finite Automt

More information

Behavior Composition in the Presence of Failure

Behavior Composition in the Presence of Failure Behvior Composition in the Presene of Filure Sestin Srdin RMIT University, Melourne, Austrli Fio Ptrizi & Giuseppe De Giomo Spienz Univ. Rom, Itly KR 08, Sept. 2008, Sydney Austrli Introdution There re

More information

Minimal DFA. minimal DFA for L starting from any other

Minimal DFA. minimal DFA for L starting from any other Miniml DFA Among the mny DFAs ccepting the sme regulr lnguge L, there is exctly one (up to renming of sttes) which hs the smllest possile numer of sttes. Moreover, it is possile to otin tht miniml DFA

More information

Regular expressions, Finite Automata, transition graphs are all the same!!

Regular expressions, Finite Automata, transition graphs are all the same!! CSI 3104 /Winter 2011: Introduction to Forml Lnguges Chpter 7: Kleene s Theorem Chpter 7: Kleene s Theorem Regulr expressions, Finite Automt, trnsition grphs re ll the sme!! Dr. Neji Zgui CSI3104-W11 1

More information

Section 1.3 Triangles

Section 1.3 Triangles Se 1.3 Tringles 21 Setion 1.3 Tringles LELING TRINGLE The line segments tht form tringle re lled the sides of the tringle. Eh pir of sides forms n ngle, lled n interior ngle, nd eh tringle hs three interior

More information

I1 = I2 I1 = I2 + I3 I1 + I2 = I3 + I4 I 3

I1 = I2 I1 = I2 + I3 I1 + I2 = I3 + I4 I 3 2 The Prllel Circuit Electric Circuits: Figure 2- elow show ttery nd multiple resistors rrnged in prllel. Ech resistor receives portion of the current from the ttery sed on its resistnce. The split is

More information

TOPIC: LINEAR ALGEBRA MATRICES

TOPIC: LINEAR ALGEBRA MATRICES Interntionl Blurete LECTUE NOTES for FUTHE MATHEMATICS Dr TOPIC: LINEA ALGEBA MATICES. DEFINITION OF A MATIX MATIX OPEATIONS.. THE DETEMINANT deta THE INVESE A -... SYSTEMS OF LINEA EQUATIONS. 8. THE AUGMENTED

More information

Test Generation from Timed Input Output Automata

Test Generation from Timed Input Output Automata Chpter 8 Test Genertion from Timed Input Output Automt The purpose of this hpter is to introdue tehniques for the genertion of test dt from models of softwre sed on vrints of timed utomt. The tests generted

More information

Homework 3 Solutions

Homework 3 Solutions CS 341: Foundtions of Computer Science II Prof. Mrvin Nkym Homework 3 Solutions 1. Give NFAs with the specified numer of sttes recognizing ech of the following lnguges. In ll cses, the lphet is Σ = {,1}.

More information

Algorithm Design and Analysis

Algorithm Design and Analysis Algorithm Design nd Anlysis LECTURE 5 Supplement Greedy Algorithms Cont d Minimizing lteness Ching (NOT overed in leture) Adm Smith 9/8/10 A. Smith; sed on slides y E. Demine, C. Leiserson, S. Rskhodnikov,

More information

Converting Regular Expressions to Discrete Finite Automata: A Tutorial

Converting Regular Expressions to Discrete Finite Automata: A Tutorial Converting Regulr Expressions to Discrete Finite Automt: A Tutoril Dvid Christinsen 2013-01-03 This is tutoril on how to convert regulr expressions to nondeterministic finite utomt (NFA) nd how to convert

More information

Algorithm Design and Analysis

Algorithm Design and Analysis Algorithm Design nd Anlysis LECTURE 8 Mx. lteness ont d Optiml Ching Adm Smith 9/12/2008 A. Smith; sed on slides y E. Demine, C. Leiserson, S. Rskhodnikov, K. Wyne Sheduling to Minimizing Lteness Minimizing

More information

A Study on the Properties of Rational Triangles

A Study on the Properties of Rational Triangles Interntionl Journl of Mthemtis Reserh. ISSN 0976-5840 Volume 6, Numer (04), pp. 8-9 Interntionl Reserh Pulition House http://www.irphouse.om Study on the Properties of Rtionl Tringles M. Q. lm, M.R. Hssn

More information

ILLUSTRATING THE EXTENSION OF A SPECIAL PROPERTY OF CUBIC POLYNOMIALS TO NTH DEGREE POLYNOMIALS

ILLUSTRATING THE EXTENSION OF A SPECIAL PROPERTY OF CUBIC POLYNOMIALS TO NTH DEGREE POLYNOMIALS ILLUSTRATING THE EXTENSION OF A SPECIAL PROPERTY OF CUBIC POLYNOMIALS TO NTH DEGREE POLYNOMIALS Dvid Miller West Virgini University P.O. BOX 6310 30 Armstrong Hll Morgntown, WV 6506 millerd@mth.wvu.edu

More information

CS 275 Automata and Formal Language Theory

CS 275 Automata and Formal Language Theory CS 275 utomt nd Forml Lnguge Theory Course Notes Prt II: The Recognition Prolem (II) Chpter II.5.: Properties of Context Free Grmmrs (14) nton Setzer (Bsed on ook drft y J. V. Tucker nd K. Stephenson)

More information

1B40 Practical Skills

1B40 Practical Skills B40 Prcticl Skills Comining uncertinties from severl quntities error propgtion We usully encounter situtions where the result of n experiment is given in terms of two (or more) quntities. We then need

More information

CS 373, Spring Solutions to Mock midterm 1 (Based on first midterm in CS 273, Fall 2008.)

CS 373, Spring Solutions to Mock midterm 1 (Based on first midterm in CS 273, Fall 2008.) CS 373, Spring 29. Solutions to Mock midterm (sed on first midterm in CS 273, Fll 28.) Prolem : Short nswer (8 points) The nswers to these prolems should e short nd not complicted. () If n NF M ccepts

More information

THE INFLUENCE OF MODEL RESOLUTION ON AN EXPRESSION OF THE ATMOSPHERIC BOUNDARY LAYER IN A SINGLE-COLUMN MODEL

THE INFLUENCE OF MODEL RESOLUTION ON AN EXPRESSION OF THE ATMOSPHERIC BOUNDARY LAYER IN A SINGLE-COLUMN MODEL THE INFLUENCE OF MODEL RESOLUTION ON AN EXPRESSION OF THE ATMOSPHERIC BOUNDARY LAYER IN A SINGLE-COLUMN MODEL P3.1 Kot Iwmur*, Hiroto Kitgw Jpn Meteorologil Ageny 1. INTRODUCTION Jpn Meteorologil Ageny

More information

Transition systems (motivation)

Transition systems (motivation) Trnsition systems (motivtion) Course Modelling of Conurrent Systems ( Modellierung neenläufiger Systeme ) Winter Semester 2009/0 University of Duisurg-Essen Brr König Tehing ssistnt: Christoph Blume In

More information

Thermodynamics. Question 1. Question 2. Question 3 3/10/2010. Practice Questions PV TR PV T R

Thermodynamics. Question 1. Question 2. Question 3 3/10/2010. Practice Questions PV TR PV T R /10/010 Question 1 1 mole of idel gs is rought to finl stte F y one of three proesses tht hve different initil sttes s shown in the figure. Wht is true for the temperture hnge etween initil nd finl sttes?

More information

Review Topic 14: Relationships between two numerical variables

Review Topic 14: Relationships between two numerical variables Review Topi 14: Reltionships etween two numeril vriles Multiple hoie 1. Whih of the following stterplots est demonstrtes line of est fit? A B C D E 2. The regression line eqution for the following grph

More information

Numbers and indices. 1.1 Fractions. GCSE C Example 1. Handy hint. Key point

Numbers and indices. 1.1 Fractions. GCSE C Example 1. Handy hint. Key point GCSE C Emple 7 Work out 9 Give your nswer in its simplest form Numers n inies Reiprote mens invert or turn upsie own The reiprol of is 9 9 Mke sure you only invert the frtion you re iviing y 7 You multiply

More information

AP Calculus BC Chapter 8: Integration Techniques, L Hopital s Rule and Improper Integrals

AP Calculus BC Chapter 8: Integration Techniques, L Hopital s Rule and Improper Integrals AP Clulus BC Chpter 8: Integrtion Tehniques, L Hopitl s Rule nd Improper Integrls 8. Bsi Integrtion Rules In this setion we will review vrious integrtion strtegies. Strtegies: I. Seprte the integrnd into

More information

Outline Data Structures and Algorithms. Data compression. Data compression. Lossy vs. Lossless. Data Compression

Outline Data Structures and Algorithms. Data compression. Data compression. Lossy vs. Lossless. Data Compression 5-2 Dt Strutures n Algorithms Dt Compression n Huffmn s Algorithm th Fe 2003 Rjshekr Rey Outline Dt ompression Lossy n lossless Exmples Forml view Coes Definition Fixe length vs. vrile length Huffmn s

More information

For a, b, c, d positive if a b and. ac bd. Reciprocal relations for a and b positive. If a > b then a ab > b. then

For a, b, c, d positive if a b and. ac bd. Reciprocal relations for a and b positive. If a > b then a ab > b. then Slrs-7.2-ADV-.7 Improper Definite Integrls 27.. D.dox Pge of Improper Definite Integrls Before we strt the min topi we present relevnt lger nd it review. See Appendix J for more lger review. Inequlities:

More information

Chapter 8 Roots and Radicals

Chapter 8 Roots and Radicals Chpter 8 Roots nd Rdils 7 ROOTS AND RADICALS 8 Figure 8. Grphene is n inredily strong nd flexile mteril mde from ron. It n lso ondut eletriity. Notie the hexgonl grid pttern. (redit: AlexnderAIUS / Wikimedi

More information

Intermediate Math Circles Wednesday 17 October 2012 Geometry II: Side Lengths

Intermediate Math Circles Wednesday 17 October 2012 Geometry II: Side Lengths Intermedite Mth Cirles Wednesdy 17 Otoer 01 Geometry II: Side Lengths Lst week we disussed vrious ngle properties. As we progressed through the evening, we proved mny results. This week, we will look t

More information

CSCI 340: Computational Models. Kleene s Theorem. Department of Computer Science

CSCI 340: Computational Models. Kleene s Theorem. Department of Computer Science CSCI 340: Computtionl Models Kleene s Theorem Chpter 7 Deprtment of Computer Science Unifiction In 1954, Kleene presented (nd proved) theorem which (in our version) sttes tht if lnguge cn e defined y ny

More information

Part I: Study the theorem statement.

Part I: Study the theorem statement. Nme 1 Nme 2 Nme 3 A STUDY OF PYTHAGORAS THEOREM Instrutions: Together in groups of 2 or 3, fill out the following worksheet. You my lift nswers from the reding, or nswer on your own. Turn in one pket for

More information