Content-assisted File Decoding for Nonvolatile Memories

Yue Li, Yue Wang, Anxiao (Andrew) Jiang
Department of Computer Science and Engineering
Texas A&M University, College Station, TX 77843
{yli, yuewang,

Jehoshua Bruck
Department of Electrical Engineering
California Institute of Technology, Pasadena, CA 91125

This work was supported in part by the NSF CAREER Award CCF and the NSF grant CCF. The authors thank the Texas A&M University Brazos HPC cluster for providing computing resources to support the research reported here.

Abstract—Nonvolatile memories (NVMs) such as flash memories play a significant role in meeting the data storage requirements of today's computation activities. The rapid increase of storage density for NVMs, however, brings reliability issues due to the closer alignment of adjacent cells on chip and the larger number of levels that are programmed into a cell. We propose a new method for error correction which uses the random access capability of NVMs and the redundancy that inherently exists in information content. Although it is theoretically possible to remove this redundancy via data compression, existing source coding algorithms do not remove all of it, for reasons of efficient computation. We propose a method that can be combined with existing storage solutions for text files, namely content-assisted decoding. Using the statistical properties of words and phrases in the text of a given language, our decoder identifies the location of each subcodeword representing some word in a given input noisy codeword, and flips bits to compute a most likely word sequence. The decoder can be adapted to work together with traditional ECC decoders to keep the number of errors within the correction capability of traditional decoders. The combined decoding framework is evaluated with a set of benchmark files.

I. INTRODUCTION

Nonvolatile memories (NVMs), such as flash memories, have excellent speed and storage capacity. They have emerged as a crucial technology for storage systems. However, accompanying the improvement in data density, the reliability issues of NVMs are attracting more and more attention [1]. In this paper, we propose a new method for error correction named content-assisted decoding. Our method uses the fast random access capability of NVMs and the redundancy that inherently exists in information content. Although it is theoretically possible to remove this redundancy via data compression, existing source coding algorithms do not remove all of it, for reasons of efficient computation. Our method can be combined with existing storage solutions for text files. With dictionaries storing the statistical properties of words and phrases of the same language, our decoder first breaks the input noisy codeword into subcodewords, with each subcodeword corresponding to a set of possible words. The decoder then flips the bits in each noisy subcodeword to select a most likely word sequence as the correction. Consider the example in Figure 1.

[Fig. 1: An example of correcting errors in the codeword of a text. The figure lists the codeword and the corresponding text at each stage: Huffman encoding ("I am"), ECC encoding ("I am"), noise received ("IIaa"), ECC decoding failure ("IIaa"), content-assisted decoding ("I am"), ECC decoding success ("I am"). The exact bit values were not recovered.]

The English text "I am" is stored using a Huffman code that maps each of the four symbols I, ␣ (a space), a and m to a distinct two-bit codeword. The information bits are encoded with a (12, 8)-shortened Hamming code which corrects single bit errors (the bold bits in Figure 1 denote the parity check bits). Assume that three errors are received by the codeword. The number of errors exceeds the code's correction capability, and ECC decoding fails.
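To make Figure 1's content-assisted step concrete, here is a minimal Python sketch of flipping a noisy subcodeword to the nearest dictionary word. The two-bit codebook is a hypothetical stand-in (the figure's actual bit assignments were not recovered), so the specific bits below are assumptions:

    # Hypothetical two-bit codebook in the spirit of Fig. 1; the bits are assumptions.
    CODEBOOK = {'I': (0, 0), ' ': (0, 1), 'a': (1, 0), 'm': (1, 1)}
    D_W = ['I', 'am']  # the word dictionary

    def encode(text):
        return [b for ch in text for b in CODEBOOK[ch]]

    def hamming(x, y):
        return sum(a != b for a, b in zip(x, y))

    def nearest_word(noisy):
        """Flip a noisy subcodeword to the closest same-length word codeword in D_W."""
        candidates = [w for w in D_W if len(encode(w)) == len(noisy)]
        return min(candidates, key=lambda w: hamming(encode(w), noisy))

    noisy_am = [1, 0, 1, 0]  # one bit-flip away from encode('am')
    assert nearest_word(noisy_am) == 'am'

Correcting each word subcodeword this way is what brings the error count back within the Hamming code's single-error capability in the figure.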
Our decoder takes in the noisy codeword, and corrects the errors in the information symbols by looking up a dictionary which contains the two words {I, am}. This brings the number of errors down to one. Therefore, the second trial of ECC decoding succeeds, and all the errors are corrected. Our approach is suitable for natural languages, and can potentially be extended to other types of data where the redundancy in information content is not fully removed by data compression. The scheme takes advantage of the fast random access speed provided by flash memories for fast dictionary look-up and content verification. For performance evaluation, we have tested a decoding framework that combines a soft-decision decoder of low-density parity-check (LDPC) codes with our scheme on a set of text file benchmarks. Experimental results show that our decoder indeed increases the correction capability of the LDPC decoder.

The rest of the paper is organized as follows. Section II presents the preliminaries and defines the text file decoding problem. Section III specifies the algorithms of the content-assisted file decoder. Section IV discusses implementation details and experimental results.

II. THE MODELS OF FILE DECODING

We first define a few notations used throughout this paper. Let $x$ denote a binary codeword $(x_1, x_2, \dots, x_n) \in \{0, 1\}^n$, and we use $x[i : j]$ to represent the subcodeword $(x_i, x_{i+1}, \dots, x_j)$. Let the function $\text{length}(x)$ compute the length of a codeword $x$, and we use $d_H(x, x')$ for computing the Hamming distance between two codewords of the same length.
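The sketches in this transcript follow the paper's 1-indexed, inclusive subcodeword indexing; a small Python rendering of these two conventions (an assumption about how the notation translates, not part of the paper):

    def d_H(x, y):
        """Hamming distance between two equal-length binary codewords."""
        assert len(x) == len(y)
        return sum(a != b for a, b in zip(x, y))

    def sub(x, i, j):
        """The subcodeword x[i:j] in the paper's 1-indexed, inclusive convention."""
        return x[i - 1:j]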

Let $A$ be an alphabet set, and let $s \in A$ be a symbol. We denote a space by ␣ $\in A$. A word $w \triangleq (s_1, \dots, s_n)$ of length $n$ is a finite sequence of symbols without any space. A phrase $p \triangleq (w_1, ␣, w_2)$ is defined as a combination of two words separated by a space. Define a text $t \triangleq (w_1, ␣, w_2, ␣, \dots, ␣, w_n)$ as a sequence of words separated by ␣. A word dictionary $D_w \triangleq \{[w_1 : p_1], [w_2 : p_2], \dots\}$ is a finite set of records, where a record $[w : p]$ has a key $w$ and a value $p > 0$. The value $p$ is an average probability that the word $w$ occurs in a text. Similarly, a phrase dictionary $D_p$ stores the probabilities that a set of phrases appear in any given text. The dictionary look-up operations, denoted by $D_w[w]$ and $D_p[p]$, return the probabilities of words and phrases, respectively. We use the notation $w \in D_w$ (or $p \in D_p$) to indicate that there is a record in $D_w$ (or $D_p$) with key $w$ (or $p$).

Let $\pi_s$ be a bijective mapping between a symbol and a binary codeword, and let $x_s = \pi_s(␣)$. In this paper, the mapping $\pi_s$ is used during data compression before ECC encoding, and it encodes each symbol separately. In the example of Section I, $\pi_s$ refers to the Huffman codebook. The bijective mapping between a word $w = (s_1, \dots, s_n)$ and its binary codeword is defined as $\pi_w(w) \triangleq (\pi_s(s_1), \dots, \pi_s(s_n))$, and the bijective mapping from a text to its binary representation is defined as $\pi_t(t) \triangleq (\pi_w(w_1), x_s, \dots, x_s, \pi_w(w_n))$. We use $\pi_s^{-1}$, $\pi_w^{-1}$ and $\pi_t^{-1}$ to denote the corresponding inverse mappings.

The model of the data storage channel is shown in Figure 2.

[Fig. 2: The channel model for data storage: source → source encoder $\pi_t$ → channel encoder $\psi$ → noisy channel → channel decoder $\psi^{-1}$ → source decoder $\pi_t^{-1}$.]

A text $t$ is generated by the source. The text is compressed by the source encoder, producing a binary codeword $y = \pi_t(t) \in \{0, 1\}^k$. The compressed bits are fed to a channel encoder, obtaining an ECC codeword $x = \psi(y) \in \{0, 1\}^n$ where $n > k$. Here we assume a systematic ECC is used. The codeword is then stored by memory cells, and receives an additive error $e \in \{0, 1\}^n$. In this paper, a binary symmetric channel (BSC) with bit-flipping rate $f$ is assumed. When the cells are read, the channel outputs a noisy codeword $x' = x \oplus e$, where $\oplus$ is the bit-wise exclusive-or over codewords. The noisy codeword is first corrected by a channel decoder, producing an estimated ECC codeword $\hat{y} = \psi^{-1}(x')$. The source decoder decompresses the corrected codeword, and returns an estimated text $\hat{t} = \pi_t^{-1}(\hat{y})$ upon success. This work focuses on designing better channel decoders $\psi^{-1}$ for correcting bit errors in text files.

We propose a new decoding framework which connects a traditional ECC decoder with a content-assisted decoder (CAD), as shown in Figure 3.

[Fig. 3: The work-flow of a channel decoder with content-assisted decoding: run the ECC decoder; if it fails and the iteration limit has not been reached, pass the output to the content-assisted decoder and feed its correction back to the ECC decoder.]

A noisy codeword is first passed into an ECC decoder. If decoding fails, the decoding output is passed to the CAD. With the statistical information stored in $D_w$ and $D_p$, the CAD selects a word for each subcodeword to form a likely text as the correction for the noisy codeword. The corrected text is fed back to the ECC decoder. The iteration continues until either the ECC decoder succeeds or an iteration limit is reached. The text file decoding problem for our CAD is defined as follows.

Definition 1. Let $t$ be some text generated from the source, and let $x' \in \{0, 1\}^n$ be a noisy channel output codeword of $t$. Given two dictionaries $D_w$ and $D_p$, the text file decoding problem for the CAD is to find an estimated text $\hat{t}$ which is the most likely correction for $x'$, i.e.
$$\operatorname*{argmax}_{\hat{t}} \Pr\{\hat{t} \mid x', D_p, D_w\}.$$
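As a small illustration of the channel model above, the sketch below implements the BSC: each stored bit is flipped independently with probability $f$, producing $x' = x \oplus e$. The encoder $\psi$ is left abstract, and the seed argument is only for reproducibility:

    import random

    def bsc(codeword, f, seed=None):
        """Binary symmetric channel: flip each bit independently with probability f."""
        rng = random.Random(seed)
        error = [1 if rng.random() < f else 0 for _ in codeword]
        return [b ^ e for b, e in zip(codeword, error)]

    noisy = bsc([0, 1, 1, 0, 1, 0, 0, 1], f=0.1, seed=7)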
III. THE CONTENT-ASSISTED DECODING ALGORITHMS

The CAD approximates the solution to the problem in Definition 1 in three steps: (1) estimate space positions in the noisy codeword to divide the codeword into subcodewords, with each subcodeword representing a set of words in $D_w$; (2) resolve ambiguity by selecting a word for each subcodeword to form a most likely sequence; (3) perform post-processing to revert the aggressive bit flips done in (1) and (2). We describe the algorithm of each step in this section.

A. Creating dictionaries

The dictionaries $D_w$ and $D_p$ are used in our decoding algorithms. To create the dictionaries, we simply count the frequencies of words, and of phrases of two words, which appear in a relatively large set of different texts in the same language as the texts generated by the source. Fast dictionary look-up is achieved by storing the dictionaries in a content-addressable way thanks to the random access in flash memories, i.e., the probability in a dictionary record is addressed by the value of the corresponding word or phrase. As we show later in Section IV, the completeness of the dictionaries affects the decoding performance.
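A minimal sketch of this dictionary construction, assuming whitespace-tokenized training texts (case handling, punctuation and the content-addressable storage layout are beyond the sketch's scope; phrases are keyed by word pairs, the space separator being implicit):

    from collections import Counter

    def build_dictionaries(corpus_texts):
        """Estimate D_w and D_p from word and two-word-phrase frequencies."""
        word_counts, phrase_counts = Counter(), Counter()
        for text in corpus_texts:
            tokens = text.split()
            word_counts.update(tokens)
            phrase_counts.update(zip(tokens, tokens[1:]))
        total_w = sum(word_counts.values()) or 1
        total_p = sum(phrase_counts.values()) or 1
        D_w = {w: c / total_w for w, c in word_counts.items()}
        D_p = {p: c / total_p for p, c in phrase_counts.items()}
        return D_w, D_p

    D_w, D_p = build_dictionaries(["I am what I am"])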

B. Codeword segmentation

The codeword segmentation function $\sigma$ takes in a noisy codeword and a word dictionary, then flips the minimum number of bits to make the corrected codeword represent a text, i.e., a sequence of valid words separated by spaces. Let the number of records $|D_w|$ be bounded by some constant $K$. If $\sigma(x, D_w) = ((x_1, x_2, \dots, x_k), (\ell_1, \ell_2, \dots, \ell_{k-1}))$, where $\ell_i$ is the index of the first bit of the $i$-th space in $x$, then the subcodewords are $x_1 = x[1 : \ell_1 - 1]$, $x_k = x[\ell_{k-1} + \text{length}(x_s) : \text{length}(x)]$, and $x_i = x[\ell_{i-1} + \text{length}(x_s) : \ell_i - 1]$ for $i \in \{2, \dots, k-1\}$. The mapping $\sigma$ is required to satisfy the following properties: (1) for each subcodeword $x_i$, $\exists w \in D_w$ such that $\text{length}(x_i) = \text{length}(\pi_w(w))$; (2) $d_H(x, (x_1, x_s, x_2, x_s, \dots, x_s, x_k))$ is minimized.

Intuitively, as the bit-flip rate $f$ is very small (which is common for NVM channels), the segmentation function is a maximum likelihood decoder which flips the minimum number of bits of the codeword. Let the cost function $c(i, j)$ return the minimum number of flips taken to convert the subcodeword $x[i : j]$ to represent a text. We have the following recurrence:
$$c(i, j) = \begin{cases} \min\{g(i, j),\ h(i, j)\} & \text{if } i < j, \\ \infty & \text{otherwise,} \end{cases}$$
where
$$g(i, j) \triangleq \min_{w \in D_w} d_H(\pi_w(w), x[i : j]),$$
$$h(i, j) \triangleq \min_{k \in [i + 1,\ j - \text{length}(x_s)]} c(i, k - 1) + c(k + \text{length}(x_s), j) + d_H(x[k : k + \text{length}(x_s) - 1], x_s).$$
The function $g(i, j)$ computes the minimum number of flips taken to turn $x[i : j]$ into the codeword of a word in $D_w$. The function $h(i, j)$ computes the minimum flip cost taken to obtain a codeword representing a text with at least two words.

Example 1. Consider the example in Section I. The input noisy codeword $x$ consists of the 8 information bits, and the word dictionary is $D_w = \{[\text{I} : 0.5], [\text{am} : 0.5]\}$. The segmentation $\sigma(x, D_w)$ returns the two subcodewords $x[1:2]$ and $x[5:8]$, with the single estimated space starting at bit index 3. Starting from $c(1, 8)$, we recursively compute $c(i, j)$ for all $i < j$. The results are shown in Figure 4(b). For instance, to compute $c(5, 8)$, we first compute $g(5, 8) = 1$, as the subcodeword can be turned to represent a word in $D_w$ with 1 bit-flip. We then compute $h(5, 8) = \infty$. This is because $\text{length}(x_s) = 2$ and the minimum codeword length of a word in $D_w$ is 2, therefore it is impossible to split the subcodeword $x[5:8]$ by a space. Finally, we have $c(5, 8) = \min(1, \infty) = 1$.

Our objective is to compute $c(1, n)$ given an input codeword of length $n$, and to find the space positions which achieve the minimum cost. When $c(i, j)$ is computed recursively starting from $c(1, n)$, some entries will be recomputed unnecessarily; in Example 1, for instance, the entry $c(4, 5)$ is needed more than once. A good way to speed up such computation is the dynamic programming technique shown in Algorithm 1, which computes the final result iteratively starting from the shortest subcodewords; an entry computed in an earlier iteration is saved for later iterations. The algorithm treats $c(i, j)$ as the entries of a two-dimensional table, and fills the entries diagonally across the table as shown in Figure 4(a). The corresponding space locations for breaking the subcodeword $x[i : j]$, or the set of words that $x[i : j]$ can be flipped to represent, is recorded using a two-dimensional table $m$. In practice, as $f$ is close to 0, the average number of errors in the codeword of a word is small. Computing the set of possible words $S_w$ for a given noisy subcodeword can therefore be accelerated by passing an additional Hamming distance limit $d$ to reduce the search space, i.e., instead of searching the whole $D_w$ as in $g(i, j)$, we search the set $\{w \mid w \in D_w, d_H(\pi_w(w), x[i : j]) < d\}$ to skip the words which are too far from the noisy subcodeword under the Hamming distance metric.

Algorithm 1 CodewordSegmentation(x, D_w)
  n ← length(x), l ← length(x_s)
  Let c and m be two n × n tables
  Let wordsets and spaces be two empty lists
  for t from 1 to n do
    for i from 1 to n − t + 1 do
      j ← i + t − 1
      d_min ← min_{w ∈ D_w} d_H(π_w(w), x[i : j])
      S_w ← {w | w ∈ D_w, d_H(π_w(w), x[i : j]) = d_min}
      k* ← 0
      for k from i + 1 to j − l do
        d ← c(i, k − 1) + c(k + l, j) + d_H(x_s, x[k : k + l − 1])
        if d < d_min then
          d_min ← d, k* ← k
      if k* = 0 then
        m(i, j).words ← S_w
      else
        m(i, j).words ← ∅, m(i, j).space ← k*
      c(i, j) ← d_min
  TraceBack(1, n, spaces, wordsets, m, l)
  return wordsets and spaces

Algorithm 2 TraceBack(i, j, spaces, wordsets, m, l)
  if m(i, j).words = ∅ then
    k ← m(i, j).space
    TraceBack(i, k − 1, spaces, wordsets, m, l)
    spaces.append(k)
    TraceBack(k + l, j, spaces, wordsets, m, l)
  else
    wordsets.append(m(i, j).words)

As we are more interested in the space locations than in the value of $c(1, n)$, after the entries of $c$ and $m$ have been filled, Algorithm 2 is used to recursively trace back the solution path recorded in $m$. The results are the ordered space locations and the sets of words for the subcodewords between the spaces.

Assume that $K$ is a constant which is much smaller than $N$, and that the codeword of each word has limited length bounded by some constant. The time complexity of our dynamic programming algorithm is then $O(n)$: only $O(n)$ entries need to be computed, and each computation takes $O(1)$ time. The algorithm requires $O(n^2)$ space for storing the tables $c$ and $m$.

Example 2. For the example in Section I, the tables $c$ and $m$ computed by Algorithm 1 are shown in Figures 4(b) and 4(c). The minimum flipping cost is $c(1, 8) = 2$, and the index of the estimated space is $m(1, 8).\text{space} = 3$. With the estimated space, the subcodeword $x[1 : 2]$ can be flipped to denote a word in the set {I}, and the subcodeword $x[5 : 8]$ can be flipped to denote a word in the set {am}.

[Fig. 4: The examples of codeword segmentation: (a) iterative table filling, (b) table c, (c) table m. In Figure 4(c), a number in the table denotes the index of the first bit of an estimated space; a set of words means the subcodeword can be flipped to any of the words in the set; a cross means a subcodeword can neither be flipped to represent a word nor a text with at least two words.]
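The following hedged Python sketch implements the $c(i, j)$ recurrence with memoization instead of Algorithm 1's diagonal table fill, and returns the space positions alongside the cost. It reuses the hypothetical two-bit codebook from the Section I sketch and assumes, consistent with Examples 1 and 2, that two of Figure 1's three errors fall in the information bits (the third sits in the parity bits, outside this codeword):

    from functools import lru_cache

    # Hypothetical two-bit codebook and tiny dictionary (actual bits were not recovered).
    CODEBOOK = {'I': (0, 0), ' ': (0, 1), 'a': (1, 0), 'm': (1, 1)}
    D_W = ['I', 'am']
    X_S = CODEBOOK[' ']          # the space codeword x_s
    INF = float('inf')

    def encode(word):
        return tuple(b for ch in word for b in CODEBOOK[ch])

    def d_H(x, y):
        return sum(a != b for a, b in zip(x, y))

    def segment(x):
        """Return (c(1, n), estimated space positions) for a noisy codeword x."""
        n, l = len(x), len(X_S)

        @lru_cache(maxsize=None)
        def c(i, j):
            if i >= j:
                return INF, ()
            # g(i, j): cheapest single dictionary word of matching codeword length
            costs = [(d_H(encode(w), x[i - 1:j]), ()) for w in D_W
                     if len(encode(w)) == j - i + 1]
            best = min(costs) if costs else (INF, ())
            # h(i, j): cheapest split into two texts separated by a space at bit k
            for k in range(i + 1, j - l + 1):
                lc, ls = c(i, k - 1)
                rc, rs = c(k + l, j)
                cand = lc + rc + d_H(X_S, x[k - 1:k + l - 1])
                if cand < best[0]:
                    best = (cand, ls + (k,) + rs)
            return best

        return c(1, n)

    # Assumed noisy bits: 'I' and 'am' each hit by one flip, the space intact.
    cost, spaces = segment((0, 1, 0, 1, 1, 0, 1, 0))
    assert cost == 2 and spaces == (3,)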
C. Ambiguity resolution

Given the subcodewords $(x_1, x_2, \dots, x_k)$ between the estimated spaces, and a list of word sets $(W_1, W_2, \dots, W_k)$ computed by the codeword segmentation algorithm, for $i \in \{1, \dots, k\}$ we select a word $w_i$ from $W_i$ to form a most probable text $\hat{t} = (w_1, ␣, w_2, ␣, \dots, ␣, w_k)$. The codeword $\pi_t(\hat{t})$ is a correction for the input noisy codeword.

Specifically, this step computes
$$\operatorname*{argmax}_{(w_1, w_2, \dots, w_k) \in W_1 \times W_2 \times \dots \times W_k} \Pr\{(w_1, w_2, \dots, w_k), (x_1, x_2, \dots, x_k)\}.$$
Let the function $P(w_i)$ compute the maximal joint probability when some word $w_i$ is selected from $W_i$ and appended to the previously selected word sequence $(w_1, w_2, \dots, w_{i-1})$. For $i \in [1, k]$, we have
$$P(w_i) \triangleq \max_{(w_1, \dots, w_{i-1}) \in W_1 \times \dots \times W_{i-1}} \Pr\{(w_1, \dots, w_i), (x_1, \dots, x_i)\}.$$
Assume the words in a text form a one-step Markov chain, i.e., for $i \geq 2$, $\Pr\{w_i \mid (w_1, w_2, \dots, w_{i-1})\} = \Pr\{w_i \mid w_{i-1}\}$, so that the joint probability factors as
$$\Pr\{(w_1, \dots, w_i), (x_1, \dots, x_i)\} = \Pr\{w_1\} \prod_{j=2}^{i} \Pr\{w_j \mid w_{j-1}\} \prod_{j=1}^{i} \Pr\{x_j \mid w_j\}.$$
Therefore, we rewrite the equation above as:
$$P(w_i) = \max_{w_{i-1} \in W_{i-1}} \Pr\{x_i \mid w_i\} \Pr\{w_i \mid w_{i-1}\} \max_{(w_1, \dots, w_{i-2})} \Pr\{(w_1, \dots, w_{i-1}), (x_1, \dots, x_{i-1})\} = \max_{w_{i-1} \in W_{i-1}} \Pr\{x_i \mid w_i\} \Pr\{w_i \mid w_{i-1}\} P(w_{i-1}), \quad (1)$$
and $P(w_1) = \Pr\{w_1\} \Pr\{x_1 \mid w_1\}$. The conditional probability $\Pr\{x_k \mid w_k\}$ is computed from the channel statistics by
$$\Pr\{x_k \mid w_k\} = f^{\,d_H(\pi_w(w_k), x_k)} (1 - f)^{\,\text{length}(x_k) - d_H(\pi_w(w_k), x_k)}.$$
The probabilities $\Pr\{w_1\} = D_w[w_1]$ and $\Pr\{w_k \mid w_{k-1}\} = D_p[(w_{k-1}, ␣, w_k)]$ are looked up from the dictionaries.

The derived recurrence suggests that the optimization problem can be mapped to the problem of trellis decoding, which is again solved by dynamic programming. The trellis for our problem has $k$ time stages. The observed codeword at the $i$-th stage is $x_i$ for $i \in \{1, \dots, k\}$. There are $|W_i|$ vertices at stage $i$, with each representing an element $w$ of $W_i$ and being associated with the conditional probability $\Pr\{x_i \mid w\}$. The weight of the directed edge from a vertex at stage $i$ with word $w_x$ to a vertex at stage $i + 1$ with word $w_y$ is the conditional probability $\Pr\{w_y \mid w_x\}$. An example of the mapping is shown in Figure 5.

[Fig. 5: Example of the mapping to trellis decoding. The word sets $W_1$, $W_2$, $W_3$ and $W_4$ respectively correspond to the subcodewords $x_1$, $x_2$, $x_3$ and $x_4$, and each $W_i$ contributes one column of vertices to the trellis.]

Our target is to compute the sequence which achieves $\max_{w_k \in W_k} P(w_k)$, which leads to the Viterbi path in the corresponding trellis starting from a vertex in stage 1 and ending at a vertex in stage $k$. The dynamic programming algorithm for solving our trellis decoding problem is specified in Algorithm 3, which is adapted from the Viterbi decoding [2]. The final solution is computed iteratively, starting from $P(w_1)$ according to the recurrence. When the last iteration is finished, we trace back along the Viterbi path recorded in the table $s$, collecting the selected words to form an estimated text $\hat{t}$.

Algorithm 3 Viterbi((W_1, …, W_k), (x_1, …, x_k), f, D_w, D_p)
  n ← max_{l ∈ [1,k]} |W_l|
  Let p and s be two n × k tables
  for t from 1 to k do
    for j from 1 to |W_t| do
      p' ← f^{d_H(π_w(W_t[j]), x_t)} (1 − f)^{length(x_t) − d_H(π_w(W_t[j]), x_t)}
      if t = 1 then
        p(j, t) ← p' · D_w[W_t[j]]
      else
        p_max ← 0, index ← 0
        for i from 1 to |W_{t−1}| do
          p'' ← p' · D_p[(W_{t−1}[i], ␣, W_t[j])] · p(i, t − 1)
          if p'' > p_max then
            p_max ← p'', index ← i
        p(j, t) ← p_max
        s(j, t) ← index
  index ← argmax_j p(j, k), words ← [W_k[index]]
  for t from k down to 2 do
    index ← s(index, t)
    words.appendToFront(W_{t−1}[index])
  return words

The complexity of the Viterbi decoding algorithm is $O(n^2 k)$, where $k = O(N)$ is the length of the input codeword list and $n = \max_{i \in [1, k]} |W_i| = O(K)$ is the cardinality of the biggest input word set. As $K$ is a constant which is much smaller than $N$, the Viterbi decoding for our case has time complexity $O(N)$.
The algorithm requires $O(nk) = O(N)$ space for storing the tables $p$ and $s$.
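The following is a hedged Python rendering of Algorithm 3 that works in the log domain (anticipating the overflow discussion in Section IV). Here `encode` is the word-to-codeword mapping $\pi_w$, and the dictionaries are plain probability maps keyed by words and word pairs, an assumption of this sketch rather than the paper's content-addressable layout:

    import math

    def viterbi(word_sets, subcodewords, f, D_w, D_p, encode):
        """Pick one word per subcodeword maximizing the joint log-probability of Eq. (1)."""
        def d_H(x, y):
            return sum(a != b for a, b in zip(x, y))

        def log_lik(w, x):  # log Pr{x | w} under the BSC with flip rate f
            d = d_H(encode(w), x)
            return d * math.log(f) + (len(x) - d) * math.log(1 - f)

        def log_p(p):
            return math.log(p) if p > 0 else float('-inf')

        # Stage 1: P(w_1) = Pr{w_1} Pr{x_1 | w_1}
        score = {w: log_p(D_w.get(w, 0.0)) + log_lik(w, subcodewords[0])
                 for w in word_sets[0]}
        back = []
        for t in range(1, len(word_sets)):
            new_score, pointers = {}, {}
            for w in word_sets[t]:
                lik = log_lik(w, subcodewords[t])
                prev = max(score, key=lambda v: score[v] + log_p(D_p.get((v, w), 0.0)))
                new_score[w] = score[prev] + log_p(D_p.get((prev, w), 0.0)) + lik
                pointers[w] = prev
            score, back = new_score, back + [pointers]
        # Trace back along the recorded Viterbi path
        words = [max(score, key=score.get)]
        for pointers in reversed(back):
            words.insert(0, pointers[words[0]])
        return words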

D. Post-processing

If unknown words or phrases occur in the input codeword, additional errors will be introduced during codeword segmentation and ambiguity resolution. Unknown words (phrases) refers to new or rare words (phrases) which are not included in $D_w$ ($D_p$). Upon encountering an unknown word, the codeword segmentation algorithm tends to split its codeword into subcodewords representing known words with the space symbol. Such segmentation introduces additional bit errors. We use a simple post-processing step which undoes the bit-flips issued by such aggressive segmentation. The idea is to use the phrase dictionary $D_p$ to check whether two adjacent words returned by the Viterbi decoder are known to $D_p$. If so, the post-processor simply accepts the segmentation; otherwise the corresponding bits in the initial noisy codeword are used to replace the codewords for those unknown phrases. The complexity of this step is $O(k) = O(N)$.

IV. EXPERIMENTS

A. Implementation details

Our implementation supports the use of basic punctuation in the input text files, including ",", ".", "?" and "!". This is done by adding another function in the definition of $c(i, j)$ when $i < j$. The function measures the number of flips taken to turn a subcodeword to represent a word followed by a punctuation mark. During ambiguity resolution, overflow may occur in the multiplications of probabilities when $N$ is large. We thus use a logarithmic version of Eq. (1); using additions instead of multiplications of floating point numbers significantly delays the overflow. A smoothing technique is used for computing $\Pr\{w_i \mid w_{i-1}\}$: the probability $\Pr\{w_i\}$ is used if the phrase $(w_{i-1}, ␣, w_i)$ is unknown to $D_p$. The reason is that returning 0 for unknown phrases suddenly makes the whole joint probability in Eq. (1) be 0 and cancels the path.

B. Evaluation

We evaluated the decoding performance of the channel decoder combining the LDPC sum-product decoder and the CAD. We compared the bit error rates (BERs) of the combined channel decoder with those of the scheme using the LDPC sum-product decoding alone. The test inputs include self-collected paragraphs and 8 paragraphs randomly extracted from the Canterbury Corpus, the Calgary Corpus, the Large Corpus [3], and the large text compression benchmark [4] (see Table I). The dictionaries are built using books randomly extracted from Project Gutenberg [5]. The functions $\pi_s$ and $\pi_s^{-1}$ are implemented with Huffman coding. A (584, 4)-random LDPC code is used as the ECC. The iteration limit of the sum-product decoder and the iteration threshold for the LDPC-CAD exchange are fixed. The bit-flip rate of the BSC is chosen such that the sum-product decoder fails to converge with high probability. The decoding BERs for complete and incomplete dictionaries are shown in Table I and Table II, respectively. The BERs for each benchmark are averaged over repeated experiments.

[Table I: The decoding BERs when the dictionaries are complete. Columns: Name, Category, From, ECC only, Combined. Benchmarks: email (email discussion, Calgary); lcet (lecture notes, Canterbury); alice (novel, Canterbury); confintro (call for papers, self-made); bible (the Bible, Large); asyoulike (Shakespeare play, Canterbury); plrabn (poetry, Canterbury); news (web news, self-made); enwik (Wikipedia texts, Large text); world192 (the World Fact Book, Large). The numeric BER values were not recovered.]

In Table I, the combined channel decoder significantly outperforms the traditional decoder thanks to the completeness of the dictionaries. The performance for the benchmark world192 is not as good as the others. This is because world192 has much more punctuation but fewer words than the other benchmarks do, and more errors occur in the punctuation marks, which the CAD is not good at correcting.
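As a hedged sketch of the two implementation notes above (log-domain scoring and back-off smoothing), the transition score used in the logarithmic version of Eq. (1) might look as follows; the back-off to $\Pr\{w_i\}$ for unknown phrases follows the smoothing rule described in the text:

    import math

    def log_prob(p):
        """Safe log: zero-probability events map to -inf instead of raising."""
        return math.log(p) if p > 0 else float('-inf')

    def transition_log_score(w_prev, w, D_w, D_p):
        """Smoothed log Pr{w | w_prev}: back off to the unigram probability Pr{w}
        when the phrase (w_prev, w) is unknown, so a single unseen phrase does
        not zero out (cancel) the whole Viterbi path."""
        if (w_prev, w) in D_p:
            return log_prob(D_p[(w_prev, w)])
        return log_prob(D_w.get(w, 0.0))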
In Table II, to see the effectiveness of the post-processor, we also show the performance of the combined decoder without the post-processor. The completeness of the dictionaries determines the decoding performance. For instance, the benchmarks world192 and enwik have a considerable number of words and phrases which are unknown to our dictionaries. The combined decoder without post-processing introduces additional errors by aggressively breaking the codewords of the unknown words into subcodewords separated with spaces. In such cases, the post-processor is able to recognize and revert most of the over-aggressive bit-flips. This greatly reduces the number of additional errors introduced due to the ignorance of the CAD. For the benchmark confintro, the performance of the decoder without post-processing is much better than that of the decoder using post-processing. This is because confintro has only a few unknown words but many technical phrases which are unknown to $D_p$. The unknown phrases make the post-processor tend to revert reasonable corrections done in the previous steps.

[Table II: The decoding BERs when the dictionaries are incomplete. Columns: Name, ECC only, Combined, After PP, UW% and UP% (presumably the percentages of unknown words and unknown phrases), over the same benchmarks as Table I. The numeric values were not recovered.]

REFERENCES

[1] L. M. Grupp, J. D. Davis, and S. Swanson, "The bleak future of NAND flash memory," in Proceedings of the 10th USENIX Conference on File and Storage Technologies (FAST), Berkeley, CA, USA, 2012.
[2] A. Viterbi, "Error bounds for convolutional codes and an asymptotically optimum decoding algorithm," IEEE Transactions on Information Theory, vol. 13, no. 2, pp. 260-269, April 1967.
[3] The Canterbury Corpus: http://corpus.canterbury.ac.nz/
[4] Large Text Compression Benchmark: http://mattmahoney.net/dc/text.html
[5] Project Gutenberg: http://www.gutenberg.org/
