On fuzzy information theory

Indian Journal of Science and Technology, Vol. 3, No. 9 (Sep 2010)

A. Zarandi
Department of Mathematics & Computer, Islamic Azad University, Kerman Branch, Kerman, Iran
afshin_zarandi@yahoo.com

Abstract
Information theory is generally considered to have been founded in 1948 by Claude Shannon. In this paper we introduce the concept of fuzzy information theory using the notion of fuzzy sets. We study it and give some examples.

Keywords: Information theory, fuzzy set, fuzzy probability, entropy.

Introduction
The main concepts of information theory can be grasped by considering the most widespread means of human communication: language. Two important aspects of a concise language are as follows. First, the most common words (e.g., "a", "the" and "I") should be shorter than less common words (e.g., "benefit", "generation" and "mediocre"), so that sentences will not be too long. Such a tradeoff in word length is analogous to data compression and is the essential aspect of source coding. Second, if part of a sentence is unheard or misheard due to noise (e.g., a passing car), the listener should still be able to glean the meaning of the underlying message. Such robustness is as essential for an electronic communication system as it is for a language; properly building such robustness into communications is done by channel coding. Source coding and channel coding are the fundamental concerns of information theory (Rieke et al., 1997; Burnham & Anderson, 2002). Note that these concerns have nothing to do with the importance of messages. For example, a platitude such as "Thank you; come again" takes about as long to say or write as the urgent plea "Call an ambulance!", while the latter may be more important and more meaningful in many contexts. Information theory, however, does not consider message importance or meaning, as these are matters of the quality of data rather than the quantity and readability of data, the latter of which is determined solely by probabilities (Anderson, 2003).

Information theory is generally considered to have been founded in 1948 by Claude Shannon in his seminal work, "A mathematical theory of communication" (Shannon, 1948; Shannon & Weaver, 1949). The central paradigm of classical information theory is the engineering problem of the transmission of information over a noisy channel. The most fundamental results of this theory are Shannon's source coding theorem, which establishes that, on average, the number of bits needed to represent the result of an uncertain event is given by its entropy, and Shannon's noisy-channel coding theorem, which states that reliable communication is possible over noisy channels provided that the rate of communication is below a certain threshold called the channel capacity. The channel capacity can be approached in practice by using appropriate encoding and decoding systems.

Information theory is closely associated with a collection of pure and applied disciplines that have been investigated and reduced to engineering practice under a variety of rubrics throughout the world over the past half century or more: adaptive systems, anticipatory systems, artificial intelligence, complex systems, complexity science, cybernetics, informatics and machine learning, along with systems sciences of many descriptions. Information theory is a broad and deep mathematical theory, with equally broad and deep applications, amongst which is the vital field of coding theory (Gibson, 1998), a branch of communication theory devoted to problems in coding. A unique feature of information theory is its use of a numerical measure of the amount of information gained when the contents of a message are learned.
Information theory relies heavily on the mathematical science of probability. For this reason the term information theory is often applied loosely to other probabilistic studies in communication theory, such as signal detection, random noise, and prediction. See also electrical communications; probability (Gallager, 1968; Yeung, 2002).

In designing a one-way communication system from the standpoint of information theory, three parts are considered beyond the control of the system designer: (1) the source, which generates messages at the transmitting end of the system, (2) the destination, which ultimately receives the messages, and (3) the channel, consisting of a transmission medium or device for conveying signals from the source to the destination. The source does not usually produce messages in a form acceptable as input by the channel. The transmitting end of the system therefore contains another device, called an encoder, which prepares the source's messages for input to the channel. Similarly, the receiving end of the system contains a decoder to convert the output of the channel into a form that is recognizable by the destination. The encoder and the decoder are the parts to be designed.
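A minimal sketch (not from the paper) of the one-way system just described, with a source message passed through an encoder, a channel, and a decoder before reaching the destination; the channel here is assumed noiseless, and the 8-bit character encoding is an arbitrary illustrative choice.

```python
def encode(message: str) -> str:
    """Encoder: turn each character into 8 bits, a form the channel accepts."""
    return "".join(format(ord(c), "08b") for c in message)

def channel(bits: str) -> str:
    """Noiseless channel: the output equals the input."""
    return bits

def decode(bits: str) -> str:
    """Decoder: convert the channel output back into characters."""
    blocks = [bits[i:i + 8] for i in range(0, len(bits), 8)]
    return "".join(chr(int(b, 2)) for b in blocks)

source_message = "Call an ambulance!"                    # produced by the source
received = decode(channel(encode(source_message)))       # seen by the destination
assert received == source_message
```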

In radio systems this design is essentially the choice of a modulator and a detector. See also modulation.

A source is called discrete if its messages are sequences of elements (letters) taken from an enumerable set of possibilities (an alphabet). Thus sources producing integer data or written English are discrete. Sources which are not discrete are called continuous, for example, speech and music sources. The treatment of continuous cases is sometimes simplified by noting that a signal of finite bandwidth can be encoded into a discrete sequence of numbers.

The output of a channel need not agree with its input. For example, a channel might, for secrecy purposes, contain a cryptographic device to scramble the message. Still, if the output of the channel can be computed knowing just the input message, then the channel is called noiseless. If, however, random agents make the output unpredictable even when the input is known, then the channel is called noisy (Reza, 1994).

Many encoders first break the message into a sequence of elementary blocks; next they substitute for each block a representative code, or signal, suitable for input to the channel. Such encoders are called block encoders. For example, telegraph and teletype systems both use block encoders in which the blocks are individual letters. Entire words form the blocks of some commercial cablegram systems. It is generally impossible for a decoder to reconstruct with certainty a message received via a noisy channel. Suitable encoding, however, may make the noise tolerable (Csiszar & Korner, 1997).

Even when the channel is noiseless, a variety of encoding schemes exists and there is a problem of picking a good one. Of all encodings of English letters into dots and dashes, the Continental Morse encoding is nearly the fastest possible one. It achieves its speed by associating short codes with the most common letters. A noiseless binary channel (capable of transmitting two kinds of pulse, 0 and 1, of the same duration) provides the following example. Suppose one had to encode English text for this channel. A simple encoding might just use 27 different five-digit codes to represent word space (denoted by #), A, B, ..., Z; say # = 00000, A = 00001, B = 00010, C = 00011, ..., Z = 11010. The word #CAB would then be encoded into 00000 00011 00001 00010. A similar encoding is used in teletype transmission; however, it places a third kind of pulse at the beginning of each code to help the decoder stay in synchronism with the encoder (Goldman, 2005).
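To make the fixed-length scheme concrete, here is a small sketch in Python. The code table assumes the natural binary assignment described above (# -> 00000, A -> 00001, ..., Z -> 11010); any fixed assignment of the 27 five-digit codes would work equally well.

```python
alphabet = "#ABCDEFGHIJKLMNOPQRSTUVWXYZ"                 # 27 symbols
code = {s: format(i, "05b") for i, s in enumerate(alphabet)}

def encode(text: str) -> str:
    """Concatenate the five-digit code of each symbol."""
    return "".join(code[s] for s in text)

def decode(bits: str) -> str:
    """Split the received string into five-digit blocks and invert the table."""
    inverse = {v: k for k, v in code.items()}
    return "".join(inverse[bits[i:i + 5]] for i in range(0, len(bits), 5))

print(encode("#CAB"))                    # 00000000110000100010
assert decode(encode("#CAB")) == "#CAB"
```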
Information theory is based on probability theory and statistics. The most important quantities of information are entropy, the information in a random variable, and mutual information, the amount of information in common between two random variables. The former quantity indicates how easily message data can be compressed, while the latter can be used to find the communication rate across a channel. The choice of logarithmic base in the following formulae determines the unit of information entropy that is used. The most common unit of information is the bit, based on the binary logarithm. Other units include the nat, which is based on the natural logarithm, and the hartley, which is based on the common logarithm. In what follows, an expression of the form $p \log p$ is considered by convention to be equal to zero whenever $p = 0$. This is justified because $\lim_{p \to 0^{+}} p \log p = 0$ for any logarithmic base (Jaynes, 1957; MacKay, 2003).

Entropy
The entropy of a Bernoulli trial as a function of the success probability is often called the binary entropy function $H_b(p)$. The entropy is maximized at 1 bit per trial when the two possible outcomes are equally probable, as in an unbiased coin toss (Kolmogorov, 1968).

The entropy $H(X)$ of a discrete random variable $X$ is a measure of the amount of uncertainty associated with the value of $X$. Suppose one transmits 1000 bits (0s and 1s). If these bits are known ahead of transmission (to be a certain value with absolute probability), logic dictates that no information has been transmitted. If, however, each bit is equally and independently likely to be 0 or 1, then 1000 bits (in the information theoretic sense) have been transmitted. Between these two extremes, information can be quantified as follows. If $X$ takes values in the set of messages $\{x_1, \dots, x_n\}$ and $p(x)$ is the probability of message $x$, then the entropy of $X$ is defined by
$$H(X) = \mathbb{E}[I(X)] = -\sum_{x} p(x) \log p(x),$$
where $I(x) = -\log p(x)$ is the self-information (the entropy contribution of an individual message) and $\mathbb{E}$ denotes the expected value. An important property of entropy is that it is maximized when all the messages in the message space are equiprobable, $p(x) = 1/n$, i.e., most unpredictable, in which case $H(X) = \log n$. The special case of information entropy for a random variable with two outcomes is the binary entropy function, usually taken to the logarithmic base 2.
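A minimal sketch of the entropy formula and the binary entropy function $H_b(p)$; the numeric cases (an unbiased coin, a biased coin, a fair die) are illustrative choices, not taken from the paper.

```python
import math

def entropy(probs, base=2.0):
    """H(X) = -sum p(x) log p(x), with the convention 0 * log(0) = 0."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

def binary_entropy(p):
    """H_b(p) for a Bernoulli trial with success probability p."""
    return entropy([p, 1.0 - p])

print(binary_entropy(0.5))     # 1.0 bit: maximal, as for an unbiased coin
print(binary_entropy(0.9))     # about 0.469 bits: biasing the coin reduces uncertainty
print(entropy([1/6] * 6))      # log2(6) ~ 2.585 bits for a fair die (equiprobable case)
```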

To recapitulate, we assume the following four conditions as axioms:
1. $H(1/M, 1/M, \dots, 1/M) = f(M)$ is a monotonically increasing function of $M$ ($M = 1, 2, \dots$).
2. $f(ML) = f(M) + f(L)$ ($M, L = 1, 2, \dots$).
3. $H(p_1, \dots, p_M) = H\!\left(\sum_{i=1}^{r} p_i,\; \sum_{i=r+1}^{M} p_i\right) + \left(\sum_{i=1}^{r} p_i\right) H\!\left(\frac{p_1}{\sum_{i=1}^{r} p_i}, \dots, \frac{p_r}{\sum_{i=1}^{r} p_i}\right) + \left(\sum_{i=r+1}^{M} p_i\right) H\!\left(\frac{p_{r+1}}{\sum_{i=r+1}^{M} p_i}, \dots, \frac{p_M}{\sum_{i=r+1}^{M} p_i}\right)$ (the grouping axiom).
4. $H(p, 1-p)$ is a continuous function of $p$.

The four axioms essentially determine the uncertainty measure. More precisely, we have the following theorem.

Theorem 1. (Yeung, 2008) The only function satisfying the four given axioms is
$$H(p_1, \dots, p_M) = -C \sum_{i=1}^{M} p_i \log p_i,$$
where $C$ is an arbitrary positive number and the logarithm base is any number greater than 1.

Unless otherwise specified, we shall assume $C = 1$ and take logarithms to the base 2. The units of $H$ are then called bits (a contraction of binary digits). Thus the units are chosen so that there is one bit of uncertainty associated with the toss of an unbiased coin. Biasing the coin tends to decrease the uncertainty. We remark in passing that the average uncertainty of a random variable does not depend on the values the random variable assumes, or on anything else except the probabilities associated with those values. The average uncertainty associated with the toss of an unbiased coin is not changed by adding the condition that the experimenter will be shot if the coin comes up tails.

Joint entropy
The joint entropy of two discrete random variables $X$ and $Y$ is merely the entropy of their pairing $(X, Y)$. This implies that if $X$ and $Y$ are independent, then their joint entropy is the sum of their individual entropies. For example, if $(X, Y)$ represents the position of a chess piece, $X$ the row and $Y$ the column, then the joint entropy of the row of the piece and the column of the piece is the entropy of the position of the piece:
$$H(X, Y) = -\sum_{x, y} p(x, y) \log p(x, y).$$
Despite similar notation, joint entropy should not be confused with cross entropy.

Conditional entropy (equivocation)
The conditional entropy or conditional uncertainty of $X$ given the random variable $Y$ (also called the equivocation of $X$ about $Y$) is the average conditional entropy over $Y$:
$$H(X \mid Y) = -\sum_{x, y} p(x, y) \log p(x \mid y).$$
Because entropy can be conditioned on a random variable or on that random variable being a certain value, care should be taken not to confuse these two definitions of conditional entropy, the former of which is in more common use. A basic property of this form of conditional entropy is that (Gallager, 1968):
$$H(X \mid Y) = H(X, Y) - H(Y).$$

Mutual information (transinformation)
Mutual information measures the amount of information that can be obtained about one random variable by observing another. It is important in communication, where it can be used to maximize the amount of information shared between sent and received signals. The mutual information of $X$ relative to $Y$ is given by
$$I(X; Y) = \sum_{x, y} p(x, y) \log \frac{p(x, y)}{p(x)\, p(y)},$$
where the summand is the pointwise (specific) mutual information. A basic property of the mutual information is that
$$I(X; Y) = H(X) - H(X \mid Y).$$
That is, knowing $Y$, we can save an average of $I(X; Y)$ bits in encoding $X$ compared to not knowing $Y$. Mutual information is symmetric:
$$I(X; Y) = I(Y; X) = H(X) + H(Y) - H(X, Y).$$
Mutual information can be expressed as the average Kullback-Leibler divergence (information gain) of the posterior probability distribution of $X$ given the value of $Y$ relative to the prior distribution on $X$:
$$I(X; Y) = \mathbb{E}_{p(y)}\left[D_{KL}\big(p(X \mid Y = y) \,\|\, p(X)\big)\right].$$
In other words, this is a measure of how much, on average, the probability distribution on $X$ will change if we are given the value of $Y$. This is often recalculated as the divergence from the product of the marginal distributions to the actual joint distribution:
$$I(X; Y) = D_{KL}\big(p(X, Y) \,\|\, p(X)\, p(Y)\big).$$
Mutual information is closely related to the log-likelihood ratio test in the context of contingency tables and the multinomial distribution, and to Pearson's $\chi^2$ test: mutual information can be considered a statistic for assessing independence between a pair of variables, and it has a well-specified asymptotic distribution.
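The identities above can be checked numerically. The sketch below uses an arbitrary 2x2 joint distribution (an illustrative assumption) and verifies $H(X \mid Y) = H(X, Y) - H(Y)$ and $I(X; Y) = H(X) - H(X \mid Y) = H(X) + H(Y) - H(X, Y)$.

```python
import math

def H(probs):
    """Entropy in bits, with 0 * log(0) = 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A small illustrative joint distribution p(x, y); the values are arbitrary.
joint = {("x1", "y1"): 0.4, ("x1", "y2"): 0.1,
         ("x2", "y1"): 0.1, ("x2", "y2"): 0.4}

px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p
    py[y] = py.get(y, 0.0) + p

H_X = H(px.values())
H_Y = H(py.values())
H_XY = H(joint.values())
H_X_given_Y = H_XY - H_Y                   # H(X|Y) = H(X,Y) - H(Y)
I_XY = sum(p * math.log2(p / (px[x] * py[y])) for (x, y), p in joint.items())

assert abs(I_XY - (H_X - H_X_given_Y)) < 1e-12     # I(X;Y) = H(X) - H(X|Y)
assert abs(I_XY - (H_X + H_Y - H_XY)) < 1e-12      # symmetric form
print(round(H_X, 3), round(H_XY, 3), round(I_XY, 3))
```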
Kullback-Leibler divergence (information gain)
The Kullback-Leibler divergence (or information divergence, information gain, or relative entropy) is a way of comparing two distributions: a "true" probability distribution $p(X)$ and an arbitrary probability distribution $q(X)$. If we compress data in a manner that assumes $q(X)$ is the distribution underlying some data, when, in reality, $p(X)$ is the correct distribution, the Kullback-Leibler divergence is the average number of additional bits per datum necessary for compression. It is thus defined as
$$D_{KL}\big(p(X) \,\|\, q(X)\big) = \sum_{x} p(x) \log \frac{p(x)}{q(x)}.$$
Although it is sometimes used as a 'distance metric', it is not a true metric, since it is not symmetric and does not satisfy the triangle inequality (making it a semi-quasimetric) (Mansuripur, 1987).
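A small sketch of the divergence and its asymmetry; the distributions p and q below are arbitrary illustrative choices.

```python
import math

def kl_divergence(p, q):
    """D_KL(p || q) in bits; assumes q(x) > 0 wherever p(x) > 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.3, 0.2]      # "true" distribution
q = [1/3, 1/3, 1/3]      # distribution assumed by the compressor

print(kl_divergence(p, q))   # extra bits per symbol paid for assuming q instead of p
print(kl_divergence(q, p))   # differs from the above: the divergence is not symmetric
```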

Other quantities
Other important information theoretic quantities include the Rényi entropy (a generalization of entropy), the differential entropy (a generalization of quantities of information to continuous distributions), and the conditional mutual information.

Coding theory
Coding theory is one of the most important and direct applications of information theory. It can be subdivided into source coding theory and channel coding theory. Using a statistical description for data, information theory quantifies the number of bits needed to describe the data, which is the information entropy of the source.
Data compression (source coding): There are two formulations for the compression problem: lossless data compression, where the data must be reconstructed exactly, and lossy data compression, which allocates the bits needed to reconstruct the data within a specified fidelity level measured by a distortion function. This subset of information theory is called rate-distortion theory.
Error-correcting codes (channel coding): While data compression removes as much redundancy as possible, an error-correcting code adds just the right kind of redundancy (i.e., error correction) needed to transmit the data efficiently and faithfully across a noisy channel.
This division of coding theory into compression and transmission is justified by the information transmission theorems, or source-channel separation theorems, which justify the use of bits as the universal currency for information in many contexts. However, these theorems only hold in the situation where one transmitting user wishes to communicate to one receiving user. In scenarios with more than one transmitter (the multiple-access channel), more than one receiver (the broadcast channel) or intermediary "helpers" (the relay channel), or more general networks, compression followed by transmission may no longer be optimal. Network information theory refers to these multi-agent communication models.

Source theory
Any process that generates successive messages can be considered a source of information. A memoryless source is one in which each message is an independent identically-distributed random variable, whereas the properties of ergodicity and stationarity impose more general constraints. All such sources are stochastic. These terms are well studied in their own right outside information theory.

Rate
The information rate is the average entropy per symbol. For memoryless sources, this is merely the entropy of each symbol, while, in the case of a stationary stochastic process, it is
$$r = \lim_{n \to \infty} H(X_n \mid X_{n-1}, X_{n-2}, X_{n-3}, \dots),$$
that is, the conditional entropy of a symbol given all the previous symbols generated. For the more general case of a process that is not necessarily stationary, the average rate is
$$r = \lim_{n \to \infty} \frac{1}{n} H(X_1, X_2, X_3, \dots, X_n),$$
that is, the limit of the joint entropy per symbol. For stationary sources, these two expressions give the same result. It is common in information theory to speak of the "rate" or "entropy" of a language. This is appropriate, for example, when the source of information is English prose. The rate of a source of information is related to its redundancy and how well it can be compressed, the subject of source coding (Arndt, 2004).
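As an illustration of the rate formulas, the sketch below computes the entropy rate of a hypothetical two-state stationary Markov source, for which the conditional-entropy limit reduces to the stationary-weighted entropy of the transition rows; the transition matrix is an assumption chosen for the example, not something from the paper.

```python
import math

def Hrow(probs):
    """Entropy in bits of one row of the transition matrix."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Illustrative two-state Markov source. For a stationary Markov chain the
# entropy rate equals sum_i pi_i * H(P[i, :]) with pi the stationary distribution.
P = [[0.9, 0.1],
     [0.4, 0.6]]

pi0 = P[1][0] / (P[0][1] + P[1][0])     # stationary distribution of a 2-state chain
pi = [pi0, 1 - pi0]

rate = sum(pi[i] * Hrow(P[i]) for i in range(2))
iid_entropy = Hrow(pi)                   # rate of a memoryless source with the same marginals

print(round(rate, 4), round(iid_entropy, 4))   # conditioning on the past lowers the rate
```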
Channel capacity
Communication over a channel, such as an Ethernet wire, is the primary motivation of information theory. As anyone who has ever used a telephone (mobile or landline) knows, however, such channels often fail to produce an exact reconstruction of a signal; noise, periods of silence, and other forms of signal corruption often degrade quality. How much information can one hope to communicate over a noisy (or otherwise imperfect) channel?

Consider the communication process over a discrete channel. In a simple model of the process, $X$ represents the space of messages transmitted and $Y$ the space of messages received during a unit time over our channel. Let $p(y \mid x)$ be the conditional probability distribution function of $Y$ given $X$. We will consider $p(y \mid x)$ to be an inherent fixed property of our communications channel (representing the nature of the noise of our channel). Then the joint distribution of $X$ and $Y$ is completely determined by our channel and by our choice of $f(x)$, the marginal distribution of messages we choose to send over the channel. Under these constraints, we would like to maximize the rate of information, or the signal, we can communicate over the channel. The appropriate measure for this is the mutual information, and this maximum mutual information is called the channel capacity, given by
$$C = \max_{f} I(X; Y).$$
This capacity has the following property related to communicating at information rate $R$ (where $R$ is usually bits per symbol). For any information rate $R < C$ and coding error $\varepsilon > 0$, for large enough $N$, there exists a code of length $N$ and rate $\geq R$ and a decoding algorithm such that the maximal probability of block error is at most $\varepsilon$; that is, it is always possible to transmit with arbitrarily small block error. In addition, for any rate $R > C$, it is impossible to transmit with arbitrarily small block error. Channel coding is concerned with finding such nearly optimal codes that can be used to transmit data over a noisy channel with a small coding error at a rate near the channel capacity.
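The maximization $C = \max_f I(X; Y)$ can be carried out numerically. The sketch below brute-forces it over binary input distributions for a binary symmetric channel with crossover probability 0.11 (an illustrative choice) and recovers the closed-form value $1 - H_b(p)$ quoted in the next section.

```python
import math

def mutual_information(fx, channel):
    """I(X;Y) in bits for input distribution fx and channel matrix p(y|x)."""
    py = [sum(fx[x] * channel[x][y] for x in range(len(fx)))
          for y in range(len(channel[0]))]
    I = 0.0
    for x, fxx in enumerate(fx):
        for y, pyx in enumerate(channel[x]):
            if fxx > 0 and pyx > 0:
                I += fxx * pyx * math.log2(pyx / py[y])
    return I

# Binary symmetric channel with crossover probability 0.11.
bsc = [[0.89, 0.11],
       [0.11, 0.89]]

# Brute-force C = max_f I(X;Y) over binary input distributions on a grid.
capacity = max(mutual_information([a / 1000, 1 - a / 1000], bsc) for a in range(1001))
print(round(capacity, 4))   # ~0.5, matching 1 - H_b(0.11)
```

A grid search suffices here because the input alphabet is binary; for larger alphabets the standard tool is the Blahut-Arimoto algorithm.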

Capacity of particular channel models
A continuous-time analog communications channel subject to Gaussian noise is treated by the Shannon-Hartley theorem. A binary symmetric channel (BSC) with crossover probability $p$ is a binary-input, binary-output channel that flips the input bit with probability $p$. The BSC has a capacity of $1 - H_b(p)$ bits per channel use, where $H_b$ is the binary entropy function taken to the base-2 logarithm. A binary erasure channel (BEC) with erasure probability $p$ is a binary-input, ternary-output channel. The possible channel outputs are 0, 1, and a third symbol 'e' called an erasure. The erasure represents complete loss of information about an input bit. The capacity of the BEC is $1 - p$ bits per channel use.
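The two closed-form capacities above translate directly into code; the sketch below tabulates them for a few illustrative crossover/erasure probabilities.

```python
import math

def h2(p):
    """Binary entropy function H_b(p) in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1.0 - h2(p)

def bec_capacity(p):
    """Capacity of a binary erasure channel with erasure probability p."""
    return 1.0 - p

for p in (0.0, 0.1, 0.5):
    print(p, round(bsc_capacity(p), 4), round(bec_capacity(p), 4))
# At p = 0.5 the BSC is useless (capacity 0), while the BEC still carries 0.5 bit per use.
```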
Applications to other fields
Intelligence uses and secrecy applications: Information theoretic concepts apply to cryptography and cryptanalysis. Turing's information unit, the ban, was used in the Ultra project, breaking the German Enigma machine code and hastening the end of WWII in Europe. Shannon himself defined an important concept now called the unicity distance. Based on the redundancy of the plaintext, it attempts to give the minimum amount of ciphertext necessary to ensure unique decipherability. Information theory leads us to believe it is much more difficult to keep secrets than it might first appear. A brute force attack can break systems based on asymmetric key algorithms or on most commonly used methods of symmetric key algorithms (sometimes called secret key algorithms), such as block ciphers. The security of all such methods currently comes from the assumption that no known attack can break them in a practical amount of time. Information theoretic security refers to methods such as the one-time pad that are not vulnerable to such brute force attacks. In such cases, the positive conditional mutual information between the plaintext and ciphertext (conditioned on the key) can ensure proper transmission, while the unconditional mutual information between the plaintext and ciphertext remains zero, resulting in absolutely secure communications. In other words, an eavesdropper would not be able to improve his or her guess of the plaintext by gaining knowledge of the ciphertext but not of the key. However, as in any other cryptographic system, care must be used to correctly apply even information-theoretically secure methods; the Venona project was able to crack the one-time pads of the Soviet Union due to their improper reuse of key material (Ash, 1990; Cover & Thomas, 2006).

Entropy by fuzzy probability
In this section we use fuzzy probability instead of (crisp) probability in the entropy.

Definition 1. Let $\Omega$ be a sample space and $P$ be a probability measure on $\Omega$. If $\tilde{A}$ is a fuzzy event of $\Omega$, then the probability of $\tilde{A}$ is defined as
$$P(\tilde{A}) = \begin{cases} \displaystyle\int_{\Omega} \tilde{A}(w)\, dP(w) & \text{if } \Omega \text{ is not discrete}, \\[4pt] \displaystyle\sum_{w \in \Omega} \tilde{A}(w)\, P(\{w\}) & \text{if } \Omega \text{ is discrete}. \end{cases}$$

Example 2. Consider tossing a fair die, and let $\tilde{A}$ be the fuzzy event "a small number occurs" and $\tilde{B}$ the fuzzy event "approximately the number 5", each given by membership grades $\tilde{A}(i)$ and $\tilde{B}(i)$ on $\Omega = \{1, 2, \dots, 6\}$. In this case the probability of the event $\tilde{A}$ is
$$P(\tilde{A}) = \tilde{A}(1) P(\{1\}) + \tilde{A}(2) P(\{2\}) + \dots + \tilde{A}(6) P(\{6\}) = \frac{2.6}{6} \approx 0.433,$$
and similarly
$$P(\tilde{B}) = \sum_{i=1}^{6} \tilde{B}(i)\, P(\{i\}) \approx 0.46.$$
Therefore, as the above example shows, the probability of a fuzzy event may differ from the (crisp) probability of an event, since the fuzzy event itself differs from a (crisp) event.
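A minimal sketch of Definition 1 for a discrete sample space. The membership grades below are hypothetical (they are not the grades used in the paper); the grades for the "small number" event are chosen only so that the example reproduces the $2.6/6 \approx 0.433$ figure quoted above.

```python
# Hypothetical membership grades on Omega = {1, ..., 6}, with P({i}) = 1/6.
A_small   = {1: 1.0, 2: 0.8, 3: 0.5, 4: 0.2, 5: 0.1, 6: 0.0}   # "a small number occurs"
B_approx5 = {3: 0.2, 4: 0.7, 5: 1.0, 6: 0.8}                    # "approximately 5"

def fuzzy_probability(membership, p_outcome=1/6):
    """P(A~) = sum_w A~(w) P({w}) for a discrete sample space (Zadeh-style definition)."""
    return sum(grade * p_outcome for grade in membership.values())

print(round(fuzzy_probability(A_small), 3))     # 2.6/6 ~ 0.433 with these grades
print(round(fuzzy_probability(B_approx5), 3))   # 2.7/6 = 0.45 with these grades
```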

In the entropy formula $H(p_1, \dots, p_M) = -\sum_{i=1}^{M} p_i \log p_i$ we now consider the fuzzy probability instead of the (crisp) probability, as a generalization of this notion, and we obtain the fuzzy entropy. In the same way we can generalize all of the above notions, and thereby introduce fuzzy information theory.

Definition 3. Let the $p_i$ be fuzzy probabilities. Then we can use the entropy formula $H(p_1, \dots, p_M) = -C \sum_{i=1}^{M} p_i \log p_i$ and compute the entropy for fuzzy events with fuzzy probabilities in place of (crisp) probabilities. For convenience we put $C = 1$, and then we have $H(p_1, \dots, p_M) = -\sum_{i=1}^{M} p_i \log p_i$.

Example 4. Consider the fuzzy events of Example 2; we can compute the entropy for these fuzzy events:
$$H(\tilde{A}, \tilde{B}) = -\sum_{i} p_i \log p_i = -\big(0.433 \log(0.433) + 0.46 \log(0.46)\big) \approx 1.04 \text{ bits}.$$
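A minimal sketch of Definition 3, plugging the fuzzy probabilities of Example 2 into the entropy formula with $C = 1$ and base-2 logarithms. Note that, unlike the crisp probabilities of a partition, the two fuzzy probabilities need not sum to 1.

```python
import math

def fuzzy_entropy(fuzzy_probs, base=2.0):
    """H = -sum p_i log p_i with fuzzy probabilities p_i in place of crisp ones (C = 1)."""
    return -sum(p * math.log(p, base) for p in fuzzy_probs if p > 0)

# Fuzzy probabilities of the two fuzzy events from Example 2.
p_A, p_B = 0.433, 0.46
print(round(fuzzy_entropy([p_A, p_B]), 3))   # about 1.038 bits
```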
Conclusion
In this paper we studied information theory, entropy, coding and their applications. We introduced entropy with fuzzy probability as a generalization of the entropy with (crisp) probability studied before. Since we live in a fuzzy world and constantly work with vague notions, transmitting fuzzy events requires a different treatment, because the previously studied methods concern crisp events. We hope that this notion can be used to compress data for transmission and thus reduce the load on networks. As future work we intend to study this notion further and develop the method for practical problems.

Acknowledgement
The author would like to express his sincere thanks to the referees for their valuable suggestions and comments.

References
1. Anderson DR (2003) Some background on why people in the empirical sciences may want to better understand the information-theoretic methods.
2. Arndt C (2004) Information measures: information and its description in science and engineering. Signals and Communication Technology, Springer.
3. Ash RB (1990) Information theory. Dover, NY.
4. Burnham KP and Anderson DR (2002) Model selection and multimodel inference: a practical information-theoretic approach. 2nd edition, Springer Science, NY.
5. Cover TM and Thomas JA (2006) Elements of information theory. 2nd edition, Wiley-Interscience, NY.
6. Csiszar I and Korner J (1997) Information theory: coding theorems for discrete memoryless systems. 2nd edition, Akademiai Kiado.
7. MacKay DJC (2003) Information theory, inference, and learning algorithms. Cambridge University Press, Cambridge.
8. Gallager R (1968) Information theory and reliable communication. John Wiley & Sons, NY.
9. Gibson JD (1998) Digital compression for multimedia: principles and standards. Morgan Kaufmann.
10. Goldman S (2005) Information theory. Dover, NY.
11. Jaynes ET (1957) Information theory and statistical mechanics. Phys. Rev. 106, 620-630.
12. Kolmogorov AN (1968) Three approaches to the quantitative definition of information. Intl. J. Computer Mathematics; Problems of Information Transmission.
13. Mansuripur M (1987) Introduction to information theory. Prentice Hall, NY.
14. Yeung RW (2008) Information theory and network coding. Springer.
15. Reza F (1994) An introduction to information theory. Dover, NY.
16. Rieke F, Warland D, de Ruyter van Steveninck R and Bialek W (1997) Spikes: exploring the neural code. The MIT Press.
17. Shannon CE (1948) A mathematical theory of communication. Bell System Technical J. 27, 379-423 and 623-656.
18. Shannon CE and Weaver W (1949) The mathematical theory of communication. Univ. of Illinois Press.
19. Yeung RW (2002) A first course in information theory. Kluwer Academic/Plenum Publishers.
