Classification of Incomplete Patterns Based on the Fusion of Belief Functions


18th International Conference on Information Fusion
Washington, DC - July 6-9, 2015

Classification of Incomplete Patterns Based on the Fusion of Belief Functions

Zhun-ga Liu, Quan Pan, Jean Dezert, Arnaud Martin and Grégoire Mercier
Northwestern Polytechnical University, Xi'an, China. Email: liuzhunga@nwpu.edu.cn, quanpan@nwpu.edu.cn
ONERA - The French Aerospace Lab, F Palaiseau, France. Email: jean.dezert@onera.fr
IRISA, University of Rennes 1, Lannion, France. Email: Arnaud.Martin@univ-rennes1.fr
Telecom Bretagne, CNRS UMR 6285 Lab-STICC/CID, Brest, France. Email: Gregoire.Mercier@telecom-bretagne.eu

Abstract: The influence of the missing values in the classification of incomplete patterns mainly depends on the context. In this paper, we present a fast classification method for incomplete patterns based on the fusion of belief functions, in which the missing values are selectively (adaptively) estimated. At first, it is assumed that the missing information is not crucial for the classification, and the object (incomplete pattern) is classified based only on the available attribute values. However, if the object cannot be clearly classified, it implies that the missing values play an important role in obtaining an accurate classification. In this case, the missing values are imputed based on the K-nearest neighbor (K-NN) and self-organizing map (SOM) techniques, and the edited pattern with the imputation is then classified. The (original or edited) pattern is respectively classified according to each training class, and the classification results, represented by basic belief assignments (BBA's), are fused with proper combination rules for making the credal classification. The object is allowed to belong with different masses of belief to the specific classes and to the meta-classes (i.e. disjunctions of several single classes). This credal classification captures well the uncertainty and imprecision of the classification, and effectively reduces the rate of misclassifications thanks to the introduction of meta-classes.
The effectiveness of the proposed method with respect to other classical methods is demonstrated based on several experiments using artificial and real data sets.

Keywords: information fusion, combination rule, belief functions, classification, incomplete pattern.

I. INTRODUCTION

In many practical classification problems, some attributes of an object can be missing for various reasons (e.g. the failure of sensors). So it is crucial to develop efficient techniques to classify as well as possible the objects with missing attribute values (incomplete patterns), and the search for a solution to this problem remains an important research topic in the community [1], [2]. Many classification approaches have been proposed to deal with incomplete patterns [1]. The simplest method consists in directly removing (ignoring) the patterns with missing values, so that the classifier is designed only for the complete patterns. This method is acceptable when the incomplete data set is only a very small subset (e.g. less than 5%) of the whole data set. A widely adopted method is to fill the missing values with proper estimations [3], and then to classify the edited patterns. Different works have been devoted to the imputation (estimation) of missing data. For example, the imputation can be done either by statistical methods, e.g. mean imputation [4], regression imputation [2], etc., or by machine learning methods, e.g. K-nearest neighbors (K-NN) imputation [5], Fuzzy c-means (FCM) imputation [6], [7], etc. Some model-based techniques have also been developed for dealing with incomplete patterns [8]. The probability density function (PDF) of the training data (complete and incomplete cases) is estimated at first, and then the object is classified using Bayesian reasoning. Other classifiers [9] have also been proposed to directly handle incomplete patterns without imputing the missing values. All these methods attempt to classify the object into a particular class with maximal probability or likelihood measure.
However, the estimation of missing values is in general quite uncertain, and different imputations of the missing values can yield very different classification results, which prevents us from correctly committing the object to a particular class. Belief function theory (BFT), also called Dempster-Shafer theory (DST) [10], and its extensions [11], [12] offer a mathematical framework for modeling uncertain and imprecise information [13]. BFT has already been applied successfully to object classification [14], [15], [17]-[19], clustering [20]-[23], multi-source information fusion [24], etc. Some classifiers for complete patterns based on DST have been developed by Denœux and his collaborators, e.g. the evidential K-nearest neighbors [14], the evidential neural network [19], etc. In these classifiers, an extra ignorance element, represented by the disjunction of all the elements of the whole frame of discernment, is introduced to capture totally ignorant information. However, the partial imprecision, which is very important in classification, is not well characterized. That is why we have proposed new credal classifiers in [15]-[17], [22]. Our new classifiers take into account all the possible meta-classes (i.e. the particular disjunctions of several singleton classes) to model partially imprecise information thanks to belief functions. The credal classification allows the objects to belong (with different masses of belief) not only to the

singleton classes, but also to any set of classes corresponding to the meta-classes. In our recent research works, a prototype-based credal classification (PCC) [25] method for incomplete patterns has been introduced to capture the imprecision of classification caused by the missing values. Objects that are hard to classify correctly are committed by PCC to a suitable meta-class, which captures well the imprecision of classification caused by the missing values and also reduces the misclassification errors. In PCC, the missing values in all the incomplete patterns are imputed using the prototype of each class, and the edited pattern resulting from each imputation is respectively classified by a standard classifier (used for the classification of complete patterns). With PCC, one obtains c pieces of classification results for one incomplete pattern in a c-class problem, and the global fusion of the c results is used for the credal classification. Unfortunately, the PCC classifier is computationally greedy and time-consuming, and imputing the missing values from the prototype of each class is not very precise or accurate. That is why we propose a new, innovative and more effective method for the credal classification of incomplete patterns with adaptive imputation of missing values, called Credal Classification with Adaptive Imputation (CCAI) for short.

The pattern to classify usually consists of multiple attributes. Sometimes, the class of the pattern can be precisely determined using only a part (a subset) of the available attributes, which means that the other attributes are redundant and in fact unnecessary for the classification. In the classification of an incomplete pattern with missing values, one can attempt at first to classify the object using only the known attribute values. If a specific classification result is obtained, it very likely means that the missing values are not necessary for the classification, and we directly make the decision on the class of the object based on this result.
However, if the object cannot be clearly classified with the available information, it means that the information included in the missing attribute values is probably crucial for making the classification. In this case, we propose a more sophisticated classification strategy for the edited pattern, with a proper imputation of the missing values obtained using the K-NN and self-organizing map (SOM) techniques [26]. An information fusion technique is adopted in the classification of the original incomplete pattern (without imputation of the missing values) or of the edited pattern (with imputation of the missing values) to obtain good results. One respectively gets a simple classification result, represented by a simple basic belief assignment (BBA), according to each training class. The global fusion (ensemble) of these multiple BBA's with a proper combination rule, i.e. the Dempster-Shafer (DS) rule or a new rule inspired by the Dubois-Prade (DP) rule depending on the actual case, is then used to determine the class of the object.

This paper is organized as follows. The basics of belief function theory are briefly recalled in Section II. The new credal classification method for incomplete patterns is presented in Section III, and the proposed method is then tested and evaluated in Section IV, where it is compared with several other classical methods. Conclusions are drawn in the final section.

II. BASIS OF BELIEF FUNCTION THEORY

The Belief Function Theory (BFT) is also known as Dempster-Shafer Theory (DST), or the Mathematical Theory of Evidence [10]-[12]. Let us consider a frame of discernment consisting of c exclusive and exhaustive hypotheses (classes) denoted by Ω = {ω_i, i = 1, 2, ..., c}. The power-set of Ω, denoted 2^Ω, is the set of all the subsets of Ω, empty set included. In a classification problem, a singleton element (e.g. ω_i) represents a specific class. In this work, the disjunction (union) of several singleton elements is called a meta-class, which characterizes partial ignorance of the classification. In BFT, a basic belief assignment (BBA) is a function m(.) from 2^Ω to [0, 1] satisfying m(∅) = 0 and the normalization condition Σ_{A ∈ 2^Ω} m(A) = 1.
The subsets A of Ω such that m(A) > 0 are called the focal elements of the belief mass m(.). A credal classification (or partitioning) [20], [21] is defined as an n-tuple M = (m_1, ..., m_n) of BBA's, where m_i is the basic belief assignment of the object x_i ∈ X, i = 1, ..., n, associated with the different elements of the power-set 2^Ω. The credal classification can well model imprecise and uncertain information thanks to the introduction of meta-classes.

For combining multiple sources of evidence represented by a set of BBA's, the well-known Dempster's rule [10] is still widely used. We denote it by DS (standing for Dempster-Shafer) because Dempster's rule has been widely promoted by Shafer in [10]. The combination of two BBA's m_1(.) and m_2(.) over 2^Ω is done with the DS rule of combination, defined by m_DS(∅) = 0 and, for A, B, C ∈ 2^Ω,

m_DS(A) = [ Σ_{B∩C=A} m_1(B) m_2(C) ] / [ 1 - Σ_{B∩C=∅} m_1(B) m_2(C) ]   (1)

The DS rule is commutative and associative, and makes a compromise between specificity and complexity for the combination of BBA's. However, the DS rule produces unreasonable results in highly conflicting cases, as well as in some special low-conflict cases [27]. Many alternative rules have been proposed to overcome the limitations of the DS rule, e.g. the Dubois-Prade (DP) rule [28] and the Proportional Conflict Redistribution (PCR) rules [29]. Our method is inspired by the DP rule [28], defined by m_DP(∅) = 0 and, for A, B, C ∈ 2^Ω,

m_DP(A) = Σ_{B∩C=A} m_1(B) m_2(C) + Σ_{B∪C=A, B∩C=∅} m_1(B) m_2(C)   (2)

In the DP rule, the partial conflicting beliefs are all transferred to the union of the elements (i.e. the meta-class) involved in the partial conflict.

III. CREDAL CLASSIFICATION OF INCOMPLETE PATTERNS

Our new method consists of two main steps. In the first step, the object (incomplete pattern) is directly classified according

to the known attribute values only, and the missing values are ignored. If one can get a specific classification result, the classification procedure is done, because the available attribute information is sufficient for making the classification. But if the class of the object cannot be clearly identified in the first step, it means that the unavailable information included in the missing values is likely crucial for the classification. In this case, one has to enter the second step of the method and classify the object with a proper imputation of the missing values. In the classification procedure, the original or edited pattern is respectively classified according to each class of training data. The global fusion of these classification results, which can be considered as multiple sources of evidence represented by BBA's, is then used for the credal classification of the object. The new method is referred to as Credal Classification with Adaptive Imputation of missing values, denoted CCAI for conciseness.

A. Step 1: Direct classification of the incomplete pattern

Let us consider a set of test patterns (samples) X = {x_1, ..., x_n} to be classified based on a set of labeled training patterns Y = {y_1, ..., y_s} over the frame of discernment Ω = {ω_1, ..., ω_c}. In this work, we focus on the classification of incomplete patterns in which some attribute values are absent, so we consider that all the test patterns (e.g. x_i, i = 1, ..., n) have several missing values. The training data set Y may also contain incomplete patterns in some applications. However, if the incomplete patterns represent only a very small fraction, say less than 5%, of the training data set, they can be ignored in the classification. If the percentage of incomplete patterns is big, the missing values must usually be estimated at first, and the classifier is trained using the edited (complete) patterns. In real applications, one can also simply choose only complete labeled patterns for the training data set when the training information is sufficient. So, for simplicity and convenience, we consider in the sequel that the labeled samples (e.g.
y_j, j = 1, ..., s) of the training set Y are all complete patterns.

In the first step of the classification, the incomplete pattern, say x, is respectively classified according to each training class by a normal classifier (designed for complete patterns), and all the missing values are ignored here. In this work, we adopt a very simple classification method for computational convenience, and x is directly classified based on its distance to the prototype of each class. The prototypes {o_1, ..., o_c}, corresponding to the classes {ω_1, ..., ω_c}, are given by the arithmetic average vectors of the training patterns of each class. Mathematically, the prototype is computed, for g = 1, ..., c, by

o_g = (1/N_g) Σ_{y_j ∈ ω_g} y_j   (3)

where N_g is the number of training samples in class ω_g. In a c-class problem, one gets c pieces of simple classification results for x, one according to each class of training data, and each result is represented by a simple BBA including two focal elements, i.e. the singleton class and the ignorant class Ω, which characterizes full ignorance. The belief of x belonging to class ω_g is computed based on the distance between x and the corresponding prototype o_g. The Mahalanobis distance is adopted here to deal with anisotropic classes, and the missing values are ignored in the calculation of this distance. The remaining mass of belief is assigned to the ignorant class Ω. Therefore, the BBA construction is done by

m_og(ω_g) = exp(-η d_g),  m_og(Ω) = 1 - exp(-η d_g)   (4)

with

d_g = (1/p) Σ_{j=1..p} (x_j - o_gj)² / δ_gj   (5)

and

δ_gj = (1/N_g) Σ_{y_i ∈ ω_g} (y_ij - o_gj)²   (6)

where x_j is the value of x in the j-th dimension, y_ij is the value of y_i in the j-th dimension, and p is the number of available attribute values in the object x. The coefficient 1/p is necessary to normalize the distance value, because each test sample can have a different number of missing values. δ_gj is the average squared distance of all training samples of class ω_g to the prototype o_g in the j-th dimension, and N_g is the number of training samples in ω_g.
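Step 1 can be sketched in a few lines of Python. This is our own minimal illustration, not the authors' implementation: the function names are ours, BBA's are represented as dicts mapping frozensets of class indices to masses, η = 0.7 is the default value suggested later in the text, and ε = 0.3 is an arbitrary illustrative threshold. The sketch builds the c simple BBA's of eqs. (3)-(6), applies the distinguishability test of eq. (7) introduced below, and, when the test passes, fuses the BBA's with the DS rule (1).

```python
import numpy as np
from functools import reduce
from itertools import product

def class_prototypes(Y, labels, c):
    """Eq. (3): the prototype o_g is the mean of the class-g training patterns."""
    return np.array([Y[labels == g].mean(axis=0) for g in range(c)])

def simple_bba(x, Y, labels, o, g, eta=0.7):
    """Eqs. (4)-(6): simple BBA for x w.r.t. class g; NaN entries of x are ignored."""
    known = ~np.isnan(x)
    Yg = Y[labels == g][:, known]
    og = o[g][known]
    delta = ((Yg - og) ** 2).mean(axis=0)        # eq. (6)
    d = ((x[known] - og) ** 2 / delta).mean()    # eq. (5); .mean() gives the 1/p factor
    m = np.exp(-eta * d)                         # eq. (4)
    return {frozenset({g}): m, frozenset(range(len(o))): 1.0 - m}

def combine_ds(m1, m2):
    """Eq. (1): Dempster-Shafer combination of two BBA's (dicts on frozensets)."""
    raw, conflict = {}, 0.0
    for (B, b), (C, cc) in product(m1.items(), m2.items()):
        A = B & C
        if A:
            raw[A] = raw.get(A, 0.0) + b * cc
        else:
            conflict += b * cc
    return {A: v / (1.0 - conflict) for A, v in raw.items()}

def step1(x, Y, labels, c, eta=0.7, eps=0.3):
    """Direct classification: the fused BBA, or None when the distinguishability
    test of eq. (7) fails and the imputation of Step 2 is required."""
    o = class_prototypes(Y, labels, c)
    bbas = [simple_bba(x, Y, labels, o, g, eta) for g in range(c)]
    masses = sorted(m[frozenset({g})] for g, m in enumerate(bbas))
    chi = masses[-2] / masses[-1]                # eq. (7)
    if chi > eps:
        return None                              # ambiguous: go to Step 2
    return reduce(combine_ds, bbas)
```

On two well-separated 1-D-observable classes, `step1` returns a fused BBA concentrated on the correct singleton; on two overlapping classes with the test value halfway between the prototypes, it returns `None` and defers to Step 2.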
η is a tuning parameter, and a bigger η generally yields a smaller mass of belief on the specific class ω_g. Obviously, the smaller the distance measure, the bigger the mass of belief on the singleton class. This particular structure of the BBA indicates that we can only confirm the degree to which the object x is associated with the specific class ω_g according to the training data in ω_g. The other mass of belief reflects the level of belief one has in full ignorance, and it is committed to the ignorant class Ω. In this way, one calculates c independent BBA's m_og(ω_g), g = 1, ..., c, based on the different training classes.

Before combining these c BBA's, we examine whether a specific classification result can be derived from them. This is done as follows: if m_o1st(ω_1st) = max_g(m_og(ω_g)), then the object is considered to belong very likely to the class ω_1st, which obtains the biggest mass of belief among the c BBA's. The class with the second biggest mass of belief is denoted ω_2nd. The distinguishability degree χ ∈ (0, 1] of an object x associated with the different classes is defined by:

χ = m_o2nd(ω_2nd) / m_o1st(ω_1st)   (7)

Let ε be a chosen small positive distinguishability threshold value in (0, 1]. If the condition χ ≤ ε is satisfied, it means that all the classes involved in the computation of χ can be clearly distinguished for x. In this case, it is very likely that a specific classification result can be obtained from the fusion of the c BBA's. The condition χ ≤ ε also indicates that the available attribute information is sufficient for making the

classification of the object, and the imputation of the missing values is not necessary. If the condition χ ≤ ε holds, the c BBA's are directly combined with the DS rule (1) to obtain the final classification result of the object, because the DS rule usually produces a specific combination result with an acceptable computational burden in low-conflict cases. In such a case, no meta-class is included in the fusion result, because the different classes are considered distinguishable based on the distinguishability condition. Moreover, the mass of belief of the full ignorance class Ω, which represents noisy data (outliers), can be proportionally redistributed to the singleton classes to obtain more specific results, if one knows a priori that no noisy data is involved.

If the distinguishability condition χ ≤ ε is not satisfied, it means that the classes ω_1st and ω_2nd cannot be clearly distinguished for the object with respect to the chosen threshold value ε, indicating that the missing attribute values almost surely play a crucial role in the classification. In this case, the missing values must be properly imputed to recover the unavailable attribute information before entering the classification procedure. This is Step 2 of our method, which is explained in the next subsection.

B. Step 2: Classification of the incomplete pattern with imputation of missing values

1) Multiple estimation of missing values: Various methods exist for the estimation of missing attribute values. In particular, the K-NN imputation method generally provides good performance. However, the main drawback of the K-NN method is its big computational burden, since one needs to calculate the distances between the object and all the training samples. Inspired by [30], we propose to use the Self-Organizing Map (SOM) technique [26], [30] to reduce the computational complexity. A SOM is trained on each class of training data, and M × N weighting vectors are obtained after the optimization procedure. These optimized weighting vectors characterize well the topological features of the whole class, and they are used to represent the corresponding data class.
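To make the role of the SOM concrete, here is a minimal online SOM sketch of our own (not the specific algorithm of [26], [30]; names, learning-rate and radius schedules are illustrative assumptions). It produces the M × N weighting vectors that summarize one training class.

```python
import numpy as np

def train_som(data, M=3, N=4, iters=500, seed=0):
    """Minimal online SOM sketch: returns M*N weighting vectors that
    summarize one training class (data: samples as rows)."""
    rng = np.random.default_rng(seed)
    # initialize the weighting vectors from randomly drawn samples
    W = data[rng.integers(0, len(data), M * N)].astype(float)
    grid = np.array([(i, j) for i in range(M) for j in range(N)], float)
    for t in range(iters):
        x = data[rng.integers(0, len(data))]
        lr = 0.5 * (1.0 - t / iters)                         # decaying learning rate
        sigma = 1.0 + (max(M, N) - 1.0) * (1.0 - t / iters)  # decaying radius
        bmu = np.argmin(((W - x) ** 2).sum(axis=1))          # best matching unit
        # Gaussian neighborhood on the 2-D grid around the BMU
        h = np.exp(-((grid - grid[bmu]) ** 2).sum(axis=1) / (2 * sigma ** 2))
        W += lr * h[:, None] * (x - W)                       # pull neighbors toward x
    return W
```

Since every update is a convex combination of a weighting vector and a training sample, the learned vectors stay inside the per-dimension range of the class data, which is what makes them usable as compact class representatives in the next subsection.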
The number of weighting vectors is usually small (e.g. 5 or 6). So the K nearest neighbors of the test pattern among these SOM weighting vectors can easily be found with a low computational complexity.¹ The selected weighting vector no. k in the class ω_g, g = 1, ..., c, is denoted σ_k^ωg, for k = 1, ..., K. In each class, the K selected close weighting vectors provide different contributions (weights) to the estimation of the missing values. The weight p_k^ωg of each vector is defined based on the distance between the object x and the weighting vector σ_k^ωg as follows:

p_k^ωg = exp(-λ d_k^ωg)   (8)

with

λ = cMN(cMN - 1) / (2 Σ_{i,j} d(σ_i, σ_j))   (9)

where d_k^ωg is the Euclidean distance between x and the neighbor σ_k^ωg, ignoring the missing values, and 1/λ is the average distance between each pair of weighting vectors produced by SOM over all the classes; c is the number of classes; M × N is the number of weighting vectors obtained by SOM in each class; and d(σ_i, σ_j) is the Euclidean distance between any two weighting vectors σ_i and σ_j. The weighted mean value ŷ^ωg of the K selected weighting vectors of the training class ω_g is used for the imputation of the missing values. It is calculated by

ŷ^ωg = ( Σ_{k=1..K} p_k^ωg σ_k^ωg ) / ( Σ_{k=1..K} p_k^ωg )   (10)

The missing values of x are filled with the values of ŷ^ωg in the same dimensions. By doing this, we get the edited pattern x^ωg according to the training class ω_g. Then x^ωg is simply classified based only on the training data in ω_g, similarly to the direct classification of the incomplete pattern using eq. (4) of Step 1, for convenience.² The classification of x with the estimation of the missing values is also respectively done based on the other training classes according to this procedure.

¹ The training of the SOM using the labeled patterns becomes time-consuming when the number of labeled patterns is big, but fortunately it can be done offline. In our experiments, the running time performance shown in the results does not include the computational time spent on the off-line procedures.
For a c-class problem, there are c training classes, and therefore one gets c pieces of classification results for one object.

2) Ensemble classifier for credal classification: These c pieces of results, obtained from each class of training data in a c-class problem, are considered with different weights, since the estimations of the missing values according to the different classes have different reliabilities. The weighting factor of the classification result associated with the class ω_g can be defined by the sum of the weights of the K selected SOM weighting vectors contributing to the imputation of the missing values in ω_g, which is given by

ρ^ωg = Σ_{k=1..K} p_k^ωg   (11)

The result with the biggest weighting factor ρ^ωmax is considered the most reliable, because one assumes that the object must belong to one of the labeled classes (i.e. ω_g, g = 1, ..., c). So the biggest weighting factor is normalized to one. The other relative weighting factors are defined by:

α̂^ωg = ρ^ωg / ρ^ωmax   (12)

² Of course, some other sophisticated classifiers could also be applied here according to the choice of the user, but the choice of the classifier is not the main purpose of this work.
³ The threshold ε is the same as in Section III-A, because it is also used to measure the distinguishability degree here.

If the condition³ α̂^ωg < ε is satisfied, the corresponding estimation of the missing values and the classification result

are not very reliable: very likely, the object does not belong to this class. It is implicitly assumed that the object can in reality belong to only one class. If this result, whose relative weighting factor is very small (w.r.t. ε), were still considered useful, it would be (more or less) harmful for the final classification of the object. So, if the condition α̂^ωg < ε holds, the relative weighting factor is set to zero. More precisely, we take

α^ωg = 0 if α̂^ωg < ε;  α^ωg = ρ^ωg / ρ^ωmax otherwise.   (13)

After the estimation of the weighting (discounting) factors α^ωg, the c classification results (the BBA's m_og(.)) are classically discounted [10] by

m̂_og(ω_g) = α^ωg m_og(ω_g),  m̂_og(Ω) = 1 - α^ωg + α^ωg m_og(Ω)   (14)

These discounted BBA's are then globally combined to get the credal classification result. If α^ωg = 0, one gets m̂_og(Ω) = 1, and this fully ignorant (vacuous) BBA plays a neutral role in the global fusion process for the final classification of the object.

Although we have done our best to estimate the missing values, the estimation can be quite imprecise when estimations obtained from different classes have similar weighting factors, and the different estimations probably lead to distinct classification results. In such a case, we prefer to cautiously keep (rather than ignore) the uncertainty, and to maintain the uncertainty in the classification result. Such uncertainty can be well reflected by the conflict between these classification results represented by the BBA's. The DS rule is not suitable here, because all the conflicting beliefs are redistributed to the other focal elements. A particular combination rule inspired by the DP rule is therefore introduced to fuse these BBA's according to the current context. In our new rule, the partial conflicting beliefs are prudently transferred to the proper meta-class to reveal the degree of imprecision of the classification caused by the missing values. This new rule of combination is defined by:

m(ω_g) = m̂_og(ω_g) Π_{j≠g} m̂_oj(Ω)

m(A) = Π_{ω_j ⊆ A} m̂_oj(ω_j) Π_{ω_k ⊄ A} m̂_ok(Ω),  for a meta-class A = ∪_j ω_j   (15)

The global fusion formula (15) consists of two parts.
In the first part, the conjunctive combination is used to commit the mass of belief to the specific (singleton) class, whereas in the second part a disjunctive combination transfers the conflicting beliefs to the proper meta-class. The test pattern can then be classified according to the fusion result, and the object is considered as belonging to the class (singleton class or meta-class) with the maximum mass of belief. This is called hard credal classification. If an object is classified into a particular class, it means that this object has been correctly classified with the proper imputation of its missing values. If an object is committed to a meta-class (e.g. A ∪ B), it means that we only know that this object belongs to one of the specific classes (e.g. A or B) included in the meta-class, but we cannot specify which one. This case can happen when the missing values are essential for the accurate classification of this object but cannot be estimated very well according to the context, so that different estimations would induce the classification of the object into distinct classes (e.g. A or B).

With traditional classifiers, the missing values of each object are usually estimated before making the classification. In our CCAI approach, many objects can be directly classified based on their distances to the class prototypes, in which case the imputation of the missing values is skipped according to the context. So the computational complexity of CCAI is generally relatively low with respect to other methods like KNNI, PCC, etc.

Guideline for the tuning of the parameters ε and η: η in eq. (4) is associated with the calculation of the mass of belief on the specific class, and a bigger η value leads to a smaller mass of belief committed to the specific class. We advise to take η ∈ [0.5, 0.8], and the value η = 0.7 can be taken as the default value. The parameter ε is the threshold for changing the classification strategy. It is also used in eq. (13) for the calculation of the discounting factor.
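The ensemble step of eqs. (13)-(15) can be sketched as below. This is our own reading of eq. (15), under the assumptions that each discounted BBA has only the two focal elements ω_g and Ω (as constructed in eq. (4)), that each conflicting product is transferred to the union of the singletons involved, and that the all-Ω product stays on full ignorance; the function name is ours.

```python
from itertools import combinations
import numpy as np

def ccai_fuse(m_class, rho, eps=0.3):
    """Eqs. (13)-(15): discount the c simple BBA's by the relative
    weighting factors, then combine with the DP-inspired rule.
    m_class[g] is m_og(w_g); the rest of each BBA lies on Omega.
    Returns masses on singletons and meta-classes (frozensets)."""
    rho = np.asarray(rho, float)
    alpha = rho / rho.max()
    alpha[alpha < eps] = 0.0                      # eq. (13)
    mh = alpha * np.asarray(m_class, float)       # eq. (14), singleton part
    mh_om = 1.0 - mh                              # eq. (14), mass left on Omega
    c = len(mh)
    # the all-Omega product stays on full ignorance (the whole frame)
    out = {frozenset(range(c)): float(np.prod(mh_om))}
    for r in range(1, c + 1):
        for J in combinations(range(c), r):       # singleton (r=1) or meta-class
            m = np.prod([mh[j] for j in J]) * \
                np.prod([mh_om[k] for k in range(c) if k not in J])
            if m > 0:
                A = frozenset(J)
                out[A] = out.get(A, 0.0) + float(m)
    return out
```

`max(out, key=out.get)` then implements the hard credal classification by the maximal mass of belief: with two strong, similarly weighted singleton supports, the winning focal element is their meta-class, as intended by eq. (15).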
A bigger ε makes fewer objects committed to the meta-classes (corresponding to a lower imprecision of the classification), but it increases the risk of misclassification errors. ε should be tuned according to the compromise one can accept between the misclassification error and the imprecision.

IV. EXPERIMENTS

Two experiments with artificial and real data sets have been used to test the performance of the new CCAI method compared with the K-NN imputation (KNNI) method [5], the FCM imputation (FCMI) method [6], [7] and our previous credal classification method PCC [25]. The evidential neural network classifier (ENN) [19] is adopted here to classify the edited patterns with the estimated values in PCC, KNNI and FCMI, since ENN generally produces good classification results. The parameters of ENN can be automatically optimized as explained in [19]. In the applications of PCC, the tuning parameter ε can be adjusted according to the imprecision rate one can accept. In CCAI, a small number of nodes in the 2-dimensional grid of the SOM is given by M × N = 3 × 4, and we take the value K = N = 4 in the K-NN step of the imputation of the missing values. This seems to provide good performance in the following experiments. In order to show the ability of CCAI and PCC to deal with the meta-classes, hard credal classification is applied, and the class of each object is decided according to the criterion of the maximal mass of belief. In our simulations, a misclassification is declared (counted) for an object truly originating from ω_i if it is classified into A with ω_i ∩ A = ∅. If ω_i ∩ A ≠ ∅ and A ≠ ω_i, then it is considered as an imprecise classification. The error rate, denoted Re, is calculated by Re = Ne/T, where Ne is the number of misclassification errors and T is the number

of objects under test. The imprecision rate, denoted R_j, is calculated by R_j = N_j/T, where N_j is the number of objects committed to the meta-classes of cardinality j. In our experiments, the classification of an object is generally uncertain (imprecise) among a very small number (e.g. 2) of classes, and we only report R_2 here, since no object is committed to a meta-class including three or more specific classes.

A. Experiment 1 (artificial data set)

In the first experiment, we show the interest of the credal classification based on belief functions with respect to the traditional classification working within the probability framework. A 3-class data set Ω = {ω_1, ω_2, ω_3} obtained from three 2-D uniform distributions is considered here. Each class has 200 training samples and 200 test samples, so there are 600 training samples and 600 test samples in total, as shown in Fig. 1.

Figure 1. Training data and test data.

The uniform distributions of the three classes are characterized by the following interval bounds:

class   x interval   y interval
ω_1     (5, 65)      (5, 25)
ω_2     (95, 155)    (5, 25)
ω_3     (50, 110)    (50, 70)

The values in the second dimension, corresponding to the y-coordinate of the test samples, are all missing. So the test samples are classified according to the only available value, in the first dimension, corresponding to the x-coordinate. A particular value of K = 9 is selected in the K-NN imputation method.⁴ The classification results of the test objects by the different methods are given in Fig. 2 (a)-(c). For notational conciseness, we denote ω_i^te = ω_i^test, ω_i^tr = ω_i^training and ω_{i,...,k} = ω_i ∪ ... ∪ ω_k. The error rate (in %) and the imprecision rate (in %) are specified in the caption of each subfigure.

Figure 2. Classification results of the 3-class artificial data set by the different methods: (a) FCMI (Re = 14.67); (b) KNNI (Re = 14.17); (c) CCAI (Re = 5.83, R_2 = 16.83).

⁴ In fact, the choice of K ranging from 7 to 15 does not seriously affect the results.
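The error and imprecision rates defined above can be computed directly from hard credal decisions. A minimal sketch of our own (names are hypothetical; a prediction is the set of classes of the chosen focal element):

```python
def credal_rates(truths, predictions):
    """Re = Ne/T and R_j = Nj/T (Section IV): a prediction A counts as an
    error when the true class is not in A, and as a j-fold imprecise
    classification when A contains the true class among j classes."""
    T = len(truths)
    Ne = sum(1 for t, A in zip(truths, predictions) if t not in A)
    Nj = {}
    for t, A in zip(truths, predictions):
        if t in A and len(A) > 1:
            Nj[len(A)] = Nj.get(len(A), 0) + 1
    return Ne / T, {j: n / T for j, n in Nj.items()}
```

For instance, with four objects of which one is assigned to a wrong singleton and two land in a correct 2-class meta-class, the sketch returns Re = 0.25 and R_2 = 0.5.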
Because the y value of each test sample is missing, the class ω_3 appears partially overlapped with the classes ω_1 and ω_2 on their margins according to the value of the x-coordinate, as shown in Fig. 1. The missing value of the samples in the overlapped parts can be filled by quite different estimations obtained from different classes with almost the same reliabilities. For example, the estimation of the missing value of the objects in the right margin of ω_1 and the left margin of ω_3 can be obtained according to the training class ω_1 or ω_3. The edited pattern with the estimation from ω_1 will be classified into

7 class, whereas t wll be commtted to class f the estmaton s drawn from. It s smlar to the test samples n the left margn of and the rght margn of. Ths ndcates that the mssng value play a crucal rule n the classfcaton of these objects, but unfortunately the estmaton of these nvolved mssng values are qute uncertan accordng to context. So these objects are prudently classfed nto the proper meta-class (e.g. and ) by CCAI. The CCAI results ndcate that these objects belong to one of the specfc classes ncluded n the meta-classes, but these specfc classes cannot be clearly dstngushed by the object based only on the avalable values. If one wants to get more precse and accurate classfcaton results, one needs to request for addtonal resources for gatherng more useful nformaton. The other objects n the left margn of, rght margn of and mddle of can be correctly classfed based on the only known value n x-coordnate, and t s not necessary to estmate the mssng value for the classfcaton of these objects n CCAI. However, all the test samples are classfed nto specfc classes by the tradtonal methods KNNI and FCMI, and ths causes many errors due to the lmtaton of probablty framework. Thus, CCAI produces less error rate than KNNI and FCMI thanks to the use of meta-classes. Meanwhle, the computatonal tme of CCAI s smlar to that of FCMI, and s much shorter than KNNI because of the ntroducton of SOM technque n the estmaton of mssng values. It shows that the computatonal complexty of CCAI s relatvely low. Ths smple example shows the nterest and the potental of the credal classfcaton obtaned wth CCAI method. B. Experment 2 (real data set) Four well known real data sets (Breast cancer, Irs, Seeds and Wne data sets) avalable from UCI Machne Learnng Repostory [32] are used n ths experment to evaluate the performance of CCAI wth respect to KNNI, FCMI and PCC. ENN s also used here as standard classfer. The basc nformaton of these four real data sets s gven n Table I. 
The cross validation is performed on all the data sets, and we use the simplest 2-fold cross validation⁵ here, since it has the advantage that the training and test sets are both large, and each sample is used for both training and testing on each fold. Each test sample has n missing (unknown) values, which are missing completely at random in every dimension. The average error rate Re and imprecision rate R_2 (for PCC and CCAI) of the different methods are given in Table II. The reported classification result of KNNI is the average over K values ranging from 5 to 15.

⁵ More precisely, the samples in each class are randomly assigned to two sets S_1 and S_2 of equal size. Then we train on S_1 and test on S_2, and reciprocally.

Table I
BASIC INFORMATION OF THE USED DATA SETS.

name         classes   attributes   instances
Breast (B)
Iris (I)
Seeds (S)
Wine (W)

Table II
CLASSIFICATION RESULTS FOR THE DIFFERENT REAL DATA SETS (IN %).

data   n   FCMI Re   KNNI Re   PCC {Re, R_2}    CCAI {Re, R_2}
B                              {3.81, 2.34}     {3.66, 0}
B                              {5.42, 1.32}     {4.83, 1.61}
B                              {10.10, 2.64}    {9.00, 0.66}
I                              {5.33, 2.67}     {4.00, 1.33}
I                              {8.67, 4.00}     {8.00, 4.67}
I                              {12.67, 9.33}    {11.33, 12}
S                              {9.52, 4.76}     {9.52, 0}
S                              {10.48, 4.29}    {10.00, 0.48}
S                              {16.19, 14.76}   {16.19, 13.81}
W                              {26.97, 1.69}    {6.74, 1.12}
W                              {29.78, 2.25}    {7.30, 3.93}
W                              {30.34, 2.81}    {12.36, 3.93}

One can see that the credal classification of PCC and CCAI always produces a lower error rate than the traditional FCMI and KNNI methods, since some objects that cannot be correctly classified using only the available attribute values have been properly committed to the meta-classes, which well reveals the imprecision of the classification. In CCAI, some objects are still classified into a meta-class even after the imputation of the missing values. This indicates that these missing values play a crucial role in the classification, but that their estimation is not very good. In other words, the missing values can be filled with similar reliabilities by different estimated data, which leads to distinct classification results.
So we have to cautiously assign them to the meta-class to reduce the risk of misclassification. Compared with our previous method PCC, this new method CCAI generally provides better performance, with lower error and imprecision rates, mainly because a more accurate estimation method (i.e. SOM+KNN) for the missing values is adopted in CCAI. This second experiment, using real data sets from different applications, shows the effectiveness and interest of this new CCAI method with respect to the other methods.

V. CONCLUSION

A fast credal classification method with adaptive imputation of missing values (called CCAI) for dealing with incomplete patterns has been presented. In step 1 of the CCAI method, some objects (incomplete patterns) are directly classified ignoring the missing values if a specific classification result can be obtained, which effectively reduces the computational complexity because it avoids the imputation of the missing values. However, if the available information is not sufficient to achieve a specific classification of the object, we estimate (recover) the missing values before entering the classification

procedure in the second step. The SOM and K-NN approaches are applied to estimate the missing attributes with a good compromise between estimation accuracy and computational burden. An information fusion technique is employed to combine the multiple simple classification results, respectively obtained from each training class, into the final credal classification of the object. The credal classification in this work allows the object to belong to different singleton classes and meta-classes with different masses of belief. Once the object is committed to a meta-class (e.g. A ∪ B), it means that the missing values cannot be accurately recovered according to the context, and the estimation is not very good: different estimations would lead the object to distinct classes (e.g. A or B) involved in the meta-class. So some other sources of information will be required to achieve a more precise classification of the object if necessary. Two experiments have been carried out to test the performance of the CCAI method with artificial and real data sets. The results show that the credal classification is able to well capture the imprecision of classification and effectively reduces the misclassification errors as well.

Acknowledgements

This work has been partially supported by the National Natural Science Foundation of China, the Fundamental Research Funds for the Central Universities (No. JCQ01067) and the Natural Science Foundation of Shaanxi (No. 2015JQ6265).

REFERENCES

[1] P. Garcia-Laencina, J. Sancho-Gomez, A. Figueiras-Vidal, Pattern classification with missing data: a review, Neural Computing and Applications, Vol. 19.
[2] R.J. Little, D.B. Rubin, Statistical Analysis with Missing Data, John Wiley & Sons, New York, 1987 (second edition published in 2002).
[3] A. Farhangfar, L. Kurgan, J. Dy, Impact of imputation of missing values on classification error for discrete data, Pattern Recognition, Vol. 41.
[4] D.J. Mundfrom, A. Whitcomb, Imputing missing values: The effect on the accuracy of classification, Multiple Linear Regression Viewpoints, Vol. 25(1):13-19.
[5] G. Batista, M.C. Monard, A study of K-nearest neighbour as an imputation method, in Proc. of Second International Conference on Hybrid Intelligent Systems (IOS Press, Vol. 87).
[6] J. Luengo, J.A. Saez, F. Herrera, Missing data imputation for fuzzy rule-based classification systems, Soft Computing, Vol. 16(5).
[7] D. Li, J. Deogun, W. Spaulding, B. Shuart, Towards missing data imputation: a study of fuzzy k-means clustering method, in 4th International Conference on Rough Sets and Current Trends in Computing (RSCTC04).
[8] Z. Ghahramani, M.I. Jordan, Supervised learning from incomplete data via an EM approach, in: J.D. Cowan et al. (Eds.), Advances in Neural Information Processing Systems, Morgan Kaufmann, Vol. 6.
[9] K. Pelckmans, J.D. Brabanter, J.A.K. Suykens, B.D. Moor, Handling missing values in support vector machine classifiers, Neural Networks, Vol. 18(5-6).
[10] G. Shafer, A Mathematical Theory of Evidence, Princeton University Press.
[11] F. Smarandache, J. Dezert (Editors), Advances and Applications of DSmT for Information Fusion, American Research Press, Rehoboth, Vols. 1-4.
[12] P. Smets, The combination of evidence in the transferable belief model, IEEE Trans. on Pattern Analysis and Machine Intelligence, Vol. 12(5), 1990.
[13] A.-L. Jousselme, C. Liu, D. Grenier, E. Bossé, Measuring ambiguity in the evidence theory, IEEE Trans. on Systems, Man and Cybernetics, Part A, Vol. 36(5), Sept.
[14] T. Denœux, A k-nearest neighbor classification rule based on Dempster-Shafer theory, IEEE Trans. on Systems, Man and Cybernetics, Vol. 25(5).
[15] Z.-g. Liu, Q. Pan, J. Dezert, A new belief-based K-nearest neighbor classification method, Pattern Recognition, Vol. 46(3).
[16] Z.-g. Liu, Q. Pan, J. Dezert, G. Mercier, Fuzzy-belief K-nearest neighbor classifier for uncertain data, in Proc. of 17th Int. Conference on Information Fusion (Fusion 2014), Spain, July 7-10.
[17] Z.-g. Liu, Q. Pan, J. Dezert, G. Mercier, Credal classification rule for uncertain data based on belief functions, Pattern Recognition, Vol. 47(7), 2014.
[18] T. Denœux, P. Smets, Classification using belief functions: relationship between case-based and model-based approaches, IEEE Trans. on Systems, Man and Cybernetics, Part B, Vol. 36(6).
[19] T. Denœux, A neural network classifier based on Dempster-Shafer theory, IEEE Trans. on Systems, Man and Cybernetics, Part A, Vol. 30(2).
[20] M.-H. Masson, T. Denœux, ECM: An evidential version of the fuzzy c-means algorithm, Pattern Recognition, Vol. 41(4).
[21] T. Denœux, M.-H. Masson, EVCLUS: EVidential CLUStering of proximity data, IEEE Trans. on Systems, Man and Cybernetics, Part B, Vol. 34(1):95-109.
[22] Z.-g. Liu, Q. Pan, J. Dezert, G. Mercier, Credal c-means clustering method based on belief functions, Knowledge-Based Systems, Vol. 74.
[23] T. Denœux, Maximum likelihood estimation from uncertain data in the belief function framework, IEEE Transactions on Knowledge and Data Engineering, Vol. 25(1).
[24] Z.-g. Liu, J. Dezert, Q. Pan, G. Mercier, Combination of sources of evidence with different discounting factors based on a new dissimilarity measure, Decision Support Systems, Vol. 52.
[25] Z.-g. Liu, Q. Pan, G. Mercier, J. Dezert, A new incomplete pattern classification method based on evidential reasoning, IEEE Trans. on Cybernetics, Vol. 45(4).
[26] T. Kohonen, The self-organizing map, Proceedings of the IEEE, Vol. 78(9).
[27] J. Dezert, A. Tchamova, On the validity of Dempster's fusion rule and its interpretation as a generalization of Bayesian fusion rule, International Journal of Intelligent Systems, Vol. 29(3).
[28] D. Dubois, H. Prade, Representation and combination of uncertainty with belief functions and possibility measures, Computational Intelligence, Vol. 4(4).
[29] F. Smarandache, J. Dezert, On the consistency of PCR6 with the averaging rule and its application to probability estimation, in Proc. of 16th Int. Conference on Information Fusion (Fusion 2013), Istanbul, Turkey, July 9-12.
[30] I. Hammami, J. Dezert, G. Mercier, A. Hamouda, On the estimation of mass functions using Self Organizing Maps, in Proc. of Belief 2014 Conf., Oxford, UK, Sept.
[31] S. Geisser, Predictive Inference: An Introduction, Chapman and Hall, New York, NY.
[32] A. Frank, A. Asuncion, UCI Machine Learning Repository, University of California, School of Information and Computer Science, Irvine, CA, USA.


More information

LECTURE 9 CANONICAL CORRELATION ANALYSIS

LECTURE 9 CANONICAL CORRELATION ANALYSIS LECURE 9 CANONICAL CORRELAION ANALYSIS Introducton he concept of canoncal correlaton arses when we want to quantfy the assocatons between two sets of varables. For example, suppose that the frst set of

More information

Lecture 12: Discrete Laplacian

Lecture 12: Discrete Laplacian Lecture 12: Dscrete Laplacan Scrbe: Tanye Lu Our goal s to come up wth a dscrete verson of Laplacan operator for trangulated surfaces, so that we can use t n practce to solve related problems We are mostly

More information

Primer on High-Order Moment Estimators

Primer on High-Order Moment Estimators Prmer on Hgh-Order Moment Estmators Ton M. Whted July 2007 The Errors-n-Varables Model We wll start wth the classcal EIV for one msmeasured regressor. The general case s n Erckson and Whted Econometrc

More information

Speeding up Computation of Scalar Multiplication in Elliptic Curve Cryptosystem

Speeding up Computation of Scalar Multiplication in Elliptic Curve Cryptosystem H.K. Pathak et. al. / (IJCSE) Internatonal Journal on Computer Scence and Engneerng Speedng up Computaton of Scalar Multplcaton n Ellptc Curve Cryptosystem H. K. Pathak Manju Sangh S.o.S n Computer scence

More information

CS47300: Web Information Search and Management

CS47300: Web Information Search and Management CS47300: Web Informaton Search and Management Probablstc Retreval Models Prof. Chrs Clfton 7 September 2018 Materal adapted from course created by Dr. Luo S, now leadng Albaba research group 14 Why probabltes

More information

More metrics on cartesian products

More metrics on cartesian products More metrcs on cartesan products If (X, d ) are metrc spaces for 1 n, then n Secton II4 of the lecture notes we defned three metrcs on X whose underlyng topologes are the product topology The purpose of

More information

Department of Computer Science Artificial Intelligence Research Laboratory. Iowa State University MACHINE LEARNING

Department of Computer Science Artificial Intelligence Research Laboratory. Iowa State University MACHINE LEARNING MACHINE LEANING Vasant Honavar Bonformatcs and Computatonal Bology rogram Center for Computatonal Intellgence, Learnng, & Dscovery Iowa State Unversty honavar@cs.astate.edu www.cs.astate.edu/~honavar/

More information

Module 9. Lecture 6. Duality in Assignment Problems

Module 9. Lecture 6. Duality in Assignment Problems Module 9 1 Lecture 6 Dualty n Assgnment Problems In ths lecture we attempt to answer few other mportant questons posed n earler lecture for (AP) and see how some of them can be explaned through the concept

More information

Calculation of time complexity (3%)

Calculation of time complexity (3%) Problem 1. (30%) Calculaton of tme complexty (3%) Gven n ctes, usng exhaust search to see every result takes O(n!). Calculaton of tme needed to solve the problem (2%) 40 ctes:40! dfferent tours 40 add

More information

UNIVERSITY OF TORONTO Faculty of Arts and Science. December 2005 Examinations STA437H1F/STA1005HF. Duration - 3 hours

UNIVERSITY OF TORONTO Faculty of Arts and Science. December 2005 Examinations STA437H1F/STA1005HF. Duration - 3 hours UNIVERSITY OF TORONTO Faculty of Arts and Scence December 005 Examnatons STA47HF/STA005HF Duraton - hours AIDS ALLOWED: (to be suppled by the student) Non-programmable calculator One handwrtten 8.5'' x

More information

Instance-Based Learning (a.k.a. memory-based learning) Part I: Nearest Neighbor Classification

Instance-Based Learning (a.k.a. memory-based learning) Part I: Nearest Neighbor Classification Instance-Based earnng (a.k.a. memory-based learnng) Part I: Nearest Neghbor Classfcaton Note to other teachers and users of these sldes. Andrew would be delghted f you found ths source materal useful n

More information

Statistics for Economics & Business

Statistics for Economics & Business Statstcs for Economcs & Busness Smple Lnear Regresson Learnng Objectves In ths chapter, you learn: How to use regresson analyss to predct the value of a dependent varable based on an ndependent varable

More information

Statistics II Final Exam 26/6/18

Statistics II Final Exam 26/6/18 Statstcs II Fnal Exam 26/6/18 Academc Year 2017/18 Solutons Exam duraton: 2 h 30 mn 1. (3 ponts) A town hall s conductng a study to determne the amount of leftover food produced by the restaurants n the

More information

The L(2, 1)-Labeling on -Product of Graphs

The L(2, 1)-Labeling on -Product of Graphs Annals of Pure and Appled Mathematcs Vol 0, No, 05, 9-39 ISSN: 79-087X (P, 79-0888(onlne Publshed on 7 Aprl 05 wwwresearchmathscorg Annals of The L(, -Labelng on -Product of Graphs P Pradhan and Kamesh

More information

Outline. Multivariate Parametric Methods. Multivariate Data. Basic Multivariate Statistics. Steven J Zeil

Outline. Multivariate Parametric Methods. Multivariate Data. Basic Multivariate Statistics. Steven J Zeil Outlne Multvarate Parametrc Methods Steven J Zel Old Domnon Unv. Fall 2010 1 Multvarate Data 2 Multvarate ormal Dstrbuton 3 Multvarate Classfcaton Dscrmnants Tunng Complexty Dscrete Features 4 Multvarate

More information

NP-Completeness : Proofs

NP-Completeness : Proofs NP-Completeness : Proofs Proof Methods A method to show a decson problem Π NP-complete s as follows. (1) Show Π NP. (2) Choose an NP-complete problem Π. (3) Show Π Π. A method to show an optmzaton problem

More information

Assortment Optimization under MNL

Assortment Optimization under MNL Assortment Optmzaton under MNL Haotan Song Aprl 30, 2017 1 Introducton The assortment optmzaton problem ams to fnd the revenue-maxmzng assortment of products to offer when the prces of products are fxed.

More information

Learning Theory: Lecture Notes

Learning Theory: Lecture Notes Learnng Theory: Lecture Notes Lecturer: Kamalka Chaudhur Scrbe: Qush Wang October 27, 2012 1 The Agnostc PAC Model Recall that one of the constrants of the PAC model s that the data dstrbuton has to be

More information

Supplementary Notes for Chapter 9 Mixture Thermodynamics

Supplementary Notes for Chapter 9 Mixture Thermodynamics Supplementary Notes for Chapter 9 Mxture Thermodynamcs Key ponts Nne major topcs of Chapter 9 are revewed below: 1. Notaton and operatonal equatons for mxtures 2. PVTN EOSs for mxtures 3. General effects

More information

Formalisms For Fusion Belief in Design

Formalisms For Fusion Belief in Design XII ADM Internatonal Conference - Grand Hotel - Rmn Italy - Sept. 5 th -7 th, 200 Formalsms For Fuson Belef n Desgn Mchele Pappalardo DIMEC-Department of Mechancal Engneerng Unversty of Salerno, Italy

More information

Design and Optimization of Fuzzy Controller for Inverse Pendulum System Using Genetic Algorithm

Design and Optimization of Fuzzy Controller for Inverse Pendulum System Using Genetic Algorithm Desgn and Optmzaton of Fuzzy Controller for Inverse Pendulum System Usng Genetc Algorthm H. Mehraban A. Ashoor Unversty of Tehran Unversty of Tehran h.mehraban@ece.ut.ac.r a.ashoor@ece.ut.ac.r Abstract:

More information

Ensemble Methods: Boosting

Ensemble Methods: Boosting Ensemble Methods: Boostng Ncholas Ruozz Unversty of Texas at Dallas Based on the sldes of Vbhav Gogate and Rob Schapre Last Tme Varance reducton va baggng Generate new tranng data sets by samplng wth replacement

More information

A new construction of 3-separable matrices via an improved decoding of Macula s construction

A new construction of 3-separable matrices via an improved decoding of Macula s construction Dscrete Optmzaton 5 008 700 704 Contents lsts avalable at ScenceDrect Dscrete Optmzaton journal homepage: wwwelsevercom/locate/dsopt A new constructon of 3-separable matrces va an mproved decodng of Macula

More information

LOW BIAS INTEGRATED PATH ESTIMATORS. James M. Calvin

LOW BIAS INTEGRATED PATH ESTIMATORS. James M. Calvin Proceedngs of the 007 Wnter Smulaton Conference S G Henderson, B Bller, M-H Hseh, J Shortle, J D Tew, and R R Barton, eds LOW BIAS INTEGRATED PATH ESTIMATORS James M Calvn Department of Computer Scence

More information