Research Article [Dave, 1(2): April, 2012]
IJESRT: INTERNATIONAL JOURNAL OF ENGINEERING SCIENCES & RESEARCH TECHNOLOGY

Empirical Study On Error Correcting Output Code Based On Multiclass Classification
Ms. Devangini Dave, Prof. M. Samvatsar, Prof. P. K. Bhanoda
devangpdave@yahoo.co.in

Abstract
A common way to address a multi-class classification problem is to design a model that consists of hand-picked binary classifiers and to combine them so as to solve the problem. Error-Correcting Output Codes (ECOC) is one such framework for dealing with multi-class classification problems. Recent work in the ECOC domain has shown promising results demonstrating improved performance. The ECOC framework is therefore a powerful tool for multi-class classification: its error-correcting ability improves and enhances the generalization ability of the base classifiers. This paper introduces the state-of-the-art coding designs (one-versus-one, one-versus-all, dense random, sparse random, DECOC, forest-ECOC, and ECOC-ONE) and decoding designs (Hamming, Euclidean, inverse Hamming, Laplacian, β-density, attenuated Euclidean, loss-based, probabilistic kernel-based, and loss-weighted), along with an empirical study of ECOC and a comparison of the various methods in the above context. Towards the end, our paper consolidates a comparison of various classification methods with the Error Correcting Output Code method available in Weka, after carrying out experiments with the Weka tool as a final supplement to our studies.

Index Terms: Coding, Decoding, Error Correcting Output Codes, Multiclass Classification.

Introduction
The task of supervised machine learning can be seen as the problem of finding an unknown function C(x) given a training set of example pairs ⟨x_i, C(x_i)⟩. C(x) is usually a set of discrete labels. For example, in face detection C(x) is a binary function, C(x) ∈ {face, non-face}; in optical digit recognition, C(x) ∈ {0, ..., 9}. Many techniques and algorithms have been proposed to address the binary classification task: decision trees, neural networks, large margin classification techniques, etc.
Some of those methods can be easily extended to multiclass problems. However, other powerful and popular classifiers, such as AdaBoost [4] and Support Vector Machines [3], do not extend to the multiclass case easily. In those situations, the usual way to proceed is to reduce the complexity of the multiclass problem into multiple simpler binary classification problems. There are many different approaches for reducing multiclass to binary classification problems. The simplest approach compares each class against all the others. This produces N_c binary problems, where N_c is the number of classes. Other researchers suggested the comparison of all possible pairs of classes [5], resulting in a set of N_c(N_c − 1)/2 binary problems. Dietterich and Bakiri [7] presented a general framework in which the classification is performed according to a set of binary error correcting output codes (ECOC). In this approach, the problem is transformed into n binary classification subproblems, where n is the length of the error correcting output code. The outputs of all classifiers are then traditionally combined using the Hamming distance. The approach of Dietterich and Bakiri was improved by Allwein et al. [6], who introduced an uncertainty value in the ECOC design and explored alternatives for mixing the resulting outputs of the classifiers. In particular, they introduced loss-based decoding as a way of merging the classifiers. Recently, Passerini et al. [2] proposed a new decoding function that combines the margins through an estimate of the class conditional probabilities. ECOC strategies have been proven to be quite competitive with, or better than, other multiclass extensions of SVM and AdaBoost [8], [9]. Although most of the improvements in error correcting output codes have been made in the
decoding process, little attention has been paid to the design of the codes themselves. Crammer and Singer [1] were the first to report improvements in the design of the codes. However, the results were rather pessimistic, since they proved that the problem of finding the optimal discrete codes is computationally intractable, being NP-complete.

Error Correcting Output Codes
Given a set of N_c classes, the basis of the ECOC framework consists of designing a codeword for each of the classes.
1) These codewords encode the membership information of each class for a given binary problem.
2) Arranging the codewords as rows of a matrix, we obtain a coding matrix M_c, where M_c ∈ {−1, 0, +1}^(N_c × n), n being the length of the codewords codifying each class.
3) From the point of view of learning, M_c is constructed by considering n binary problems, each one corresponding to a column of the matrix M_c.
4) Each of these binary problems splits the set of classes into two partitions (coded by +1 or −1 in M_c according to their class set membership, or 0 if the class is not considered by the current binary problem).
5) At the decoding step, applying the n trained binary classifiers, a code is obtained for each data point in the test set.
6) This code is compared to the base codewords of each class defined in the matrix M_c, and the data point is assigned to the class with the closest codeword.

[Figure: coding design for a 4-class problem. White, black, and grey positions correspond to the symbols +1, −1, and 0, respectively.] Once the four binary problems are learnt, at the decoding step a new test sample X is tested by the n classifiers. The resulting codeword x = {x_1, ..., x_n} is then compared with the class codewords {C_1, ..., C_4}, classifying the new sample by the class C_i whose codeword minimizes the decoding measure.
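As a rough sketch of steps 1)-6) above, the following NumPy fragment builds a one-versus-all coding matrix for four classes and decodes a test codeword by Hamming distance. The test codeword here is a hypothetical vector of dichotomizer outputs chosen for illustration, not taken from the paper's figure:

```python
import numpy as np

def one_vs_all_matrix(n_classes):
    # Coding matrix M_c in {-1, +1}^(N_c x N_c): column j separates
    # class j (coded +1) from all other classes (coded -1).
    return 2 * np.eye(n_classes, dtype=int) - 1

def hamming_decode(x, M):
    # HD(x, y_i) = sum_j (1 - sign(x_j * y_i_j)) / 2 for every row y_i of M;
    # the test point is assigned the class whose codeword is closest.
    distances = ((1 - np.sign(x * M)) / 2).sum(axis=1)
    return int(np.argmin(distances))

M = one_vs_all_matrix(4)
# Hypothetical outputs of the 4 trained dichotomizers for one test sample:
x = np.array([-1, -1, 1, -1])
print(hamming_decode(x, M))  # -> 2 (codeword of class 2 matches exactly)
```

In a real pipeline each entry of x would be the prediction of one trained binary classifier; here the matrix construction and decoding step are the point of the sketch.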
Coding Design
The coding design covers the state of the art of coding strategies, mainly divided into two groups: problem-independent approaches, which do not take into account the distribution of the data to define the coding matrix, and problem-dependent designs, where information of the particular domain is used to guide the coding design.

Problem-Independent Coding Designs
One-versus-all (Rifkin and Klautau, 2004): N_c dichotomizers are learnt for N_c classes, where each one splits one class from the rest of the classes.
One-versus-one (Nilsson, 1965): n = N_c(N_c − 1)/2 dichotomizers are learnt for N_c classes, splitting each possible pair of classes.
Dense Random (Allwein et al., 2002): n = 10 log₂ N_c dichotomizers are suggested to be learnt for N_c classes, where P(−1) = 1 − P(+1), P(−1) and P(+1) being the probabilities of the symbols −1 and +1 appearing, respectively. Then, from a set of defined random matrices, the one which maximizes a decoding measure among all possible rows of M_c is selected.
Sparse Random (Escalera et al., 2009): n = 15 log₂ N_c dichotomizers are suggested to be learnt for N_c classes, where P(0) = 1 − P(−1) − P(+1), defining a set of random matrices M_c and selecting the one which maximizes a decoding measure among all possible rows of M_c.

Problem-Dependent Coding Designs
DECOC (Pujol et al., 2006): a problem-dependent design that uses n = N_c − 1 dichotomizers. The partitions of the problem are learnt by means of a binary tree structure using exhaustive search or an SFFS criterion.
Finally, each internal node of the tree is embedded as a column in M_c.
Forest-ECOC (Escalera et al., 2007): a problem-dependent design that uses n = (N_c − 1) · T dichotomizers, where T stands for the number of binary tree structures to be embedded. This approach extends the variability of the classifiers of the DECOC design by including extra dichotomizers.
ECOC-ONE (Pujol et al., 2008): a problem-dependent design that uses n = 2N_c suggested dichotomizers. A validation subset is used to extend any initial matrix M_c and to increase its generalization by including new dichotomizers that focus on classes which are difficult to split.

Decoding Design
The notation refers to that used in (Escalera et al., 2008), x being a test codeword and y_i the codeword from M_c corresponding to class C_i:

Hamming decoding: HD(x, y_i) = Σ_{j=1}^{n} (1 − sign(x^j · y_i^j)) / 2.

Inverse Hamming decoding: IHD(x, y_i) = max(Δ⁻¹ Dᵀ), where Δ(i1, i2) = HD(y_{i1}, y_{i2}) and D is the vector of Hamming decoding values of the test codeword x for each of the base codewords y_i.

Euclidean decoding: ED(x, y_i) = √( Σ_{j=1}^{n} (x^j − y_i^j)² ).

Attenuated Euclidean decoding: AED(x, y_i) = √( Σ_{j=1}^{n} |y_i^j| |x^j| (x^j − y_i^j)² ).

Loss-based decoding: LB(ρ, y_i) = Σ_{j=1}^{n} L(y_i^j · f^j(ρ)), where ρ is a test sample, L is a loss function, and f^j is a real-valued function f^j: Rⁿ → R.

Probabilistic-based decoding: PD(y_i, x) = −log( Π_{j ∈ [1,...,n] : M_c(i,j) ≠ 0} P(x^j = M_c(i,j) | f^j) + K ), where K is a constant factor that collects the probability mass dispersed on the invalid codes, and the probability P(x^j = M_c(i,j) | f^j) is estimated by means of P(x^j = y | f^j) = 1 / (1 + exp(y(υ^j f^j + ω^j))), where the vectors υ and ω are obtained by solving an optimization problem (Passerini et al., 2004).

Laplacian decoding: LAP(x, y_i) = (α_i + 1) / (α_i + β_i + K), where α_i is the number of matched positions between x and y_i, β_i is the number of mismatches without considering the positions coded by 0, and K is an integer value that codifies the number of classes considered by the classifier.
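The distance-based measures above (Hamming, Euclidean, attenuated Euclidean, and Laplacian decoding) can be sketched in NumPy as follows; this is an illustrative sketch assuming codewords take values in {−1, 0, +1}, and the treatment of zero positions in the Laplacian counts is our assumption:

```python
import numpy as np

def hamming_dec(x, y):
    # HD(x, y_i) = sum_j (1 - sign(x_j * y_j)) / 2
    return float(((1 - np.sign(x * y)) / 2).sum())

def euclidean_dec(x, y):
    # ED(x, y_i) = sqrt(sum_j (x_j - y_j)^2)
    return float(np.sqrt(((x - y) ** 2).sum()))

def attenuated_euclidean_dec(x, y):
    # AED weights each term by |y_j| * |x_j|, so positions coded 0
    # (classes ignored by that dichotomizer) contribute nothing.
    return float(np.sqrt((np.abs(y) * np.abs(x) * (x - y) ** 2).sum()))

def laplacian_dec(x, y, K):
    # LAP = (alpha + 1) / (alpha + beta + K). Unlike the measures above,
    # this is a similarity: higher means a better match. We assume matches
    # and mismatches are both counted only over positions not coded 0.
    nonzero = y != 0
    alpha = int((x[nonzero] == y[nonzero]).sum())  # matched positions
    beta = int((x[nonzero] != y[nonzero]).sum())   # mismatched positions
    return (alpha + 1) / (alpha + beta + K)

x = np.array([1, -1, 1, -1])  # hypothetical test codeword
y = np.array([1, 1, 1, -1])   # class codeword from M_c
print(hamming_dec(x, y))      # -> 1.0 (one disagreeing position)
```

For the distance measures the test sample is assigned to the class minimizing the value; for the Laplacian score, to the class maximizing it.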
Pessimistic β-density distribution decoding: the accuracy ν_i satisfies ∫ ψ_i(ν, α_i, β_i) dν = 1/3, where ψ_i(ν, α_i, β_i) = (1/K) ν^{α_i} (1 − ν)^{β_i} is the β-density distribution between a codeword x and the class codeword y_i for class C_i, and ν ∈ [0, 1].

Loss-weighted decoding: LW(ρ, i) = Σ_{j=1}^{n} M_W(i, j) · L(y_i^j · f(ρ, j)), where M_W(i, j) = H(i, j) / Σ_{j=1}^{n} H(i, j), H(i, j) = (1/m_i) Σ_{k=1}^{m_i} φ(h_j(ρ_k^i), i, j), and φ(x^j, i, j) = 1 if x^j = y_i^j and 0 otherwise; m_i is the number of training samples from class C_i, and ρ_k^i is the k-th sample from class C_i.

Outline Of ECOC Algorithm
Training
Load the training data and parameters, i.e., the code length L and the number of training classes K.
1. Create an L-bit code for the K classes using a coding algorithm.
2. For each bit, train the base classifier on the binary class (0 and 1) over the total training data.
Testing
1. Apply each of the L classifiers to the test example.
2. Assign to the test example the class with the largest vote.

What Makes A Good ECOC?
The key problem for the ECOC approach is how to design the coding matrix M. Many studies [10, 11, 12, 13, 14] have shown that the final classifier will have good discriminative ability if the coding matrix M has the following characteristics:
Characteristic 1: Row separation. Each codeword (a row in the coding matrix M) should be well separated, in Hamming distance, from each of the other codewords.
Characteristic 2: Column separation. Each column should be uncorrelated with the others. This means that the binary classifiers of different columns have low correlations among them.
Characteristic 3: Binary classifiers with low error. For recognition of a large number of classes, besides classification accuracy, efficiency is also quite important. To make a quick decision, as few binary classifiers as possible should be evaluated. This requires the codewords to be efficient (i.e., to contain a small number of bits). As explained in [15], for a code to be efficient, different bits should be independent of each other, and each bit should have a 50% chance of being one or zero. In ECOC design, independent bits can be relaxed to uncorrelated columns (i.e., Characteristic 2 above), and a 50% chance of firing for each bit requires:
Characteristic 4: Balanced columns. For each column i, the numbers of +1s and −1s are equal, i.e., Σ_r M(r, i) = 0.
Finding an ECOC satisfying the above characteristics is an NP-hard problem [16]. So we can say that for efficient and accurate recognition of a large number of classes, a good ECOC is expected to have the following characteristics: it is efficient, requiring a small number of bits; it has good diversity, i.e., the coding matrix has good row and column separation; and the resulting binary classifiers are accurate.

What Is So Good About ECOC?
1. Improves classification accuracy.
2. Can be used with many different classifiers.
3. Commonly used in many areas.
4. Not prone to overfitting.
5. Variants are possible to try.

Practical Advantages Of ECOC
1. It is fast, simple and easy to program.
2. It is flexible: it can combine with any learning algorithm.
3.
Able to reduce the bias and variance produced by the learning algorithm, so it is widely used to deal with multi-class categorization problems.
4. Low computational cost.
5. Outperforms the direct multiclass method.
6. Can be used with data that is textual, numeric, discrete, etc.
7. General learning scheme: can be used for various learning tasks.
8. Good generalization.

Disadvantages
1. ECOC is not effective if each individual codeword is not separated from each of the other codewords by a large Hamming distance.
2. ECOC only succeeds if the errors made in the individual bit positions are relatively uncorrelated, so that the number of simultaneous errors in many bit positions is small. If there are many simultaneous errors, the ECOC will not be able to correct them (Peterson & Weldon, 1972).
3. ECOC support vector machines are not always superior to one-against-all fuzzy support vector machines.
4. One-versus-all schemes are more stable than other schemes.
5. Sometimes the decomposition of a multi-class problem into multiple binary problems, as done in ECOC, incurs considerable bias for a centroid classifier, which results in noticeable degradation of its performance.
6. Finding the optimal ECOC is NP-hard.

Comparison Of Some ECOC Methods
The One-Versus-All Strategy. The most well-known binary coding strategy is the one-versus-all strategy [17], where each class is discriminated against the rest of the classes. In Fig. 1a,
the one-versus-all design for a four-class problem is shown. The white regions of the coding matrix M correspond to the positions coded by 1 and the black regions to −1. Thus, the codeword for class C_1 is {1, −1, −1, −1}. Each column of the coding matrix codifies a binary problem learned by its corresponding dichotomizer h_j. For instance, dichotomizer h_1 learns C_1 against classes C_2, C_3, and C_4; dichotomizer h_2 learns C_2 against classes C_1, C_3, and C_4; etc.
The Dense Random Strategy. In the dense random strategy [10], a random matrix M is generated, maximizing the row and column separability in terms of the Hamming distance [7]. An example of a dense random matrix for a four-class problem is shown in Fig. 1c.
One-Versus-One and Random Sparse Strategy. The coding step received special attention when Allwein et al. [10] introduced a third symbol (the zero symbol) into the coding process. This symbol increases the number of partitions of classes to be considered in a ternary framework by allowing some classes to be ignored. The ternary coding matrix then becomes M ∈ {−1, 0, 1}^(N × n). In this case, the symbol zero means that a particular class is not considered by a certain binary classifier. Thanks to this, strategies such as one-versus-one [20] and random sparse coding [10] can be formulated in the ECOC framework. Fig. 1b shows the one-versus-one configuration for a four-class problem; in this case, the gray positions correspond to the zero symbol. A possible sparse random matrix for a four-class problem is shown in Fig. 1d.
Fig. 1. (a) One-versus-all, (b) one-versus-one, (c) dense random, and (d) sparse random designs.

Spectral Error Correcting Output Codes for Efficient Multiclass Recognition.
Algorithm:
Input: the class set C = {c_1, c_2, ..., c_n}
1. Train an SVM classifier f_{ij} for each class pair {c_i, c_j}.
2. Construct the similarity graph G: set each class c_i as a vertex, with edge weights w_{ij}.
3. Compute the normalized Laplacian L_sym of G.
4. Compute the eigenvectors v_1, v_2, ..., v_n of L_sym.
5. Transform each v_i, i ≥ 2, to a partition indicator vector m_i.
6.
Generate an ECOC matrix M_l with code length l: M_l = [m_2, m_3, ..., m_{l+1}].
7. Train binary classifiers {f_i}, i = 1, ..., l, to form the code prediction function f^l(·) = [f_1(·), f_2(·), ..., f_l(·)].
8. Search for the optimal code length l*.
Output: M_{l*} and f^{l*}(·).

TABLE I: EFFICIENCY COMPARISON ON FACE RECOGNITION AND FLOWER CLASSIFICATION DATA

Dataset                          | Method            | Accuracy | Code length
face recognition with k = 18     | one-against-all   | 99.5%    | 300
                                 | Spectral ECOC     | 99.1%    | 30
                                 | Random dense      | 98.9%    | 120
                                 | Random sparse     | 98.8%    | 100
                                 | Class Map         | 97.3%    | 150
                                 | Discriminant ECOC | 69.0%    | 200
flower classification with k = 18| one-against-all   | 54.6%    | 102
                                 | Spectral ECOC     | 54.7%    | 15
                                 | Random dense      | 54.0%    | 83
                                 | Random sparse     | 54.1%    | 44
                                 | Class Map         | 50.2%    | 35
                                 | Discriminant ECOC | 35.2%    | 100

Conclusions
In this paper the different coding and decoding methods for Error Correcting Output Codes have been studied, and the advantages and disadvantages of some coding methods have been discussed. From this study one can conclude that, compared to other methods, better performance can be achieved by using the Error Correcting Output Code approach.

TABLE II: PERFORMANCE PARAMETERS (based on results obtained using Weka on the contact-lenses dataset with 10-fold cross-validation)
Columns: Method Name; Time Taken (seconds); Correctly Classified Instances (%); Incorrectly Classified Instances (%); Kappa Statistic; Mean Absolute Error; Root Mean Squared Error; Relative Absolute Error (%); Root Relative Squared Error (%).
Methods compared: Multiclass Classification using SVM (functions.SMO); ANN (functions.MultilayerPerceptron); Meta Bagging; Meta AdaBoostM1; Nested Dichotomies; Stacking.

References
[1] K. Crammer and Y. Singer, "On the Learnability and Design of Output Codes for Multiclass Problems," Machine Learning, vol. 47, no. 2-3.
[2] A. Passerini, M. Pontil, and P. Frasconi, "New Results on Error Correcting Output Codes of Kernel Machines," IEEE Trans. Neural Networks, vol. 15, no. 1.
[3] V.N. Vapnik, The Nature of Statistical Learning Theory. Springer.
[4] Y. Freund and R.E. Schapire, "A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting," J. Computer and System Sciences, vol. 55, no. 1.
[5] T. Hastie and R. Tibshirani, "Classification by Pairwise Coupling," Annals of Statistics, vol. 26, no. 2.
[6] E.L. Allwein, R.E. Schapire, and Y. Singer, "Reducing Multiclass to Binary: A Unifying Approach for Margin Classifiers," J. Machine Learning Research, vol. 1.
[7] T.G. Dietterich and G. Bakiri, "Solving Multiclass Learning Problems via Error-Correcting Output Codes," J. Artificial Intelligence Research, vol. 2, 1995.
[8] R.E. Schapire, "Using Output Codes to Boost Multiclass Learning Problems," Machine Learning: Proc. 14th Int'l Conf.
[9] C. Hsu and C. Lin, "A Comparison of Methods for Multi-Class Support Vector Machines," IEEE Trans. Neural Networks, vol. 13, no. 2, Mar.
[10] E.L. Allwein and R.E. Schapire, "Reducing Multiclass to Binary: A Unifying Approach for Margin Classifiers," J. Machine Learning Research, vol. 1.
[11] N. García-Pedrajas and C. Fyfe, "Evolving Output Codes for Multiclass Problems," IEEE Trans. Evolutionary Computation, vol. 12, no. 1, pp. 93-106.
[12] R. Ghaderi and T. Windeatt, "Circular ECOC: A Theoretical and Experimental Analysis," in Proc. ICPR.
[13] O. Pujol and P. Radeva, "Discriminant ECOC: A Heuristic Method for Application Dependent Design of Error Correcting Output Codes," IEEE Trans. PAMI, vol. 28, no. 6.
[14] R. Schapire and Y. Singer, "Solving Multiclass Learning Problems via Error-Correcting Output Codes," J. Artificial Intelligence Research, vol. 2.
[15] Y. Weiss, A. Torralba, and R. Fergus, "Spectral Hashing," in Proc. NIPS.
[16] K. Crammer and Y. Singer, "On the Learnability and Design of Output Codes for Multiclass Problems," Machine Learning, vol. 47, no. 2-3.
[17] N.J. Nilsson, Learning Machines. McGraw-Hill.
[18] T. Hastie and R. Tibshirani, "Classification by Pairwise Grouping," Proc. Neural Information Processing Systems Conf., vol. 26, 1998.
More informationAppendix B: Resampling Algorithms
407 Appendx B: Resamplng Algorthms A common problem of all partcle flters s the degeneracy of weghts, whch conssts of the unbounded ncrease of the varance of the mportance weghts ω [ ] of the partcles
More informationComposite Hypotheses testing
Composte ypotheses testng In many hypothess testng problems there are many possble dstrbutons that can occur under each of the hypotheses. The output of the source s a set of parameters (ponts n a parameter
More informationSolving Nonlinear Differential Equations by a Neural Network Method
Solvng Nonlnear Dfferental Equatons by a Neural Network Method Luce P. Aarts and Peter Van der Veer Delft Unversty of Technology, Faculty of Cvlengneerng and Geoscences, Secton of Cvlengneerng Informatcs,
More informationSupport Vector Machines
Support Vector Machnes Konstantn Tretyakov (kt@ut.ee) MTAT.03.227 Machne Learnng So far So far Supervsed machne learnng Lnear models Non-lnear models Unsupervsed machne learnng Generc scaffoldng So far
More informationSupport Vector Machines
Support Vector Machnes Konstantn Tretyakov (kt@ut.ee) MTAT.03.227 Machne Learnng So far Supervsed machne learnng Lnear models Least squares regresson Fsher s dscrmnant, Perceptron, Logstc model Non-lnear
More informationDepartment of Computer Science Artificial Intelligence Research Laboratory. Iowa State University MACHINE LEARNING
MACHINE LEANING Vasant Honavar Bonformatcs and Computatonal Bology rogram Center for Computatonal Intellgence, Learnng, & Dscovery Iowa State Unversty honavar@cs.astate.edu www.cs.astate.edu/~honavar/
More informationFor now, let us focus on a specific model of neurons. These are simplified from reality but can achieve remarkable results.
Neural Networks : Dervaton compled by Alvn Wan from Professor Jtendra Malk s lecture Ths type of computaton s called deep learnng and s the most popular method for many problems, such as computer vson
More informationLecture Notes on Linear Regression
Lecture Notes on Lnear Regresson Feng L fl@sdueducn Shandong Unversty, Chna Lnear Regresson Problem In regresson problem, we am at predct a contnuous target value gven an nput feature vector We assume
More informationLecture 12: Classification
Lecture : Classfcaton g Dscrmnant functons g The optmal Bayes classfer g Quadratc classfers g Eucldean and Mahalanobs metrcs g K Nearest Neghbor Classfers Intellgent Sensor Systems Rcardo Guterrez-Osuna
More informationOPTIMAL COMBINATION OF FOURTH ORDER STATISTICS FOR NON-CIRCULAR SOURCE SEPARATION. Christophe De Luigi and Eric Moreau
OPTIMAL COMBINATION OF FOURTH ORDER STATISTICS FOR NON-CIRCULAR SOURCE SEPARATION Chrstophe De Lug and Erc Moreau Unversty of Toulon LSEET UMR CNRS 607 av. G. Pompdou BP56 F-8362 La Valette du Var Cedex
More informationA Network Intrusion Detection Method Based on Improved K-means Algorithm
Advanced Scence and Technology Letters, pp.429-433 http://dx.do.org/10.14257/astl.2014.53.89 A Network Intruson Detecton Method Based on Improved K-means Algorthm Meng Gao 1,1, Nhong Wang 1, 1 Informaton
More informationCluster Validation Determining Number of Clusters. Umut ORHAN, PhD.
Cluster Analyss Cluster Valdaton Determnng Number of Clusters 1 Cluster Valdaton The procedure of evaluatng the results of a clusterng algorthm s known under the term cluster valdty. How do we evaluate
More informationWe present the algorithm first, then derive it later. Assume access to a dataset {(x i, y i )} n i=1, where x i R d and y i { 1, 1}.
CS 189 Introducton to Machne Learnng Sprng 2018 Note 26 1 Boostng We have seen that n the case of random forests, combnng many mperfect models can produce a snglodel that works very well. Ths s the dea
More informationChapter 8 Indicator Variables
Chapter 8 Indcator Varables In general, e explanatory varables n any regresson analyss are assumed to be quanttatve n nature. For example, e varables lke temperature, dstance, age etc. are quanttatve n
More informationLecture 5 Decoding Binary BCH Codes
Lecture 5 Decodng Bnary BCH Codes In ths class, we wll ntroduce dfferent methods for decodng BCH codes 51 Decodng the [15, 7, 5] 2 -BCH Code Consder the [15, 7, 5] 2 -code C we ntroduced n the last lecture
More informationChapter 5. Solution of System of Linear Equations. Module No. 6. Solution of Inconsistent and Ill Conditioned Systems
Numercal Analyss by Dr. Anta Pal Assstant Professor Department of Mathematcs Natonal Insttute of Technology Durgapur Durgapur-713209 emal: anta.bue@gmal.com 1 . Chapter 5 Soluton of System of Lnear Equatons
More informationDiscretization of Continuous Attributes in Rough Set Theory and Its Application*
Dscretzaton of Contnuous Attrbutes n Rough Set Theory and Its Applcaton* Gexang Zhang 1,2, Lazhao Hu 1, and Wedong Jn 2 1 Natonal EW Laboratory, Chengdu 610036 Schuan, Chna dylan7237@sna.com 2 School of
More informationAn Extended Hybrid Genetic Algorithm for Exploring a Large Search Space
2nd Internatonal Conference on Autonomous Robots and Agents Abstract An Extended Hybrd Genetc Algorthm for Explorng a Large Search Space Hong Zhang and Masum Ishkawa Graduate School of L.S.S.E., Kyushu
More informationIntro to Visual Recognition
CS 2770: Computer Vson Intro to Vsual Recognton Prof. Adrana Kovashka Unversty of Pttsburgh February 13, 2018 Plan for today What s recognton? a.k.a. classfcaton, categorzaton Support vector machnes Separable
More informationClassification. Representing data: Hypothesis (classifier) Lecture 2, September 14, Reading: Eric CMU,
Machne Learnng 10-701/15-781, 781, Fall 2011 Nonparametrc methods Erc Xng Lecture 2, September 14, 2011 Readng: 1 Classfcaton Representng data: Hypothess (classfer) 2 1 Clusterng 3 Supervsed vs. Unsupervsed
More informationProblem Set 9 Solutions
Desgn and Analyss of Algorthms May 4, 2015 Massachusetts Insttute of Technology 6.046J/18.410J Profs. Erk Demane, Srn Devadas, and Nancy Lynch Problem Set 9 Solutons Problem Set 9 Solutons Ths problem
More informationNatural Images, Gaussian Mixtures and Dead Leaves Supplementary Material
Natural Images, Gaussan Mxtures and Dead Leaves Supplementary Materal Danel Zoran Interdscplnary Center for Neural Computaton Hebrew Unversty of Jerusalem Israel http://www.cs.huj.ac.l/ danez Yar Wess
More informationA Robust Method for Calculating the Correlation Coefficient
A Robust Method for Calculatng the Correlaton Coeffcent E.B. Nven and C. V. Deutsch Relatonshps between prmary and secondary data are frequently quantfed usng the correlaton coeffcent; however, the tradtonal
More informationSDMML HT MSc Problem Sheet 4
SDMML HT 06 - MSc Problem Sheet 4. The recever operatng characterstc ROC curve plots the senstvty aganst the specfcty of a bnary classfer as the threshold for dscrmnaton s vared. Let the data space be
More informationLecture 4. Instructor: Haipeng Luo
Lecture 4 Instructor: Hapeng Luo In the followng lectures, we focus on the expert problem and study more adaptve algorthms. Although Hedge s proven to be worst-case optmal, one may wonder how well t would
More informationExcess Error, Approximation Error, and Estimation Error
E0 370 Statstcal Learnng Theory Lecture 10 Sep 15, 011 Excess Error, Approxaton Error, and Estaton Error Lecturer: Shvan Agarwal Scrbe: Shvan Agarwal 1 Introducton So far, we have consdered the fnte saple
More informationSpeeding up Computation of Scalar Multiplication in Elliptic Curve Cryptosystem
H.K. Pathak et. al. / (IJCSE) Internatonal Journal on Computer Scence and Engneerng Speedng up Computaton of Scalar Multplcaton n Ellptc Curve Cryptosystem H. K. Pathak Manju Sangh S.o.S n Computer scence
More informationError Probability for M Signals
Chapter 3 rror Probablty for M Sgnals In ths chapter we dscuss the error probablty n decdng whch of M sgnals was transmtted over an arbtrary channel. We assume the sgnals are represented by a set of orthonormal
More informationImage classification. Given the bag-of-features representations of images from different classes, how do we learn a model for distinguishing i them?
Image classfcaton Gven te bag-of-features representatons of mages from dfferent classes ow do we learn a model for dstngusng tem? Classfers Learn a decson rule assgnng bag-offeatures representatons of
More informationLinear Feature Engineering 11
Lnear Feature Engneerng 11 2 Least-Squares 2.1 Smple least-squares Consder the followng dataset. We have a bunch of nputs x and correspondng outputs y. The partcular values n ths dataset are x y 0.23 0.19
More informationTransient Stability Assessment of Power System Based on Support Vector Machine
ransent Stablty Assessment of Power System Based on Support Vector Machne Shengyong Ye Yongkang Zheng Qngquan Qan School of Electrcal Engneerng, Southwest Jaotong Unversty, Chengdu 610031, P. R. Chna Abstract
More informationSimulated Power of the Discrete Cramér-von Mises Goodness-of-Fit Tests
Smulated of the Cramér-von Mses Goodness-of-Ft Tests Steele, M., Chaselng, J. and 3 Hurst, C. School of Mathematcal and Physcal Scences, James Cook Unversty, Australan School of Envronmental Studes, Grffth
More informationProbability-Theoretic Junction Trees
Probablty-Theoretc Juncton Trees Payam Pakzad, (wth Venkat Anantharam, EECS Dept, U.C. Berkeley EPFL, ALGO/LMA Semnar 2/2/2004 Margnalzaton Problem Gven an arbtrary functon of many varables, fnd (some
More informationDr. Shalabh Department of Mathematics and Statistics Indian Institute of Technology Kanpur
Analyss of Varance and Desgn of Experment-I MODULE VII LECTURE - 3 ANALYSIS OF COVARIANCE Dr Shalabh Department of Mathematcs and Statstcs Indan Insttute of Technology Kanpur Any scentfc experment s performed
More informationA new construction of 3-separable matrices via an improved decoding of Macula s construction
Dscrete Optmzaton 5 008 700 704 Contents lsts avalable at ScenceDrect Dscrete Optmzaton journal homepage: wwwelsevercom/locate/dsopt A new constructon of 3-separable matrces va an mproved decodng of Macula
More informationFundamental loop-current method using virtual voltage sources technique for special cases
Fundamental loop-current method usng vrtual voltage sources technque for specal cases George E. Chatzaraks, 1 Marna D. Tortorel 1 and Anastasos D. Tzolas 1 Electrcal and Electroncs Engneerng Departments,
More informationNon-linear Canonical Correlation Analysis Using a RBF Network
ESANN' proceedngs - European Smposum on Artfcal Neural Networks Bruges (Belgum), 4-6 Aprl, d-sde publ., ISBN -97--, pp. 57-5 Non-lnear Canoncal Correlaton Analss Usng a RBF Network Sukhbnder Kumar, Elane
More informationDesign and Optimization of Fuzzy Controller for Inverse Pendulum System Using Genetic Algorithm
Desgn and Optmzaton of Fuzzy Controller for Inverse Pendulum System Usng Genetc Algorthm H. Mehraban A. Ashoor Unversty of Tehran Unversty of Tehran h.mehraban@ece.ut.ac.r a.ashoor@ece.ut.ac.r Abstract:
More informationThe L(2, 1)-Labeling on -Product of Graphs
Annals of Pure and Appled Mathematcs Vol 0, No, 05, 9-39 ISSN: 79-087X (P, 79-0888(onlne Publshed on 7 Aprl 05 wwwresearchmathscorg Annals of The L(, -Labelng on -Product of Graphs P Pradhan and Kamesh
More informationThe exam is closed book, closed notes except your one-page cheat sheet.
CS 89 Fall 206 Introducton to Machne Learnng Fnal Do not open the exam before you are nstructed to do so The exam s closed book, closed notes except your one-page cheat sheet Usage of electronc devces
More informationNumber of cases Number of factors Number of covariates Number of levels of factor i. Value of the dependent variable for case k
ANOVA Model and Matrx Computatons Notaton The followng notaton s used throughout ths chapter unless otherwse stated: N F CN Y Z j w W Number of cases Number of factors Number of covarates Number of levels
More informationHongyi Miao, College of Science, Nanjing Forestry University, Nanjing ,China. (Received 20 June 2013, accepted 11 March 2014) I)ϕ (k)
ISSN 1749-3889 (prnt), 1749-3897 (onlne) Internatonal Journal of Nonlnear Scence Vol.17(2014) No.2,pp.188-192 Modfed Block Jacob-Davdson Method for Solvng Large Sparse Egenproblems Hongy Mao, College of
More informationRelevance Vector Machines Explained
October 19, 2010 Relevance Vector Machnes Explaned Trstan Fletcher www.cs.ucl.ac.uk/staff/t.fletcher/ Introducton Ths document has been wrtten n an attempt to make Tppng s [1] Relevance Vector Machnes
More informationKristin P. Bennett. Rensselaer Polytechnic Institute
Support Vector Machnes and Other Kernel Methods Krstn P. Bennett Mathematcal Scences Department Rensselaer Polytechnc Insttute Support Vector Machnes (SVM) A methodology for nference based on Statstcal
More informationRELIABILITY ASSESSMENT
CHAPTER Rsk Analyss n Engneerng and Economcs RELIABILITY ASSESSMENT A. J. Clark School of Engneerng Department of Cvl and Envronmental Engneerng 4a CHAPMAN HALL/CRC Rsk Analyss for Engneerng Department
More informationA Multimodal Fusion Algorithm Based on FRR and FAR Using SVM
Internatonal Journal of Securty and Its Applcatons A Multmodal Fuson Algorthm Based on FRR and FAR Usng SVM Yong L 1, Meme Sh 2, En Zhu 3, Janpng Yn 3, Janmn Zhao 4 1 Department of Informaton Engneerng,
More information