Keyword Reduction for Text Categorization using Neighborhood Rough Sets
IJCSI International Journal of Computer Science Issues, Volume 12, Issue 1, January 2015

Keyword Reduction for Text Categorization using Neighborhood Rough Sets

Si-Yuan Jing
Sichuan Province University Key Laboratory of Internet Natural Language Intelligent Processing, Leshan Normal University, Leshan, China

Abstract
Keyword reduction is a technique that removes less important keywords from the original dataset. Its aim is to decrease the training time of a learning machine and to improve the performance of text categorization. Some researchers have applied rough sets, a popular computational intelligence tool, to keyword reduction. However, the classical rough set model that is usually adopted can only deal with nominal values. In this work, we apply neighborhood rough sets to the keyword reduction problem. A heuristic algorithm is proposed and compared with classical methods such as Information Gain, Mutual Information, and the CHI-square statistic. The experimental results show that the proposed method can outperform the other methods.

Keywords: Text Categorization; Keyword Reduction; Neighborhood Rough Sets; Heuristic Algorithm.

1. Introduction

Automatic text categorization is a perennially hot issue in pattern classification and web data mining. There are several key steps in automatic text categorization, including text pre-processing, keyword reduction (also called feature selection), and learning machine training. Keyword reduction removes less important keywords from the original keyword set, thus improving the result of text categorization as well as the training time. Statistical methods, such as the $\chi^2$ statistic, information gain, and mutual information [1], are the most commonly used for keyword reduction. Some researchers use other techniques that can also perform well in keyword reduction, such as rough sets [2, 13]. Rough set theory is a powerful computational intelligence tool proposed by Z. Pawlak in 1982 [3]. It has been proven efficient at handling imprecise, uncertain and vague knowledge.
Attribute reduction is one of the most important applications of rough sets. In the last decade, some researchers applied rough sets to the keyword reduction problem and obtained promising results. For example, A. Chouchoulas and Q. Shen first introduced a common method for applying rough sets to keyword reduction in text categorization [4]. Miao et al. proposed a hybrid algorithm for text categorization which combined traditional classifiers with variable precision rough sets [2]. Y. Li et al. presented a novel rough set-based case-based reasoner for use in text categorization [5]. However, all of this research adopts the classical rough set model (i.e. the Pawlak rough set model), which can only deal with nominal values. Keyword counts are continuous, so the data must be discretized before computation. This loses some important information and impacts the final result. To address this problem, we apply neighborhood rough sets to the keyword reduction problem in this paper. A new heuristic algorithm for keyword reduction is proposed. We use two famous classifiers, i.e. kNN and SVM, to test the performance of the new algorithm. The experimental datasets are Reuters-21578 and 20-newsgroups. Experimental results show that the proposed method achieves good performance and outperforms some well-known statistical methods.

The rest of the paper is organized as follows: Section 2 recalls some basic concepts of neighborhood rough sets as well as keyword reduction techniques. Section 3 introduces a new heuristic algorithm based on neighborhood rough sets for keyword reduction. Section 4 gives the experimental results and discussion. Finally, Section 5 concludes.

2. Preliminaries of Neighborhood Rough Sets

Neighborhood theory was proposed by Lin et al. in 1990 [9]. It has become an important tool for many artificial intelligence tasks, such as machine learning, pattern classification, data mining, natural language understanding and information retrieval [8].
Neighborhood rough sets is a new model which combines neighborhood theory and rough set theory, and it can deal with numeric or hybrid values. In this section, we briefly recall some basic concepts and theory of neighborhood rough sets from [6, 10].
Definition 2.1: Let $U$ be a non-empty finite set of objects and $\Delta: U \times U \to \mathbb{R}$ a given function. We say $NAS = \langle U, \Delta \rangle$ is a neighborhood approximation space, where:

1) $\Delta(x_1, x_2) \ge 0$, and $\Delta(x_1, x_2) = 0$ if and only if $x_1 = x_2$, $\forall x_1, x_2 \in U$;
2) $\Delta(x_1, x_2) = \Delta(x_2, x_1)$, $\forall x_1, x_2 \in U$;
3) $\Delta(x_1, x_3) \le \Delta(x_1, x_2) + \Delta(x_2, x_3)$, $\forall x_1, x_2, x_3 \in U$.

We say $\Delta$ is the distance function of this neighborhood space. Many useful distance functions are available, such as the Euclidean distance, the Minkowski distance, etc. To construct a neighborhood rough set model for universe granulation on numerical attributes, Hu et al. proposed the $\delta$-neighborhood [7].

Definition 2.2: Given a neighborhood space $\langle U, \Delta \rangle$, $x_i \in U$ and $\delta \ge 0$, the set

$\delta(x_i) = \{ y \mid \Delta(x_i, y) \le \delta, y \in U \}$   (2.1)

is called the neighborhood of $x_i$, whose centre is $x_i$ and radius is $\delta$. It is easy to see that a set of neighborhoods induces a family of information granules, which can be used to approximate any concept in the universe.

Definition 2.3: A neighborhood information system is a triple $NIS = \langle U, A, \Delta \rangle$, where $U$ is a non-empty finite set of objects, $A$ is a non-empty finite set of attributes, and $\Delta$ is a distance function; $A$ and $\Delta$ form a family of neighborhood relations on $U$.

We give an example to illustrate how to compute neighborhoods in a given information system. Table 1 shows an information table with six objects, each described by two attributes. We use an improved Euclidean distance proposed in [10] as the distance function; the resulting distances are shown in Table 2. Setting the threshold $\delta$ to 0.3, the neighborhood of each object can be read off from Table 2.

Table 2. The results of distance computation (a symmetric matrix over $x_1, \ldots, x_6$)

Based on the above definitions, we can construct the lower approximation and upper approximation in neighborhood rough sets.

Definition 2.4: Given a neighborhood approximation space $NAS = \langle U, \Delta \rangle$ and $X \subseteq U$, the lower approximation and upper approximation of $X$ are defined as:

$\underline{N}(X) = \{ x_i \mid \delta(x_i) \subseteq X, x_i \in U \}$
$\overline{N}(X) = \{ x_i \mid \delta(x_i) \cap X \ne \emptyset, x_i \in U \}$   (2.2)

and $\forall X \subseteq U$, $\underline{N}(X) \subseteq X \subseteq \overline{N}(X)$.
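Definitions 2.2 and 2.4 can be sketched in a few lines of code. This is a minimal illustration with plain Euclidean distance and hypothetical data (the paper uses an improved distance from [10], and the values of Table 1 are not reproduced here):

```python
import numpy as np

def neighborhood(X, i, delta):
    """delta-neighborhood of object i: all objects within distance delta (Def. 2.2)."""
    return set(np.flatnonzero(np.linalg.norm(X - X[i], axis=1) <= delta))

def approximations(X, target, delta):
    """Lower/upper approximations of a concept given as a set of object indices (Def. 2.4)."""
    lower, upper = set(), set()
    for i in range(len(X)):
        nb = neighborhood(X, i, delta)
        if nb <= target:          # neighborhood fully inside the concept
            lower.add(i)
        if nb & target:           # neighborhood intersects the concept
            upper.add(i)
    return lower, upper

# Six hypothetical objects with two numeric attributes.
X = np.array([[0.1, 0.2], [0.15, 0.25], [0.5, 0.6],
              [0.55, 0.58], [0.9, 0.1], [0.92, 0.12]])
lower, upper = approximations(X, target={0, 1, 2}, delta=0.3)
```

As stated after formula (2.2), the computed sets satisfy $\underline{N}(X) \subseteq X \subseteq \overline{N}(X)$.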
Moreover, we can define the boundary region, positive region and negative region as follows:

$BN(X) = \overline{N}(X) - \underline{N}(X)$   (2.3)
$POS(X) = \underline{N}(X)$   (2.4)
$NEG(X) = U - \overline{N}(X)$   (2.5)

The positive region $POS(X)$ contains the granules fully contained in $X$; $NEG(X)$ contains the granules not contained in $X$; $BN(X)$ contains the granules only partially contained in $X$. If $\underline{N}(X) = \overline{N}(X)$, we say $X$ is definable in this neighborhood approximation space; otherwise it is indefinable, namely rough.

In practical environments, some data are noisy, and the model defined above is crisp, with no tolerance for noise. Therefore, we adopt variable precision rough sets to improve formula 2.2:

$\underline{N}_\beta(X) = \{ x_i \mid \mathrm{card}(\delta(x_i) \cap X) / \mathrm{card}(\delta(x_i)) \ge \beta, x_i \in U \}$
$\overline{N}_\beta(X) = \{ x_i \mid \mathrm{card}(\delta(x_i) \cap X) / \mathrm{card}(\delta(x_i)) \ge 1 - \beta, x_i \in U \}$   (2.6)

The function $\mathrm{card}(\cdot)$ counts the number of objects in a set. The parameter $\beta$ is usually set between 0.5 and 1.

Table 1. An illustrative example for neighborhood computation (six objects $x_1, \ldots, x_6$, each with two numeric attributes)
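The variable-precision approximations of formula (2.6) can be sketched as follows. This is a minimal illustration assuming Euclidean distance, hypothetical data, and a hand-picked $\beta$:

```python
import numpy as np

def vp_approximations(X, target, delta, beta):
    """Variable-precision lower/upper approximations (formula 2.6):
    an object enters the lower approximation when at least a fraction beta
    of its delta-neighborhood lies inside the target concept."""
    lower, upper = set(), set()
    for i in range(len(X)):
        nb = np.flatnonzero(np.linalg.norm(X - X[i], axis=1) <= delta)
        inclusion = np.mean([j in target for j in nb])  # card(nb ∩ X) / card(nb)
        if inclusion >= beta:
            lower.add(i)
        if inclusion >= 1 - beta:
            upper.add(i)
    return lower, upper

# A noisy object (index 2) sits inside the cluster formed by the concept {0, 1, 3}.
X = np.array([[0.0, 0.0], [0.1, 0.0], [0.05, 0.05], [0.0, 0.1], [1.0, 1.0]])
lower, upper = vp_approximations(X, target={0, 1, 3}, delta=0.2, beta=0.7)
```

With $\beta = 1$ the model degenerates to the crisp case, whose lower approximation here is empty because the noisy object contaminates every neighborhood in the cluster; with $\beta = 0.7$ the cluster survives into the lower approximation.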
Data in many pattern classification problems, including the keyword reduction problem, can be represented by a decision table. Therefore, we introduce some concepts and a neighborhood rough set model for decision tables.

Definition 2.5: Given a neighborhood information system $NIS = \langle U, A, \Delta \rangle$, we say it is a neighborhood decision table $NDT = \langle U, C \cup D, \Delta \rangle$ if $A = C \cup D$, where $C$ is the condition attribute set, $D$ is the decision attribute, and $C$ and $\Delta$ form a family of neighborhood relations on $U$.

Definition 2.6: Given a neighborhood decision table $NDT = \langle U, C \cup D, \Delta \rangle$ and $B \subseteq C$, the decision attribute $D$ divides the universe $U$ into equivalence classes $X_1, X_2, \ldots, X_n$. The lower approximation and upper approximation of $D$ relative to $B$ are defined as:

$\underline{N}_B(D) = \underline{N}_B(X_1) \cup \underline{N}_B(X_2) \cup \ldots \cup \underline{N}_B(X_n)$
$\overline{N}_B(D) = \overline{N}_B(X_1) \cup \overline{N}_B(X_2) \cup \ldots \cup \overline{N}_B(X_n)$   (2.7)

Obviously, $POS_B(D) = \underline{N}_B(D)$, $NEG_B(D) = U - \overline{N}_B(D)$, and $BN_B(D) = \overline{N}_B(D) - \underline{N}_B(D)$.

3. Algorithm

Generally speaking, there are two steps in a text categorization task: text preprocessing and classifier building. Text preprocessing has five sub-steps: word extraction, stop word removal, word stemming, keyword reduction and keyword weighting. The first three sub-steps extract words from the raw text, delete useless words that appear in a given stop word list, and merge words sharing the same stem. After that, we commonly use the VSM (Vector Space Model) to represent the texts. The last two sub-steps, keyword reduction and keyword weighting, remove keywords that contribute little to classification and assign each remaining keyword a new value that emphasizes its importance. Classifier building in text categorization is the same as in other machine learning tasks, so its explanation is omitted here.

Keyword reduction is a key step in text categorization. Its aim is to remove the less important keywords. A good keyword reduction algorithm keeps the most important keywords, which greatly improves the classifier's performance and decreases the training time.
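The selection scheme formalized next (dependency, formula 3.1; significance, formula 3.2) can be sketched in advance as a greedy loop: repeatedly add the keyword whose inclusion most enlarges the positive region of the decision. This is a minimal illustration assuming plain Euclidean distance over attribute subsets, a hypothetical toy dataset, and a start from the empty set rather than the relative core used in the paper:

```python
import numpy as np

def dependency(X, y, attrs, delta):
    """gamma_B(D): fraction of objects whose delta-neighborhood over the
    attribute subset 'attrs' is pure in the decision class (formula 3.1)."""
    if not attrs:
        return 0.0
    Xb = X[:, attrs]
    pure = 0
    for i in range(len(Xb)):
        nb = np.flatnonzero(np.linalg.norm(Xb - Xb[i], axis=1) <= delta)
        pure += np.all(y[nb] == y[i])  # neighborhood lies in one decision class
    return pure / len(Xb)

def greedy_select(X, y, n_keep, delta=0.2):
    """Greedy forward selection by significance sig(a, B, D) = gamma_{B+a} - gamma_B."""
    selected, remaining = [], list(range(X.shape[1]))
    while len(selected) < n_keep and remaining:
        base = dependency(X, y, selected, delta)
        gain, best = max((dependency(X, y, selected + [a], delta) - base, a)
                         for a in remaining)
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy data: attribute 0 separates the classes, attribute 1 is uninformative.
X = np.array([[0.0, 0.5], [0.1, 0.5], [0.9, 0.5], [1.0, 0.5]])
y = np.array([0, 0, 1, 1])
```

Here `greedy_select(X, y, 1)` picks attribute 0, whose neighborhoods are class-pure, over attribute 1, whose constant value collapses every neighborhood into the whole universe.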
In this section, we first give some important concepts and theorems to measure the importance degree of keywords based on neighborhood rough sets, and then propose a heuristic algorithm for keyword reduction in text categorization.

To apply neighborhood rough sets to text categorization, we must represent the problem by a neighborhood decision table $NDT = \langle U, C \cup D, \Delta \rangle$, where $U$ is the set of samples (i.e. texts), $C$ is the set of keywords, $D$ is the document label, and $\Delta$ is a pre-defined distance function. Then, for $B \subseteq C$, we can define the dependency of $D$ on $B$ as:

$\gamma_B(D) = \mathrm{card}(POS_B(D)) / \mathrm{card}(U)$   (3.1)

Obviously, $0 \le \gamma_B(D) \le 1$. The dependency $\gamma_B(D)$ reflects the degree to which $D$ depends on $B$. This definition is in accordance with classical rough set theory.

Theorem 3.1: Given a neighborhood information system $NIS = \langle U, A, \Delta \rangle$ and a threshold $\delta$, if $B_1 \subseteq B_2 \subseteq A$, then $\forall x \in U$, $\delta_{B_2}(x) \subseteq \delta_{B_1}(x)$.
Proof: Obvious; the proof is omitted here.

Theorem 3.2: Given a neighborhood decision table $NDT = \langle U, C \cup D, \Delta \rangle$, if $B_1 \subseteq B_2 \subseteq C$, then $POS_{B_1}(D) \subseteq POS_{B_2}(D)$.
Proof: Without loss of generality, let $x \in POS_{B_1}(D)$, i.e. $\delta_{B_1}(x) \subseteq D_j$, where $D_j$ is the $j$-th equivalence class induced by the decision $D$. According to Theorem 3.1, $\delta_{B_2}(x) \subseteq \delta_{B_1}(x) \subseteq D_j$, hence $x \in POS_{B_2}(D)$. This completes the proof.

Theorem 3.3: The dependency function is monotone; that is, if $B_1 \subseteq B_2 \subseteq C$, then $\gamma_{B_1}(D) \le \gamma_{B_2}(D) \le \gamma_C(D)$.
Proof: According to Theorem 3.2, if $B_1 \subseteq B_2 \subseteq C$, then $POS_{B_1}(D) \subseteq POS_{B_2}(D) \subseteq POS_C(D)$. By formula 3.1, $\gamma_{B_1}(D) \le \gamma_{B_2}(D) \le \gamma_C(D)$. This completes the proof.

Theorem 3.3 shows an important property of the dependency function. Based on it, we can define the significance of a keyword $a \in C - B$ relative to the decision $D$ (i.e. the labels) as:

$sig(a, B, D) = \gamma_{B \cup \{a\}}(D) - \gamma_B(D)$   (3.2)

Based on this significance metric, we can design a heuristic keyword reduction algorithm. The algorithm begins with a
relative core set found by an algorithm proposed in [11]. In each iteration, it computes the significance of all remaining keywords, selects the keyword with the highest significance, and adds it to the final keyword set. The algorithm ends when the pre-defined keyword number is reached. The pseudo-code is shown below.

NRS-KR Algorithm
Input: $NDT = \langle U, C \cup D, \Delta \rangle$, a predefined threshold $\delta$ and keyword number $N$
Output: a keyword reduct, denoted $red$
1.  $red \leftarrow core$;
2.  for $k = 1 : N - |core|$ do
3.      $m \leftarrow 0$, $candidate \leftarrow \mathrm{null}$;
4.      for each $a \in C - red$ do
5.          if $sig(a, red, D) > m$ then
6.              $candidate \leftarrow a$; $m \leftarrow sig(a, red, D)$;
7.          end if
8.      end for
9.      $red \leftarrow red \cup \{candidate\}$;
10. end for

4. Experiments

In this section, we perform experiments to evaluate the proposed algorithm. First, we introduce the experimental datasets, the methods selected for comparison, and the performance metrics. Then we explain how the experiments were performed and present the results.

4.1 Selected Datasets

We select two datasets to evaluate the proposed algorithm. The first is Reuters-21578, a standard text categorization benchmark collected by the Carnegie Group from the Reuters newswire. We choose only the ten most populous categories from this corpus: earn, acq, coffee, sugar, trade, ship, crude, interest, money-fx and gold. The second is 20-newsgroups, which contains approximately 20,000 newsgroup texts divided nearly evenly among 20 different newsgroups.

4.2 Methods for Comparison

We choose four methods to compare with the proposed method. The first is a classical rough set based keyword reduction algorithm; the other three are the $\chi^2$ statistic (CHI), Information Gain (IG) and Mutual Information (MI). The $\chi^2$ statistic measures the lack of independence between a keyword and a label, and can be compared to the $\chi^2$ distribution with one degree of freedom to judge extremeness. It is defined as formula 4.1, where $t$ means term (i.e.
keyword) and $c$ means category (i.e. label), $A$ is the number of texts in which $t$ and $c$ occur together, $B$ is the number of texts in which $t$ occurs without $c$, $C$ is the number of texts in which $c$ occurs without $t$, and $D$ is the number of texts in which neither $t$ nor $c$ occurs. $N$ is the total number of texts in the dataset.

$\chi^2(t, c) = \dfrac{N (AD - CB)^2}{(A + C)(B + D)(A + B)(C + D)}$   (4.1)

Information gain measures the number of bits of information obtained for category prediction by knowing the presence or absence of a term in a text. The information gain of a term $t$ is defined as follows:

$IG(t) = -\sum_{i=1}^{m} P(c_i) \log P(c_i) + P(t) \sum_{i=1}^{m} P(c_i \mid t) \log P(c_i \mid t) + P(\bar{t}) \sum_{i=1}^{m} P(c_i \mid \bar{t}) \log P(c_i \mid \bar{t})$   (4.2)

Mutual information is a criterion commonly used in statistical modeling. The basic idea behind this metric is that the larger the mutual information, the higher the co-occurrence probability between term $t$ and category $c$. With $N$, $A$, $B$, $C$ as in formula 4.1, MI can be defined as follows:

$MI(t, c) = \log \dfrac{P(t, c)}{P(t) P(c)} \approx \log \dfrac{A \cdot N}{(A + C)(A + B)}$   (4.3)

4.3 Performance Metrics

For different purposes, several performance metrics can be chosen in text categorization evaluation, including recall, precision, F-measure, and the micro-average and macro-average of the F-measure. In the experiments, we choose the micro-average of the F1-measure (abbreviated F1-Micro) and the macro-average of the F1-measure (abbreviated F1-Macro) as the performance metrics. Since these two metrics are built on other basic metrics, we briefly explain them here. First, we give the definitions of recall and precision:
$recall = \dfrac{a}{a + c} \times 100\%$   (4.4)

$precision = \dfrac{a}{a + b} \times 100\%$   (4.5)

where $a$ is the number of texts correctly classified to the class, $b$ is the number of texts incorrectly classified to the class, and $c$ is the number of texts incorrectly rejected from the class. Based on these two definitions, we further define the F1-measure as formula 4.6:

$F1 = \dfrac{2 \cdot precision \cdot recall}{precision + recall}$   (4.6)

Finally, F1-Macro and F1-Micro are defined as follows:

$F1\text{-}Macro = \dfrac{1}{N} \sum_{i=1}^{N} F1_i$   (4.7)

$F1\text{-}Micro = \dfrac{2 \cdot precision_{micro} \cdot recall_{micro}}{precision_{micro} + recall_{micro}}$   (4.8)

where $precision_{micro}$ and $recall_{micro}$ are computed from the pooled counts of $a$, $b$ and $c$ over all classes, and $N$ in formula 4.7 is the number of classes.

4.4 Experimental Results

The experiments are performed in Matlab R2012b. We choose two popular classifiers for evaluation: SVM and kNN. SVM is implemented with libsvm, a widely used open-source library for SVM [12]; a linear kernel is adopted. For category ranking in kNN, we select the 10 nearest neighbors of each text and adopt the majority category among them; cosine similarity is used for neighbor selection. Before keyword reduction, we preprocess the raw datasets. First, we extract words from the raw texts; these are not used to construct the VSM immediately. We remove useless words based on a stop word list obtained from the Internet, which contains 887 stop words. After that, we use the Porter Stemming Algorithm to merge words that share the same stem. For the proposed algorithm, the threshold $\delta$ is set to 0.2 and the Euclidean distance is adopted as the distance function.

We compare the proposed algorithm (NRS) with the four other methods, i.e. classical rough sets (RS), the $\chi^2$ statistic (CHI), information gain (IG) and mutual information (MI). The results are shown in figures 1 to 4. In each experiment, we choose a dataset, apply the different methods to select a given number of keywords, and evaluate with a specific classifier, varying the keyword number from 1000 upward. From figures 1 to 4, we can easily see that the proposed algorithm performs very well and selects much better keywords than the other four methods.
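The evaluation metrics of formulas 4.4 to 4.8 can be sketched as follows, with per-class counts $(a, b, c)$ in the paper's notation (correctly classified, incorrectly classified, incorrectly rejected) and hypothetical example counts:

```python
def f1(p, r):
    """F1-measure (formula 4.6)."""
    return 2 * p * r / (p + r) if p + r else 0.0

def f1_macro_micro(counts):
    """counts: per-class (a, b, c) triples. Returns (F1-Macro, F1-Micro).
    Macro averages the per-class F1 scores (formula 4.7); micro pools the
    counts over all classes before computing precision/recall (formula 4.8)."""
    per_class, A, B, C = [], 0, 0, 0
    for a, b, c in counts:
        p = a / (a + b) if a + b else 0.0   # precision (4.5)
        r = a / (a + c) if a + c else 0.0   # recall (4.4)
        per_class.append(f1(p, r))
        A, B, C = A + a, B + b, C + c
    macro = sum(per_class) / len(per_class)
    micro = f1(A / (A + B) if A + B else 0.0,
               A / (A + C) if A + C else 0.0)
    return macro, micro

# Two hypothetical classes: one perfectly classified, one half wrong.
macro, micro = f1_macro_micro([(10, 0, 0), (5, 5, 5)])
```

Macro-averaging weights every class equally, so rare categories influence it strongly; micro-averaging weights every text equally, so it is dominated by the populous categories.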
Furthermore, we list the average results of each algorithm in Tables 3 and 4. From these tables, we can see that the proposed algorithm also outperforms the other methods in this respect.

Table 3: Performance of the different algorithms (NRS, RS, CHI, IG, MI) on Reuters-21578: F1-Micro and F1-Macro (%) under the SVM and kNN classifiers

Table 4: Performance of the different algorithms (NRS, RS, CHI, IG, MI) on 20-newsgroups: F1-Micro and F1-Macro (%) under the SVM and kNN classifiers
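The $\chi^2$ and MI baselines compared in Tables 3 and 4 reduce to the contingency counts of formula 4.1; a minimal sketch with hypothetical counts:

```python
import math

def chi_square(A, B, C, D):
    """chi^2(t, c) per formula 4.1. A: texts with both t and c; B: t without c;
    C: c without t; D: neither."""
    N = A + B + C + D
    denom = (A + C) * (B + D) * (A + B) * (C + D)
    return N * (A * D - C * B) ** 2 / denom if denom else 0.0

def mutual_information(A, B, C, N):
    """MI(t, c) per formula 4.3, estimated as log(A*N / ((A+C)(A+B)))."""
    return math.log(A * N / ((A + C) * (A + B)))

# When the contingency table is perfectly balanced, term and category are
# independent: chi-square is 0 and MI is 0. Skewing the counts toward
# co-occurrence raises both scores.
independent = chi_square(10, 10, 10, 10)
associated = chi_square(20, 5, 5, 20)
```

Both scores are computed per term-category pair; for keyword ranking, the per-category scores of a term are typically combined (e.g. by maximum or by a category-probability-weighted average) before thresholding.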
Fig. 1 Comparison of the different algorithms on Reuters-21578 using the SVM classifier: (a) F1-Micro; (b) F1-Macro

Fig. 2 Comparison of the different algorithms on Reuters-21578 using the kNN classifier: (a) F1-Micro; (b) F1-Macro

Fig. 3 Comparison of the different algorithms on 20-newsgroups using the SVM classifier: (a) F1-Micro; (b) F1-Macro

Fig. 4 Comparison of the different algorithms on 20-newsgroups using the kNN classifier: (a) F1-Micro; (b) F1-Macro
5. Conclusions

In this paper, we propose a novel approach to keyword reduction in text categorization using neighborhood rough sets. Neighborhood rough sets is a model that combines classical rough sets and neighborhood theory, and it efficiently handles numeric and even hybrid values. We apply some important concepts and theorems of neighborhood rough sets to the keyword reduction problem and design a heuristic algorithm. The experimental results show that the proposed algorithm performs well on the keyword reduction problem and outperforms some classical methods on Reuters-21578 and 20-newsgroups.

Acknowledgments

This work is supported by the Scientific Research Fund of the Sichuan Provincial Education Department (Grant No. 14ZB047), the Scientific Research Fund of Leshan Normal University (Grant No. Z135), and a Sichuan Provincial Department of Science and Technology Project (No. 2014JY0036).

References
[1] Y. Yang, "A comparative study on feature selection in text categorization", in Proceedings of the 14th International Conference on Machine Learning (ICML 97), 1997.
[2] D. Q. Miao, Q. G. Duan, H. Y. Zhang et al., "Rough set based hybrid algorithm for text classification", Expert Systems with Applications, Vol. 36, 2009.
[3] Z. Pawlak, "Rough sets", International Journal of Computer and Information Sciences, Vol. 11, 1982.
[4] A. Chouchoulas, Q. Shen, "Rough Set-Aided Keyword Reduction for Text Categorization", Applied Artificial Intelligence, Vol. 15, 2001.
[5] Y. Li, S. C. K. Shiu, S. K. Pal et al., "A rough set-based case-based reasoner for text categorization", International Journal of Approximate Reasoning, Vol. 41, 2006.
[6] Q. H. Hu, D. R. Yu, Z. X. Xie, "Numerical attribute reduction based on neighborhood granulation and rough approximation", Chinese Journal of Software, Vol. 19, 2008.
[7] Q. H. Hu, D. R. Yu, J. F.
Liu et al., "Neighborhood rough set based heterogeneous feature subset selection", Information Sciences, Vol. 178, 2008.
[8] H. Wang, "Nearest neighbors by neighborhood counting", IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 28, 2006.
[9] T. Y. Lin, Q. Liu, K. J. Huang, "Rough sets, neighborhood systems and approximation", in the 5th International Symposium on Methodologies of Intelligent Systems, 1990.
[10] S. Y. Jing, K. She, S. Ali, "A universal neighbourhood rough sets model for knowledge discovering from incomplete heterogeneous data", Expert Systems, Vol. 30, 2013.
[11] T. F. Zhang, J. M. Xiao, X. H. Wang, "Algorithms of attribute relative reduction in rough set theory", Chinese Journal of Electronics, Vol. 33, 2005.
[12] C. C. Chang, C. J. Lin, "LIBSVM: a library for support vector machines", ACM Transactions on Intelligent Systems and Technology, Vol. 2, 2011.
[13] S. Ali, S. Y. Jing, K. She, "Profit-aware DVFS Enabled Resource Management of IaaS Cloud", International Journal of Computer Science Issues, Vol. 10, 2013.

Si-Yuan Jing received the Ph.D. degree from the University of Electronic Science and Technology of China, Chengdu, China, in 2013. He is currently an associate professor with Leshan Normal University. He is a member of ACM and CCF. His research interests include computational intelligence, big data mining, etc.
More informationConvexity preserving interpolation by splines of arbitrary degree
Computer Scence Journal of Moldova, vol.18, no.1(52), 2010 Convexty preservng nterpolaton by splnes of arbtrary degree Igor Verlan Abstract In the present paper an algorthm of C 2 nterpolaton of dscrete
More informationTurbulence classification of load data by the frequency and severity of wind gusts. Oscar Moñux, DEWI GmbH Kevin Bleibler, DEWI GmbH
Turbulence classfcaton of load data by the frequency and severty of wnd gusts Introducton Oscar Moñux, DEWI GmbH Kevn Blebler, DEWI GmbH Durng the wnd turbne developng process, one of the most mportant
More informationFoundations of Arithmetic
Foundatons of Arthmetc Notaton We shall denote the sum and product of numbers n the usual notaton as a 2 + a 2 + a 3 + + a = a, a 1 a 2 a 3 a = a The notaton a b means a dvdes b,.e. ac = b where c s an
More informationOrientation Model of Elite Education and Mass Education
Proceedngs of the 8th Internatonal Conference on Innovaton & Management 723 Orentaton Model of Elte Educaton and Mass Educaton Ye Peng Huanggang Normal Unversty, Huanggang, P.R.Chna, 438 (E-mal: yepeng@hgnc.edu.cn)
More informationDe-noising Method Based on Kernel Adaptive Filtering for Telemetry Vibration Signal of the Vehicle Test Kejun ZENG
6th Internatonal Conference on Mechatroncs, Materals, Botechnology and Envronment (ICMMBE 6) De-nosng Method Based on Kernel Adaptve Flterng for elemetry Vbraton Sgnal of the Vehcle est Kejun ZEG PLA 955
More informationFor now, let us focus on a specific model of neurons. These are simplified from reality but can achieve remarkable results.
Neural Networks : Dervaton compled by Alvn Wan from Professor Jtendra Malk s lecture Ths type of computaton s called deep learnng and s the most popular method for many problems, such as computer vson
More informationHandling Uncertain Spatial Data: Comparisons between Indexing Structures. Bir Bhanu, Rui Li, Chinya Ravishankar and Jinfeng Ni
Handlng Uncertan Spatal Data: Comparsons between Indexng Structures Br Bhanu, Ru L, Chnya Ravshankar and Jnfeng N Abstract Managng and manpulatng uncertanty n spatal databases are mportant problems for
More informationRBF Neural Network Model Training by Unscented Kalman Filter and Its Application in Mechanical Fault Diagnosis
Appled Mechancs and Materals Submtted: 24-6-2 ISSN: 662-7482, Vols. 62-65, pp 2383-2386 Accepted: 24-6- do:.428/www.scentfc.net/amm.62-65.2383 Onlne: 24-8- 24 rans ech Publcatons, Swtzerland RBF Neural
More informationSupporting Information
Supportng Informaton The neural network f n Eq. 1 s gven by: f x l = ReLU W atom x l + b atom, 2 where ReLU s the element-wse rectfed lnear unt, 21.e., ReLUx = max0, x, W atom R d d s the weght matrx to
More informationSDMML HT MSc Problem Sheet 4
SDMML HT 06 - MSc Problem Sheet 4. The recever operatng characterstc ROC curve plots the senstvty aganst the specfcty of a bnary classfer as the threshold for dscrmnaton s vared. Let the data space be
More informationOn the correction of the h-index for career length
1 On the correcton of the h-ndex for career length by L. Egghe Unverstet Hasselt (UHasselt), Campus Depenbeek, Agoralaan, B-3590 Depenbeek, Belgum 1 and Unverstet Antwerpen (UA), IBW, Stadscampus, Venusstraat
More informationA New Scrambling Evaluation Scheme based on Spatial Distribution Entropy and Centroid Difference of Bit-plane
A New Scramblng Evaluaton Scheme based on Spatal Dstrbuton Entropy and Centrod Dfference of Bt-plane Lang Zhao *, Avshek Adhkar Kouch Sakura * * Graduate School of Informaton Scence and Electrcal Engneerng,
More informationOnline Classification: Perceptron and Winnow
E0 370 Statstcal Learnng Theory Lecture 18 Nov 8, 011 Onlne Classfcaton: Perceptron and Wnnow Lecturer: Shvan Agarwal Scrbe: Shvan Agarwal 1 Introducton In ths lecture we wll start to study the onlne learnng
More informationA New Refinement of Jacobi Method for Solution of Linear System Equations AX=b
Int J Contemp Math Scences, Vol 3, 28, no 17, 819-827 A New Refnement of Jacob Method for Soluton of Lnear System Equatons AX=b F Naem Dafchah Department of Mathematcs, Faculty of Scences Unversty of Gulan,
More informationNeural networks. Nuno Vasconcelos ECE Department, UCSD
Neural networs Nuno Vasconcelos ECE Department, UCSD Classfcaton a classfcaton problem has two types of varables e.g. X - vector of observatons (features) n the world Y - state (class) of the world x X
More informationAn Enterprise Competitive Capability Evaluation Based on Rough Sets
An Enterprse Compettve Capablty Evaluaton Based on Rough Sets 59 An Enterprse Compettve Capablty Evaluaton Based on Rough Sets Mng-Chang Lee Department of Informaton Management Fooyn Unversty, Kaohsung,
More informationKernels in Support Vector Machines. Based on lectures of Martin Law, University of Michigan
Kernels n Support Vector Machnes Based on lectures of Martn Law, Unversty of Mchgan Non Lnear separable problems AND OR NOT() The XOR problem cannot be solved wth a perceptron. XOR Per Lug Martell - Systems
More informationVQ widely used in coding speech, image, and video
at Scalar quantzers are specal cases of vector quantzers (VQ): they are constraned to look at one sample at a tme (memoryless) VQ does not have such constrant better RD perfomance expected Source codng
More informationInexact Newton Methods for Inverse Eigenvalue Problems
Inexact Newton Methods for Inverse Egenvalue Problems Zheng-jan Ba Abstract In ths paper, we survey some of the latest development n usng nexact Newton-lke methods for solvng nverse egenvalue problems.
More informationBOOTSTRAP METHOD FOR TESTING OF EQUALITY OF SEVERAL MEANS. M. Krishna Reddy, B. Naveen Kumar and Y. Ramu
BOOTSTRAP METHOD FOR TESTING OF EQUALITY OF SEVERAL MEANS M. Krshna Reddy, B. Naveen Kumar and Y. Ramu Department of Statstcs, Osmana Unversty, Hyderabad -500 007, Inda. nanbyrozu@gmal.com, ramu0@gmal.com
More informationThe Minimum Universal Cost Flow in an Infeasible Flow Network
Journal of Scences, Islamc Republc of Iran 17(2): 175-180 (2006) Unversty of Tehran, ISSN 1016-1104 http://jscencesutacr The Mnmum Unversal Cost Flow n an Infeasble Flow Network H Saleh Fathabad * M Bagheran
More informationAggregation of Social Networks by Divisive Clustering Method
ggregaton of Socal Networks by Dvsve Clusterng Method mne Louat and Yves Lechaveller INRI Pars-Rocquencourt Rocquencourt, France {lzennyr.da_slva, Yves.Lechevaller, Fabrce.Ross}@nra.fr HCSD Beng October
More informationComparison of the Population Variance Estimators. of 2-Parameter Exponential Distribution Based on. Multiple Criteria Decision Making Method
Appled Mathematcal Scences, Vol. 7, 0, no. 47, 07-0 HIARI Ltd, www.m-hkar.com Comparson of the Populaton Varance Estmators of -Parameter Exponental Dstrbuton Based on Multple Crtera Decson Makng Method
More information10-701/ Machine Learning, Fall 2005 Homework 3
10-701/15-781 Machne Learnng, Fall 2005 Homework 3 Out: 10/20/05 Due: begnnng of the class 11/01/05 Instructons Contact questons-10701@autonlaborg for queston Problem 1 Regresson and Cross-valdaton [40
More informationA Bayes Algorithm for the Multitask Pattern Recognition Problem Direct Approach
A Bayes Algorthm for the Multtask Pattern Recognton Problem Drect Approach Edward Puchala Wroclaw Unversty of Technology, Char of Systems and Computer etworks, Wybrzeze Wyspanskego 7, 50-370 Wroclaw, Poland
More informationConstructing Rough Set Based Unbalanced Binary Tree for Feature Selection
hnese Journal of Electroncs Vol.23, No.3, July 2014 onstructng Rough Set Based Unbalanced Bnary Tree for Feature Selecton LU Zhengca 1,QINZheng 1,2,JINQao 1 and LI Shengnan 1 (1.Department of omputer Scence
More informationCS 3710: Visual Recognition Classification and Detection. Adriana Kovashka Department of Computer Science January 13, 2015
CS 3710: Vsual Recognton Classfcaton and Detecton Adrana Kovashka Department of Computer Scence January 13, 2015 Plan for Today Vsual recognton bascs part 2: Classfcaton and detecton Adrana s research
More informationModelli Clamfim Equazione del Calore Lezione ottobre 2014
CLAMFIM Bologna Modell 1 @ Clamfm Equazone del Calore Lezone 17 15 ottobre 2014 professor Danele Rtell danele.rtell@unbo.t 1/24? Convoluton The convoluton of two functons g(t) and f(t) s the functon (g
More informationTransient Stability Assessment of Power System Based on Support Vector Machine
ransent Stablty Assessment of Power System Based on Support Vector Machne Shengyong Ye Yongkang Zheng Qngquan Qan School of Electrcal Engneerng, Southwest Jaotong Unversty, Chengdu 610031, P. R. Chna Abstract
More informationAn improved multi-objective evolutionary algorithm based on point of reference
IOP Conference Seres: Materals Scence and Engneerng PAPER OPEN ACCESS An mproved mult-objectve evolutonary algorthm based on pont of reference To cte ths artcle: Boy Zhang et al 08 IOP Conf. Ser.: Mater.
More informationResearch Article Green s Theorem for Sign Data
Internatonal Scholarly Research Network ISRN Appled Mathematcs Volume 2012, Artcle ID 539359, 10 pages do:10.5402/2012/539359 Research Artcle Green s Theorem for Sgn Data Lous M. Houston The Unversty of
More informationRotation Invariant Shape Contexts based on Feature-space Fourier Transformation
Fourth Internatonal Conference on Image and Graphcs Rotaton Invarant Shape Contexts based on Feature-space Fourer Transformaton Su Yang 1, Yuanyuan Wang Dept of Computer Scence and Engneerng, Fudan Unversty,
More information829. An adaptive method for inertia force identification in cantilever under moving mass
89. An adaptve method for nerta force dentfcaton n cantlever under movng mass Qang Chen 1, Mnzhuo Wang, Hao Yan 3, Haonan Ye 4, Guola Yang 5 1,, 3, 4 Department of Control and System Engneerng, Nanng Unversty,
More informationMachine Learning. Classification. Theory of Classification and Nonparametric Classifier. Representing data: Hypothesis (classifier) Eric Xing
Machne Learnng 0-70/5 70/5-78, 78, Fall 008 Theory of Classfcaton and Nonarametrc Classfer Erc ng Lecture, Setember 0, 008 Readng: Cha.,5 CB and handouts Classfcaton Reresentng data: M K Hyothess classfer
More informationFUZZY GOAL PROGRAMMING VS ORDINARY FUZZY PROGRAMMING APPROACH FOR MULTI OBJECTIVE PROGRAMMING PROBLEM
Internatonal Conference on Ceramcs, Bkaner, Inda Internatonal Journal of Modern Physcs: Conference Seres Vol. 22 (2013) 757 761 World Scentfc Publshng Company DOI: 10.1142/S2010194513010982 FUZZY GOAL
More informationDepartment of Statistics University of Toronto STA305H1S / 1004 HS Design and Analysis of Experiments Term Test - Winter Solution
Department of Statstcs Unversty of Toronto STA35HS / HS Desgn and Analyss of Experments Term Test - Wnter - Soluton February, Last Name: Frst Name: Student Number: Instructons: Tme: hours. Ads: a non-programmable
More informationCOMPARISON OF SOME RELIABILITY CHARACTERISTICS BETWEEN REDUNDANT SYSTEMS REQUIRING SUPPORTING UNITS FOR THEIR OPERATIONS
Avalable onlne at http://sck.org J. Math. Comput. Sc. 3 (3), No., 6-3 ISSN: 97-537 COMPARISON OF SOME RELIABILITY CHARACTERISTICS BETWEEN REDUNDANT SYSTEMS REQUIRING SUPPORTING UNITS FOR THEIR OPERATIONS
More informationBayesian predictive Configural Frequency Analysis
Psychologcal Test and Assessment Modelng, Volume 54, 2012 (3), 285-292 Bayesan predctve Confgural Frequency Analyss Eduardo Gutérrez-Peña 1 Abstract Confgural Frequency Analyss s a method for cell-wse
More informationGaussian Mixture Models
Lab Gaussan Mxture Models Lab Objectve: Understand the formulaton of Gaussan Mxture Models (GMMs) and how to estmate GMM parameters. You ve already seen GMMs as the observaton dstrbuton n certan contnuous
More informationAGGREGATION OF FUZZY OPINIONS UNDER GROUP DECISION-MAKING BASED ON SIMILARITY AND DISTANCE
Jrl Syst Sc & Complexty (2006) 19: 63 71 AGGREGATION OF FUZZY OPINIONS UNDER GROUP DECISION-MAKING BASED ON SIMILARITY AND DISTANCE Chengguo LU Jbn LAN Zhongxng WANG Receved: 6 December 2004 / Revsed:
More informationInteractive Bi-Level Multi-Objective Integer. Non-linear Programming Problem
Appled Mathematcal Scences Vol 5 0 no 65 3 33 Interactve B-Level Mult-Objectve Integer Non-lnear Programmng Problem O E Emam Department of Informaton Systems aculty of Computer Scence and nformaton Helwan
More informationStudy of Selective Ensemble Learning Methods Based on Support Vector Machine
Avalable onlne at www.scencedrect.com Physcs Proceda 33 (2012 ) 1518 1525 2012 Internatonal Conference on Medcal Physcs and Bomedcal Engneerng Study of Selectve Ensemble Learnng Methods Based on Support
More informationLinear Classification, SVMs and Nearest Neighbors
1 CSE 473 Lecture 25 (Chapter 18) Lnear Classfcaton, SVMs and Nearest Neghbors CSE AI faculty + Chrs Bshop, Dan Klen, Stuart Russell, Andrew Moore Motvaton: Face Detecton How do we buld a classfer to dstngush
More informationSparse Gaussian Processes Using Backward Elimination
Sparse Gaussan Processes Usng Backward Elmnaton Lefeng Bo, Lng Wang, and Lcheng Jao Insttute of Intellgent Informaton Processng and Natonal Key Laboratory for Radar Sgnal Processng, Xdan Unversty, X an
More informationClassification with Reject Option in Text Categorisation Systems
Classfcaton wth Reject Opton n Text Categorsaton Systems Gorgo Fumera Ignazo Plla Fabo Rol Dept. of Electrcal and Electronc Eng., Pazza d Arm, 0923 Caglar, Italy {fumera,plla,rol}@dee.unca.t Abstract The
More informationOnline Appendix to: Axiomatization and measurement of Quasi-hyperbolic Discounting
Onlne Appendx to: Axomatzaton and measurement of Quas-hyperbolc Dscountng José Lus Montel Olea Tomasz Strzaleck 1 Sample Selecton As dscussed before our ntal sample conssts of two groups of subjects. Group
More informationReview of Taylor Series. Read Section 1.2
Revew of Taylor Seres Read Secton 1.2 1 Power Seres A power seres about c s an nfnte seres of the form k = 0 k a ( x c) = a + a ( x c) + a ( x c) + a ( x c) k 2 3 0 1 2 3 + In many cases, c = 0, and the
More informationTHE ARIMOTO-BLAHUT ALGORITHM FOR COMPUTATION OF CHANNEL CAPACITY. William A. Pearlman. References: S. Arimoto - IEEE Trans. Inform. Thy., Jan.
THE ARIMOTO-BLAHUT ALGORITHM FOR COMPUTATION OF CHANNEL CAPACITY Wllam A. Pearlman 2002 References: S. Armoto - IEEE Trans. Inform. Thy., Jan. 1972 R. Blahut - IEEE Trans. Inform. Thy., July 1972 Recall
More informationEnsemble Methods: Boosting
Ensemble Methods: Boostng Ncholas Ruozz Unversty of Texas at Dallas Based on the sldes of Vbhav Gogate and Rob Schapre Last Tme Varance reducton va baggng Generate new tranng data sets by samplng wth replacement
More informationNeryškioji dichotominių testo klausimų ir socialinių rodiklių diferencijavimo savybių klasifikacija
Neryškoj dchotomnų testo klausmų r socalnų rodklų dferencjavmo savybų klasfkacja Aleksandras KRYLOVAS, Natalja KOSAREVA, Julja KARALIŪNAITĖ Technologcal and Economc Development of Economy Receved 9 May
More informationParametric fractional imputation for missing data analysis. Jae Kwang Kim Survey Working Group Seminar March 29, 2010
Parametrc fractonal mputaton for mssng data analyss Jae Kwang Km Survey Workng Group Semnar March 29, 2010 1 Outlne Introducton Proposed method Fractonal mputaton Approxmaton Varance estmaton Multple mputaton
More informationStatistical pattern recognition
Statstcal pattern recognton Bayes theorem Problem: decdng f a patent has a partcular condton based on a partcular test However, the test s mperfect Someone wth the condton may go undetected (false negatve
More information