Dynamic Ensemble Selection and Instantaneous Pruning for Regression
Kaushala Dias and Terry Windeatt
Centre for Vision Speech and Signal Processing, Faculty of Engineering and Physical Sciences, University of Surrey, Guildford, Surrey, GU2 7XH, United Kingdom

Abstract. A novel dynamic method of selecting pruned ensembles of predictors for regression problems is presented. The proposed method, known henceforth as DESIP, enhances the prediction accuracy and generalization ability of pruning methods. Pruning heuristics attempt to combine accurate yet complementary members; DESIP therefore enhances performance by modifying the pruned aggregation, distributing the ensemble member selection over the entire dataset. Four static ensemble pruning approaches used in regression are compared to highlight the performance improvement yielded by the dynamic method. Experimental comparison is made using Multi-Layer Perceptron predictors on benchmark datasets.

1 Introduction

In the context of ensemble methods, it is recognized that the combined outputs of several regressors generally give improved accuracy compared to a single predictor [1]. It has also been shown that ensemble members that are complementary can be selected to further improve the performance [1]. The selection, also called pruning, has the potential advantage of both reduced ensemble size and improved accuracy. However, the selection of classifiers, rather than regressors, has previously received more attention and given rise to many different approaches to pruning [2]. Some of these methods have been adapted to the regression problem [3]. The dynamic pruning methods in [4, 5] are classification oriented and rely on functions that determine the ensemble selection based on information from the training set. The proposed novel dynamic method for regression similarly uses the training set to determine the ensemble selection. By dynamic, we mean that the subset of predictors is chosen differently depending on the test sample and its relationship to the training set.
2 Related Research

The main objective of using ensemble methods in regression problems is to harness the complementarity of individual ensemble member predictions [1]. In Negative Correlation Learning, diversity of the predictors is introduced by simultaneously training a collection of predictors using a cost function that includes a correlation penalty term [3], thereby collectively enhancing the performance of the entire ensemble. By weighting the outputs of the ensemble members before aggregating, an optimal set of weights is obtained in [9] by minimizing a function that estimates the generalization error of the ensemble, this optimization being achieved using genetic algorithms. With this approach, predictors with weights below a certain level are removed from the ensemble. In [3] genetic algorithms have been utilized to extract sub-ensembles from larger ensembles. In Stacked Generalization, a meta-learner is trained with the outputs of each predictor to produce the final output [1]. Empirical evidence shows that this approach tends to over-fit, but with regularization techniques for pruning ensembles over-fitting is eliminated. Ensemble pruning by Semi-definite Programming has been used to find a sub-optimal ensemble in [3]. In the dynamic ensemble selection approach of [4], many ensembles that perform well on an optimization set or a validation set are searched from a pool of over-produced ensembles, and from these the best ensemble is selected using a selection function for computing the final output for the test sample. Similarly, a dynamic multistage organizational method based on contextual information of the training data is used to select the best ensemble for classification in [5]. Recursive Feature Elimination has been used in [7] as a method of pruning ensembles. Here the weights of a trained combiner are evaluated to determine the least performing predictor, which is then removed from the ensemble.

2.1 Reduced Error Pruning

Reduced Error Pruning without back fitting (RE) [2], modified for regression problems, is used to establish the order of regressors in the ensemble that produces a minimum in the ensemble training error. Starting with the regressor that produces the lowest training error, the remaining regressors are subsequently incorporated one at a time into the ensemble to achieve a minimum ensemble error. The sub-ensemble S_u is constructed by incorporating into S_{u-1} the regressor that minimizes

    s_u = argmin_k (1/u) [ Σ_{i=1}^{u-1} C_{s_i} + C_k ]        (1)

where k ∈ (1,...,M)\{s_1, s_2,...,s_{u-1}} and {s_1, s_2,...,s_{u-1}} label the regressors that have been incorporated in the pruned ensemble at iteration u-1.
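The ordering rule of equation (1) can be illustrated with a minimal Python sketch. This is not the paper's implementation: it assumes the training error C_k of each regressor has already been computed, and the numeric values are illustrative only.

```python
# Minimal sketch of the Reduced Error Pruning ordering of equation (1).
# C[k] is the training error of regressor k; values below are illustrative.

def re_order(C):
    """Return regressor indices in the order they join the pruned ensemble."""
    M = len(C)
    order = []                       # s_1, s_2, ..., s_M
    remaining = set(range(M))
    while remaining:
        u = len(order) + 1
        # Pick the regressor k minimizing (1/u) * (sum of C over already
        # selected members + C_k); the first pick is simply argmin of C.
        base = sum(C[s] for s in order)
        k = min(remaining, key=lambda k: (base + C[k]) / u)
        order.append(k)
        remaining.remove(k)
    return order

C = [0.9, 0.2, 0.5, 0.4]             # illustrative training errors
print(re_order(C))                   # regressor 1 (lowest error) comes first
```

Note that because the criterion depends only on the error vector C, within each iteration the shared term Σ C_{s_i} is constant over candidates, so the resulting order is the ascending order of C.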
For static ensemble selection, C_i is calculated over the entire training set and expressed as

    C_i = Σ_{n=1}^{N} ( f_i(x_n) - y_n )²        (2)

where i = 1,2,...,M, f_i(x) is the output of the i-th regressor and (x_n, y_n) is the training data, with n = 1,2,...,N. Therefore the information required for the optimization of the training error is contained in the vector C.

3 Method

In contrast to static ensemble selection, Dynamic Ensemble Selection with Instantaneous Pruning (DESIP) provides an ensemble tailored to the specific test instance based on the information of the training set. The method described here is for a regression problem where the regressors are ordered for every individual training instance based on the method of RE. Therefore each ensemble selection for every
training instance contains the same regressors as constituent members but aggregated in a different order. However, potentially this dynamic method can be implemented with any pruning technique.

The implementation of DESIP consists of two stages. First the M base regressors are trained on bootstrap samples of the training dataset and the regressor order is found for every instance in the training set. As shown in the pseudo-code in figure 1, this is achieved by building a series of nested ensembles, per training instance, in which the ensemble of size u contains the ensemble of size u-1. Taking a single instance of the training set, the method starts with an empty ensemble S, in step 2, and builds the ensemble order, in steps 6 to 15, by evaluating the training error of each regressor in M. The regressor that increases the ensemble training error least is iteratively added to S. This is achieved by minimizing z in step 9. Therefore each regressor in M takes a unique position in S as S grows. This order is archived in a two-dimensional matrix A with regressor order in rows and training instances in columns.

Training data D = (x_n, y_n), where n = 1,2,...,N, and f_m is a regressor, where m = 1,2,...,M. The archive matrix A = (a_n), where a_n is a column vector of length M. S is also a vector of length M.

 1. For n = 1...N
 2.   S ← empty vector
 3.   For m = 1...M
 4.     Evaluate C_m = ( f_m(x_n) - y_n )²
 5.   End for
 6.   For u = 1...M
 7.     min ← +∞
 8.     For k in (1,...,M)\{S_1, S_2,...,S_{u-1}}
 9.       Evaluate z = (1/u) [ Σ_{i=1}^{u-1} C_{S_i} + C_k ]
10.       If z < min
11.         S_u ← k
12.         min ← z
13.       End if
14.     End for
15.   End for
16.   a_n ← S
17. End for

Fig 1: Pseudo-code implementing the archive matrix with ordered ensemble per training instance.

In the second stage, the regressor order associated with the training instance closest to the test instance is retrieved from matrix A. Here the closeness is determined by calculating the L1 norm of the distance between the test instance and each training instance. This is performed in steps 1 to 6 in figure 2. All input features of the training set are considered to identify the closest training instance, using the K-Nearest Neighbours method [6], where K = 1. The resulting vector g_n, where n is the index of the training instance, is searched for the minimum value, which identifies the closest training instance to be retrieved from A. The selected ensemble has the order of regressors determined by that training instance.

Test and train instances x_{f,test} and x_{f,n,train}, where f = 1,2,...,F features. From figure 1, archive matrix A = (a_n), where a_n is the column vector containing the order of regressors, n = 1,2,...,N. e_f is a vector of length F and g_n is a vector of length N.

1. For n = 1...N
2.   For f = 1...F
3.     Evaluate e_f = | x_{f,n,train} - x_{f,test} |
4.   End for
5.   g_n = Σ_{f=1}^{F} e_f
6. End for
7. Search for the minimum value in g_n and note n
8. a_n is the ensemble selection for the test instance.

Fig 2: Pseudo-code implementing the identification of the closest training instance to the test instance.

In the implementation of DESIP with RE, equation (2) is modified so that C_i is calculated for every training instance. The modified equation is shown in equation (3):

    C_i = ( f_i(x_n) - y_n )²        (3)

Consequently s_u in equation (1) is also calculated for every training instance. For the comparison of DESIP with static methods, four static pruning methods were implemented with DESIP: Ordered Aggregation (OA) as described in [3] for regression, Recursive Feature Elimination (RFE) [7], ensemble optimization using a Genetic Algorithm (GA) [3], and Reduced Error Pruning without back fitting (RE), described in Section 2.1. The datasets listed in table 3 have been used for this comparison.

4 Results

An MLP architecture using the Levenberg-Marquardt learning algorithm with 5 nodes in the hidden layer, as described in [3], has been selected in this experiment. The training/test data split is 70/30 percent, and 32 base regressors are trained with bootstrap samples. The Mean Squared Error (MSE) is used as the performance indicator for both training and test sets, and averaged over 100 iterations. Tables 1 and 2 show the MSE performance of the four static methods with and without DESIP.
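The two stages of figures 1 and 2 can be sketched in Python as follows. This is a minimal illustration, not the paper's implementation: the base regressors are stand-in toy models rather than trained MLPs, the per-instance error is taken as squared error as in equation (3), and the data are invented for demonstration.

```python
# Sketch of DESIP (figures 1 and 2): stage 1 builds the archive matrix A of
# per-training-instance regressor orderings; stage 2 retrieves the ordering of
# the training instance closest (L1 distance, K = 1) to the test instance.
# The toy models and data below are illustrative only.

def build_archive(models, X_train, y_train):
    """Stage 1 (figure 1): one RE ordering per training instance."""
    A = []
    for x, y in zip(X_train, y_train):
        C = [(f(x) - y) ** 2 for f in models]   # step 4: per-instance errors
        S, remaining = [], set(range(len(models)))
        while remaining:                        # steps 6-15: nested ensembles
            u = len(S) + 1
            base = sum(C[s] for s in S)
            k = min(remaining, key=lambda k: (base + C[k]) / u)  # minimise z
            S.append(k)
            remaining.remove(k)
        A.append(S)                             # step 16: archive the order
    return A

def select_for(x_test, X_train, A):
    """Stage 2 (figure 2): ordering of the nearest training instance."""
    dists = [sum(abs(a - b) for a, b in zip(x, x_test)) for x in X_train]
    return A[dists.index(min(dists))]

models = [lambda x: x[0], lambda x: 2 * x[0], lambda x: x[0] + 1.5]
X_train = [[0.5], [1.0]]
y_train = [0.5, 2.0]
A = build_archive(models, X_train, y_train)
print(select_for([0.9], X_train, A))   # ordering archived for the nearer instance [1.0]
```

Every test instance thus reuses the same M trained regressors; only the aggregation order changes, so the archive can be built once offline and selection at test time reduces to a 1-NN lookup.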
In tables 1 and 2, grayed results indicate the minimum MSE over the eight methods. It is observed that the majority of the lowest MSE values have been achieved by DESIP. Figure 3 shows the comparison of the training and the test error plots of static methods and DESIP (with RE implemented) for the Servo dataset. It is observed that pruned ensembles with DESIP are more accurate with fewer members than static methods.

Fig 3: Comparison of the MSE plots of the test set (Servo / Test MSE) and the training set (Servo / Train MSE) for OA, RFE and DESIP using RE; each panel plots mean squared error against subensemble size.

Dataset           Multiplier   OA    RFE   GA    RE
Servo                          ±     ±     ±     ±0.24
Boston Housing                 ±     ±     ±     ±0.64
Forest Fires                   ±     ±     ±     ±0.07
Wisconsin                      ±     ±     ±     ±0.12
Concrete Slump                 ±     ±     ±     ±0.36
Auto                           ±     ±     ±     ±0.93
Auto Price                     ±     ±     ±     ±0.63
Body Fat                       ±     ±     ±     ±2.12
Bolts                          ±     ±     ±     ±4.69
Pollution                      ±     ±     ±     ±0.21
Sensory                        ±     ±     ±     ±0.26

Table 1: Static ensemble pruning methods: averaged MSE with standard deviation over the 100 iterations.

Dataset           Multiplier   OA    RFE   GA    RE
Servo                          ±     ±     ±     ±0.21
Boston Housing                 ±     ±     ±     ±0.78
Forest Fires                   ±     ±     ±     ±0.09
Wisconsin                      ±     ±     ±     ±0.28
Concrete Slump                 ±     ±     ±     ±0.56
Auto                           ±     ±     ±     ±0.83
Auto Price                     ±     ±     ±     ±0.63
Body Fat                       ±     ±     ±     ±1.57
Bolts                          ±     ±     ±     ±2.43
Pollution                      ±     ±     ±     ±0.26
Sensory                        ±     ±     ±     ±0.16

Table 2: DESIP with the static methods adopted: averaged MSE with standard deviation over the 100 iterations.
5 Conclusion

Dataset           Instances   Attributes   Source
Servo                                      UCI Repository
Boston Housing                             UCI Repository
Forest Fires                               UCI Repository
Wisconsin                                  UCI Repository
Concrete Slump                             UCI Repository
Auto                                       WEKA
Auto Price                                 WEKA
Body Fat                                   WEKA
Bolts             40          8            WEKA
Pollution                                  WEKA
Sensory                                    WEKA

Table 3: Benchmark datasets used.

Dynamic ensemble pruning utilizes a distributed approach to ensemble selection and is an active area of research for both classification and regression problems. In this paper, a novel method for the dynamic pruning of regression ensembles is proposed. Experimental results show that the test error is reduced by modifying the pruning based on the closest training instance. On a few datasets the proposed method has not improved performance; this will be investigated further, along with different distance measures, varying K for K-NN, and relevant feature selection. Bias/variance and time complexity analyses should also help to understand the performance relative to other static and dynamic pruning methods of similar complexity.

References

[1] Tsoumakas G., Partalas I., Vlahavas I., An Ensemble Pruning Primer. In: Supervised and Unsupervised Ensemble Methods and their Applications, Studies in Computational Intelligence, Volume 245, Springer, 2009.
[2] Martínez-Muñoz G., Hernández-Lobato D., Suárez A., An Analysis of Ensemble Pruning Techniques Based on Ordered Aggregation. IEEE Transactions on Pattern Analysis and Machine Intelligence, 31(2), 2009.
[3] Hernández-Lobato D., Martínez-Muñoz G., Suárez A., Empirical Analysis and Evaluation of Approximate Techniques for Pruning Regression Bagging Ensembles. Neurocomputing 74, 2011.
[4] Dos Santos E.M., Sabourin R., Maupin P., A Dynamic Overproduce-and-Choose Strategy for the Selection of Classifier Ensembles. Pattern Recognition 41, 2008.
[5] Cavalin P.R., Sabourin R., Suen C.Y., Dynamic Selection of Ensembles of Classifiers Using Contextual Information. In: Multiple Classifier Systems, LNCS Volume 5997, Springer, 2010.
[6] Dubey H., Pudi V., CLUEKR: Clustering Based Efficient K-NN Regression. In: Advances in Knowledge Discovery and Data Mining, LNCS Volume 7818, Springer, 2013.
[7] Windeatt T., Dias K., Feature Ranking Ensembles for Facial Action Unit Classification. IAPR Third International Workshop on Artificial Neural Networks in Pattern Recognition.
[8] Cavalin P.R., Sabourin R., Suen C.Y., Dynamic Selection Approaches for Multiple Classifier Systems. Neural Computing and Applications, Volume 22(3-4), 2013.
[9] Zhou Z.-H., Wu J., Tang W., Ensembling Neural Networks: Many Could Be Better Than All. Artificial Intelligence, Volume 137, 2002.
CS 189 Introducton to Machne Learnng Sprng 2018 Note 26 1 Boostng We have seen that n the case of random forests, combnng many mperfect models can produce a snglodel that works very well. Ths s the dea
More informationRBF Neural Network Model Training by Unscented Kalman Filter and Its Application in Mechanical Fault Diagnosis
Appled Mechancs and Materals Submtted: 24-6-2 ISSN: 662-7482, Vols. 62-65, pp 2383-2386 Accepted: 24-6- do:.428/www.scentfc.net/amm.62-65.2383 Onlne: 24-8- 24 rans ech Publcatons, Swtzerland RBF Neural
More informationPHYS 450 Spring semester Lecture 02: Dealing with Experimental Uncertainties. Ron Reifenberger Birck Nanotechnology Center Purdue University
PHYS 45 Sprng semester 7 Lecture : Dealng wth Expermental Uncertantes Ron Refenberger Brck anotechnology Center Purdue Unversty Lecture Introductory Comments Expermental errors (really expermental uncertantes)
More informationFUZZY GOAL PROGRAMMING VS ORDINARY FUZZY PROGRAMMING APPROACH FOR MULTI OBJECTIVE PROGRAMMING PROBLEM
Internatonal Conference on Ceramcs, Bkaner, Inda Internatonal Journal of Modern Physcs: Conference Seres Vol. 22 (2013) 757 761 World Scentfc Publshng Company DOI: 10.1142/S2010194513010982 FUZZY GOAL
More informationSolving Nonlinear Differential Equations by a Neural Network Method
Solvng Nonlnear Dfferental Equatons by a Neural Network Method Luce P. Aarts and Peter Van der Veer Delft Unversty of Technology, Faculty of Cvlengneerng and Geoscences, Secton of Cvlengneerng Informatcs,
More informationDepartment of Quantitative Methods & Information Systems. Time Series and Their Components QMIS 320. Chapter 6
Department of Quanttatve Methods & Informaton Systems Tme Seres and Ther Components QMIS 30 Chapter 6 Fall 00 Dr. Mohammad Zanal These sldes were modfed from ther orgnal source for educatonal purpose only.
More informationSystem identifications by SIRMs models with linear transformation of input variables
ORIGINAL RESEARCH System dentfcatons by SIRMs models wth lnear transformaton of nput varables Hrofum Myama, Nortaka Shge, Hrom Myama Graduate School of Scence and Engneerng, Kagoshma Unversty, Japan Receved:
More informationComparison of Regression Lines
STATGRAPHICS Rev. 9/13/2013 Comparson of Regresson Lnes Summary... 1 Data Input... 3 Analyss Summary... 4 Plot of Ftted Model... 6 Condtonal Sums of Squares... 6 Analyss Optons... 7 Forecasts... 8 Confdence
More informationUnified Subspace Analysis for Face Recognition
Unfed Subspace Analyss for Face Recognton Xaogang Wang and Xaoou Tang Department of Informaton Engneerng The Chnese Unversty of Hong Kong Shatn, Hong Kong {xgwang, xtang}@e.cuhk.edu.hk Abstract PCA, LDA
More informationTensor Subspace Analysis
Tensor Subspace Analyss Xaofe He 1 Deng Ca Partha Nyog 1 1 Department of Computer Scence, Unversty of Chcago {xaofe, nyog}@cs.uchcago.edu Department of Computer Scence, Unversty of Illnos at Urbana-Champagn
More informationLecture Notes on Linear Regression
Lecture Notes on Lnear Regresson Feng L fl@sdueducn Shandong Unversty, Chna Lnear Regresson Problem In regresson problem, we am at predct a contnuous target value gven an nput feature vector We assume
More informationLinear Feature Engineering 11
Lnear Feature Engneerng 11 2 Least-Squares 2.1 Smple least-squares Consder the followng dataset. We have a bunch of nputs x and correspondng outputs y. The partcular values n ths dataset are x y 0.23 0.19
More informationHomework Assignment 3 Due in class, Thursday October 15
Homework Assgnment 3 Due n class, Thursday October 15 SDS 383C Statstcal Modelng I 1 Rdge regresson and Lasso 1. Get the Prostrate cancer data from http://statweb.stanford.edu/~tbs/elemstatlearn/ datasets/prostate.data.
More informationFeature Selection: Part 1
CSE 546: Machne Learnng Lecture 5 Feature Selecton: Part 1 Instructor: Sham Kakade 1 Regresson n the hgh dmensonal settng How do we learn when the number of features d s greater than the sample sze n?
More informationArmy Ants Tunneling for Classical Simulations
Electronc Supplementary Materal (ESI) for Chemcal Scence. Ths journal s The Royal Socety of Chemstry 2014 electronc supplementary nformaton (ESI) for Chemcal Scence Army Ants Tunnelng for Classcal Smulatons
More informationStatistics for Economics & Business
Statstcs for Economcs & Busness Smple Lnear Regresson Learnng Objectves In ths chapter, you learn: How to use regresson analyss to predct the value of a dependent varable based on an ndependent varable
More informationLecture 12: Classification
Lecture : Classfcaton g Dscrmnant functons g The optmal Bayes classfer g Quadratc classfers g Eucldean and Mahalanobs metrcs g K Nearest Neghbor Classfers Intellgent Sensor Systems Rcardo Guterrez-Osuna
More informationLearning with Tensor Representation
Report No. UIUCDCS-R-2006-276 UILU-ENG-2006-748 Learnng wth Tensor Representaton by Deng Ca, Xaofe He, and Jawe Han Aprl 2006 Learnng wth Tensor Representaton Deng Ca Xaofe He Jawe Han Department of Computer
More informationLecture 23: Artificial neural networks
Lecture 23: Artfcal neural networks Broad feld that has developed over the past 20 to 30 years Confluence of statstcal mechancs, appled math, bology and computers Orgnal motvaton: mathematcal modelng of
More information1 Convex Optimization
Convex Optmzaton We wll consder convex optmzaton problems. Namely, mnmzaton problems where the objectve s convex (we assume no constrants for now). Such problems often arse n machne learnng. For example,
More informationFORECASTING EXCHANGE RATE USING SUPPORT VECTOR MACHINES
Proceedngs of the Fourth Internatonal Conference on Machne Learnng and Cybernetcs, Guangzhou, 8- August 005 FORECASTING EXCHANGE RATE USING SUPPORT VECTOR MACHINES DING-ZHOU CAO, SU-LIN PANG, YUAN-HUAI
More informationSingle-Facility Scheduling over Long Time Horizons by Logic-based Benders Decomposition
Sngle-Faclty Schedulng over Long Tme Horzons by Logc-based Benders Decomposton Elvn Coban and J. N. Hooker Tepper School of Busness, Carnege Mellon Unversty ecoban@andrew.cmu.edu, john@hooker.tepper.cmu.edu
More informationInternet Engineering. Jacek Mazurkiewicz, PhD Softcomputing. Part 3: Recurrent Artificial Neural Networks Self-Organising Artificial Neural Networks
Internet Engneerng Jacek Mazurkewcz, PhD Softcomputng Part 3: Recurrent Artfcal Neural Networks Self-Organsng Artfcal Neural Networks Recurrent Artfcal Neural Networks Feedback sgnals between neurons Dynamc
More informationClassification. Representing data: Hypothesis (classifier) Lecture 2, September 14, Reading: Eric CMU,
Machne Learnng 10-701/15-781, 781, Fall 2011 Nonparametrc methods Erc Xng Lecture 2, September 14, 2011 Readng: 1 Classfcaton Representng data: Hypothess (classfer) 2 1 Clusterng 3 Supervsed vs. Unsupervsed
More informationAn Evolutionary Method of Neural Network in System Identification
Internatonal Journal of Intellgent Informaton Systems 206; 5(5): 75-8 http://www.scencepublshnggroup.com//s do: 0.648/.s.2060505.4 ISSN: 2328-7675 (Prnt); ISSN: 2328-7683 (Onlne) An Evolutonary Method
More informationChapter Newton s Method
Chapter 9. Newton s Method After readng ths chapter, you should be able to:. Understand how Newton s method s dfferent from the Golden Secton Search method. Understand how Newton s method works 3. Solve
More informationCS 331 DESIGN AND ANALYSIS OF ALGORITHMS DYNAMIC PROGRAMMING. Dr. Daisy Tang
CS DESIGN ND NLYSIS OF LGORITHMS DYNMIC PROGRMMING Dr. Dasy Tang Dynamc Programmng Idea: Problems can be dvded nto stages Soluton s a sequence o decsons and the decson at the current stage s based on the
More information