Semi-supervised Classification with Active Query Selection
Jiao Wang and Siwei Luo
School of Computer and Information Technology, Beijing Jiaotong University, Beijing 100044, China

Abstract. Labeled samples are crucial in semi-supervised classification, but which samples should we choose to be the labeled samples? In other words, which samples, if labeled, would provide the most information? We propose a method to solve this problem. First, we give each unlabeled example an initial class label using unsupervised learning. Then, by maximizing the mutual information, we choose the samples carrying the most information to be user-specified labeled samples. After that, we run a semi-supervised algorithm with the user-specified labeled samples to get the final classification. Experimental results on synthetic data show that our algorithm can reach a satisfying classification with active query selection.

1 Introduction

Recently, there has been great interest in semi-supervised classification. The goal of semi-supervised learning is to use unlabeled data to improve the performance of standard supervised learning algorithms. Since in many fields obtaining labeled data is hard or expensive, semi-supervised learning methods that need only a small labeled sample size are of great use. In case the unsupervised learning methods can separate the points well (see e.g. Fig. 1a), there is no need for semi-supervised methods. However, in case of noise (see e.g. Fig. 1b), or in case two modes which belong to two different classes overlap (see e.g. Fig. 1c), semi-supervised learning with a few labeled points in each class can improve the performance significantly.

A number of algorithms have been proposed for semi-supervised learning, including EM [8], co-training [1, 14], tri-training [15], random field models [9, 12], and graph-based approaches [2, 6, 13]. Different methods make different assumptions, and can be used in different situations. In particular, when data resides on a low-dimensional manifold within a high-dimensional representation space, semi-supervised learning methods should be adjusted to work on the manifold.
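The first stage described above — assigning each unlabeled sample an initial soft label with an unsupervised spectral embedding — can be sketched as follows. This is a minimal illustration, not the authors' code: the function names, the choice of the second-smallest eigenvector of the graph Laplacian, and the rescaling to [0, 1] are assumptions made for the sketch.

```python
import numpy as np

def rbf_weights(X, sigma=1.0):
    """Pairwise RBF edge weights w_ij = exp(-||x_i - x_j||^2 / sigma)."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-sq / sigma)
    np.fill_diagonal(W, 0.0)   # no self-loops (they do not affect the Laplacian objective)
    return W

def initial_labels(X, sigma=1.0):
    """Map samples to soft labels in [0, 1] via the second-smallest
    eigenvector of the graph Laplacian L = D - W."""
    W = rbf_weights(X, sigma)
    L = np.diag(W.sum(axis=1)) - W
    _, vecs = np.linalg.eigh(L)        # eigenvalues in ascending order
    f = vecs[:, 1]                     # first eigenvector is constant; take the next one
    return (f - f.min()) / (f.max() - f.min())

# Two clusters: the initial labels should be near-constant within each
# cluster and separate the two clusters.
X = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1],
              [2.0, 0.0], [2.1, 0.0], [2.0, 0.1]])
y0 = initial_labels(X, sigma=1.0)
```

On well-separated clusters the resulting values sit near 0 for one cluster and near 1 for the other, which is exactly the role the initial labels play in the later selection step.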
Belkin gives a solution to this problem with manifold regularization methods in [4]. Query selection is extensively studied in the supervised framework. In [10], the queries are selected to minimize the version space size for support vector machines. In [7], a committee of classifiers is employed, and a point is queried whenever the committee members disagree. Many other methods have been proposed to actively choose the samples in supervised learning, but little has been done to choose samples in semi-supervised learning.

D.-Y. Yeung et al. (Eds.): SSPR&SPR 2006, LNCS 4109, pp. 741–746, 2006. © Springer-Verlag Berlin Heidelberg 2006
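The committee-based criterion of [7] mentioned above can be illustrated with a toy sketch. The threshold classifiers and the use of vote entropy as the disagreement measure are assumptions for illustration, not details from [7]:

```python
from collections import Counter
import math

def vote_entropy(votes):
    """Disagreement of a committee: entropy of the label votes on one point."""
    counts = Counter(votes)
    total = len(votes)
    return -sum((c / total) * math.log(c / total) for c in counts.values())

def query_by_committee(committee, pool):
    """Return the index of the unlabeled point the committee disagrees on most."""
    def disagreement(x):
        return vote_entropy([clf(x) for clf in committee])
    return max(range(len(pool)), key=lambda i: disagreement(pool[i]))

# Hypothetical committee: three threshold classifiers that disagree around x = 0.5.
committee = [lambda x: int(x > 0.3),
             lambda x: int(x > 0.5),
             lambda x: int(x > 0.7)]
pool = [0.1, 0.6, 0.9]
idx = query_by_committee(committee, pool)   # the point where the committee splits 2-1
```

Points on which all committee members agree contribute zero entropy, so the query falls on the point in the region where the members' decision boundaries differ.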
The labeled samples play an important role in semi-supervised learning. Then a question arises: which samples should be the labeled samples? Among the existing semi-supervised learning methods, some choose the labeled samples manually [6]; to do this, one has to have some domain knowledge of which samples most need to be labeled. Some choose the labeled samples randomly, which may not contain the right samples. And in [11], Zhu et al. choose the samples actively by greedily selecting queries from the unlabeled data to minimize the estimated expected classification error, but Zhu's active learning method can only be used together with his semi-supervised learning method.

In this paper, we give a more general and automatic query selection method in the semi-supervised framework. Our method can be applied to most of the existing semi-supervised learning methods: it is a pre-processing step for them. The main idea is to consider which samples, if labeled, would give the most information. Following this idea, we use the mutual information $I(Y; y_i)$ ($y_i$ represents one sample's class label and $Y$ represents all the samples' class labels) as the measure for active query selection. By maximizing the mutual information, we get the sample that most needs to be labeled. Using this method, we can choose the samples to be labeled actively and automatically, and it does not need any domain knowledge. In this paper, in order to explain our method, we work with the Laplacian Eigenmaps [3] and manifold regularization [4] of Belkin to show the entire process, so we can see how the active query selection method works on a manifold. We do not claim that this method can only be used on manifolds; indeed, we aim to show that applying our method to any semi-supervised method would yield satisfying results.

This paper is organized as follows. In Section 2, we introduce our algorithm briefly. Section 3 gives details of every part of our algorithm. Experimental results on synthetic data are shown in Section 4, followed by conclusions in Section 5.

Fig. 1. (a) Example of a situation in which unsupervised learning methods (here we use Laplacian Eigenmaps) can work well. (b)(c) Examples of situations in which unsupervised learning cannot give satisfying results and some labeled samples are needed to help.

2 Our Algorithm

To explain the entire process of our active query selection method, we work with the Laplacian Eigenmaps [3] and manifold regularization [4] of Belkin. The steps are as follows:
Step 1. Give each (unlabeled) sample an initial class label using unsupervised learning. Here, we use Laplacian Eigenmaps to map the samples to a real-valued function $f$.
Step 2. By maximizing the mutual information $I(Y; y_i)$ ($y_i$ represents one sample's class label and $Y$ represents all the samples' class labels), we actively choose the samples with the most uncertain class labels to be user-specified labeled samples.
Step 3. Give the chosen samples their class labels.
Step 4. Run the semi-supervised algorithm (here, we use the manifold regularization algorithm) with the user-specified labeled samples to get the final classification.

3 Details of Our Method

3.1 Using Laplacian Eigenmaps to Get Initial Class Labels

Given a sample set $x_1, \ldots, x_n \in R^m$, construct its neighborhood graph $G = (V, E)$, whose vertices are the sample points $V = \{x_1, \ldots, x_n\}$, and whose edge weights $\{w_{ij}\}$ represent appropriate pairwise similarity relationships between samples. For example, $w_{ij}$ can be the radial basis function

$$w_{ij} = \exp\Big(-\frac{1}{\sigma}\sum_{d=1}^{m}(x_{id} - x_{jd})^2\Big), \quad (1)$$

where $\sigma$ is a scale parameter. The radial basis function ensures that nearby points are assigned large edge weights. We first consider the two-class situation. Assume that $f$ is a real-valued function whose value is bounded between 0 and 1 (0 and 1 each represent a class label). Let $y_i = f(x_i)$ and $Y = (y_1, y_2, \ldots, y_n)^T$. Laplacian Eigenmaps tries to minimize the following objective function:

$$\sum_{i,j}(y_i - y_j)^2 w_{ij}. \quad (2)$$

By minimizing this objective function, we get $y_1, \ldots, y_n$, the initial class label of each sample, with $y_i \in [0, 1]$.

3.2 Using Mutual Information to Choose the Samples with the Most Uncertain Class Labels

This is our active query selection step. We use the mutual information $I(Y; y_i)$ ($y_i$ represents one sample's class label and $Y$ represents all the samples' class labels) as the measure for query selection. By maximizing the mutual information, we get the sample that would give the most information, that is, the one that most needs to be labeled. In order to calculate $I(Y; y_i)$, inspired by the work of [5], we define a Gaussian random field on the vertices of $V$:

$$p(y) \propto \exp\big(-\lambda\, y^T \Delta\, y / 2\big), \quad (3)$$

where $\Delta = D - W$, and $D$ is a diagonal matrix given by $D_{ii} = \sum_{j=1}^{n} w_{ij}$. The mutual information between $Y$ and $y_*$ is the expected decrease in entropy of $Y$ when $y_*$ is observed:

$$I(Y; y_*) = H(Y) - E\{H(Y \mid y_*)\} = \tfrac{1}{2}\log\big(1 + p(y_*)(1 - p(y_*))\, x_*^T H^{-1} x_*\big), \quad (4)$$

where $H = \nabla\nabla(-\log p(Y))$ is the Hessian matrix. The best sample to label is the one that maximizes $I(Y; y_*)$. The mutual information is largest when $p(y_*) \approx 0.5$, i.e., for the samples carrying the most information.

3.3 Using the User-Specified Labeled Samples to Get the Final Classification

After we actively choose the samples to label, we can run the semi-supervised classification method to get the final result. In the manifold regularization method of Belkin, the author minimizes the following cost function:

$$\min_{f \in \mathcal{H}_K} H[f] = \frac{1}{l}\sum_{i=1}^{l}(f(x_i) - y_i)^2 + \gamma_A \|f\|_K^2 + \gamma_I \sum_{i,j=1}^{n}(f(x_i) - f(x_j))^2 w_{ij}, \quad (5)$$

where $l$ is the number of labeled samples, $\gamma_A$ and $\gamma_I$ are regularization parameters, and $\|f\|_K^2$ is some form of constraint to ensure the smoothness of the learned function. Here, the $l$ samples in the above cost function are not chosen randomly or manually as in the original work of Belkin. Rather, they are chosen with the active query selection method discussed in Section 3.2.

4 Experimental Results

As we pointed out at the beginning of this paper, unsupervised learning cannot work well in case of noise, or in case two modes belonging to two different classes overlap. In these situations, semi-supervised learning with a few labeled samples can help. Using some synthetic data, we show that our active query selection method can choose the most informative samples to be labeled. Fig. 2(a) is a noisy case of Fig. 1(a), and without labeled samples, Laplacian Eigenmaps cannot find a satisfying classification (the yellow curve). Using our active query selection method, the algorithm chooses some samples to be labeled; these samples are shown in (b) in purple. After that, the user gives the class labels of these chosen samples (the red and blue samples in (c), each color representing a class); then, with these user-specified labeled samples, the manifold regularization method finds the more satisfying classification shown in (c) (the yellow curve).

Fig. 2. (a) Laplacian Eigenmaps cannot find a satisfying classification without labeled samples. (b) The samples automatically chosen to be labeled. (c) The manifold regularization results with the labeled samples.

5 Conclusions

A key problem of semi-supervised learning is choosing the most informative samples to be labeled at the very beginning of semi-supervised algorithms. Using mutual information, we give a solution to this problem. Our sample selection method can be applied to most of the existing semi-supervised learning methods, and in this paper, we combine it with manifold regularization to show how it works. We also run experiments on some synthetic data and obtain satisfying results. In future work, we will try this method in some real-world experiments. Another problem of semi-supervised learning is how many labeled samples are suitable; for example, should we choose five samples to label, or ten? In future work, we will consider this problem in the framework of the active query selection of this paper.

Acknowledgements. The research is supported by the National Natural Science Foundation of China, the Research Fund for the Doctoral Program of Higher Education of China, and the Co-Construction Project of Key Subject of Beijing.

References

1. A. Blum and T. Mitchell: Combining labeled and unlabeled data with co-training. In Proceedings of the 11th Annual Conference on Computational Learning Theory, Madison, WI, pp. 92–100 (1998).
2. A. Blum and S. Chawla: Learning from Labeled and Unlabeled Data using Graph Mincuts. ICML (2001).
3. M. Belkin and P. Niyogi: Laplacian Eigenmaps for Dimensionality Reduction and Data Representation. Neural Computation, June (2003).
4. M. Belkin, P. Niyogi, and V. Sindhwani: On Manifold Regularization. Technical Report, Department of Computer Science, University of Chicago (2004).
5. B. Krishnapuram, D. Williams, Ya Xue, A. Hartemink, L. Carin, and M. A. T. Figueiredo: On Semi-Supervised Classification. NIPS (2004).
6. D. Zhou, O. Bousquet, T. N. Lal, J. Weston, and B. Schoelkopf: Learning with Local and Global Consistency. NIPS (2003).
7. Y. Freund, H. S. Seung, E. Shamir, and N. Tishby: Selective sampling using the query by committee algorithm. Machine Learning, 28 (1997).
8. K. Nigam: Using Unlabeled Data to Improve Text Classification. PhD thesis, Carnegie Mellon University Computer Science Dept. (2001).
9. M. Szummer and T. Jaakkola: Partially labeled classification with Markov random walks. NIPS (2001).
10. S. Tong and D. Koller: Support vector machine active learning with applications to text classification. ICML (2000).
11. Xiaojin Zhu, J. Lafferty, and Z. Ghahramani: Combining active learning and semi-supervised learning using Gaussian fields and harmonic functions. ICML (2003).
12. Xiaojin Zhu, Z. Ghahramani, and J. Lafferty: Semi-supervised learning using Gaussian fields and harmonic functions. ICML (2003).
13. Xiaojin Zhu: Semi-Supervised Learning with Graphs. PhD thesis, Carnegie Mellon University Computer Science Dept. (2005).
14. Z.-H. Zhou and M. Li: Semi-supervised regression with co-training. International Joint Conference on Artificial Intelligence (2005).
15. Z.-H. Zhou and M. Li: Tri-training: exploiting unlabeled data using three classifiers. IEEE Trans. Knowledge and Data Engineering, 17, 1529–1541 (2005).
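As a closing illustration, the selection rule of Section 3.2 can be sketched numerically. This is a hedged sketch, not the authors' code: `select_query`, the small regularizer `eps`, and reading the per-node variance off the diagonal of the inverse regularized Laplacian (taking $x_*$ to be the node's indicator vector in Eq. 4) are assumptions made for the sketch.

```python
import numpy as np

def select_query(p, W, lam=1.0, eps=1e-6):
    """Pick the sample whose label would be most informative.

    p : initial soft labels from the unsupervised step (values in [0, 1]).
    W : symmetric edge-weight matrix of the neighborhood graph.
    The mutual-information score grows with p_i*(1 - p_i) and with the
    Gaussian field's variance at node i (cf. Eqs. 3-4).
    """
    n = len(p)
    Delta = np.diag(W.sum(axis=1)) - W        # graph Laplacian, Delta = D - W
    H = lam * Delta + eps * np.eye(n)         # Hessian of -log p(y), regularized
    var = np.diag(np.linalg.inv(H))           # per-node variance of the field
    mi = 0.5 * np.log(1.0 + p * (1.0 - p) * var)
    return int(np.argmax(mi))

# Toy 3-node chain: the middle node has the most uncertain initial label,
# so it should be the one queried.
W = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
p = np.array([0.05, 0.50, 0.95])              # initial labels from step 1
best = select_query(p, W)
```

Nodes whose initial label is already close to 0 or 1 contribute little, so the query lands on the node near $p \approx 0.5$, matching the observation at the end of Section 3.2.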
IJCSI Internatonal Journal of Computer Scence Issues, Volume 1, Issue 1, No, January 015 ISSN (rnt): 1694-0814 ISSN (Onlne): 1694-0784 www.ijcsi.org Keyword Reducton for Text Categorzaton usng Neghborhood
More informationErratum: A Generalized Path Integral Control Approach to Reinforcement Learning
Journal of Machne Learnng Research 00-9 Submtted /0; Publshed 7/ Erratum: A Generalzed Path Integral Control Approach to Renforcement Learnng Evangelos ATheodorou Jonas Buchl Stefan Schaal Department of
More informationWhich Separator? Spring 1
Whch Separator? 6.034 - Sprng 1 Whch Separator? Mamze the margn to closest ponts 6.034 - Sprng Whch Separator? Mamze the margn to closest ponts 6.034 - Sprng 3 Margn of a pont " # y (w $ + b) proportonal
More informationStudy of Selective Ensemble Learning Methods Based on Support Vector Machine
Avalable onlne at www.scencedrect.com Physcs Proceda 33 (2012 ) 1518 1525 2012 Internatonal Conference on Medcal Physcs and Bomedcal Engneerng Study of Selectve Ensemble Learnng Methods Based on Support
More informationLecture 10 Support Vector Machines. Oct
Lecture 10 Support Vector Machnes Oct - 20-2008 Lnear Separators Whch of the lnear separators s optmal? Concept of Margn Recall that n Perceptron, we learned that the convergence rate of the Perceptron
More informationCollege of Computer & Information Science Fall 2009 Northeastern University 20 October 2009
College of Computer & Informaton Scence Fall 2009 Northeastern Unversty 20 October 2009 CS7880: Algorthmc Power Tools Scrbe: Jan Wen and Laura Poplawsk Lecture Outlne: Prmal-dual schema Network Desgn:
More informationRBF Neural Network Model Training by Unscented Kalman Filter and Its Application in Mechanical Fault Diagnosis
Appled Mechancs and Materals Submtted: 24-6-2 ISSN: 662-7482, Vols. 62-65, pp 2383-2386 Accepted: 24-6- do:.428/www.scentfc.net/amm.62-65.2383 Onlne: 24-8- 24 rans ech Publcatons, Swtzerland RBF Neural
More informationAn Iterative Modified Kernel for Support Vector Regression
An Iteratve Modfed Kernel for Support Vector Regresson Fengqng Han, Zhengxa Wang, Mng Le and Zhxang Zhou School of Scence Chongqng Jaotong Unversty Chongqng Cty, Chna Abstract In order to mprove the performance
More informationMDL-Based Unsupervised Attribute Ranking
MDL-Based Unsupervsed Attrbute Rankng Zdravko Markov Computer Scence Department Central Connectcut State Unversty New Brtan, CT 06050, USA http://www.cs.ccsu.edu/~markov/ markovz@ccsu.edu MDL-Based Unsupervsed
More informationCryptanalysis of pairing-free certificateless authenticated key agreement protocol
Cryptanalyss of parng-free certfcateless authentcated key agreement protocol Zhan Zhu Chna Shp Development Desgn Center CSDDC Wuhan Chna Emal: zhuzhan0@gmal.com bstract: Recently He et al. [D. He J. Chen
More information2.3 Nilpotent endomorphisms
s a block dagonal matrx, wth A Mat dm U (C) In fact, we can assume that B = B 1 B k, wth B an ordered bass of U, and that A = [f U ] B, where f U : U U s the restrcton of f to U 40 23 Nlpotent endomorphsms
More informationLecture 12: Discrete Laplacian
Lecture 12: Dscrete Laplacan Scrbe: Tanye Lu Our goal s to come up wth a dscrete verson of Laplacan operator for trangulated surfaces, so that we can use t n practce to solve related problems We are mostly
More informationBuilding Maximum Entropy Text Classifier Using Semi-supervised Learning
Buldng Maxmum Entropy Text Classfer Usng Sem-supervsed Learnng Zhang, Xnhua For PhD Qualfyng Exam Term Paper 2005-7-4 School of Computng, NUS 1 Road map Introducton: bacground and applcaton Sem-supervsed
More informationLinear Approximation with Regularization and Moving Least Squares
Lnear Approxmaton wth Regularzaton and Movng Least Squares Igor Grešovn May 007 Revson 4.6 (Revson : March 004). 5 4 3 0.5 3 3.5 4 Contents: Lnear Fttng...4. Weghted Least Squares n Functon Approxmaton...
More informationClassification as a Regression Problem
Target varable y C C, C,, ; Classfcaton as a Regresson Problem { }, 3 L C K To treat classfcaton as a regresson problem we should transform the target y nto numercal values; The choce of numercal class
More informationAutomatic Object Trajectory- Based Motion Recognition Using Gaussian Mixture Models
Automatc Object Trajectory- Based Moton Recognton Usng Gaussan Mxture Models Fasal I. Bashr, Ashfaq A. Khokhar, Dan Schonfeld Electrcal and Computer Engneerng, Unversty of Illnos at Chcago. Chcago, IL,
More informationTransient Stability Assessment of Power System Based on Support Vector Machine
ransent Stablty Assessment of Power System Based on Support Vector Machne Shengyong Ye Yongkang Zheng Qngquan Qan School of Electrcal Engneerng, Southwest Jaotong Unversty, Chengdu 610031, P. R. Chna Abstract
More informationThe lower and upper bounds on Perron root of nonnegative irreducible matrices
Journal of Computatonal Appled Mathematcs 217 (2008) 259 267 wwwelsevercom/locate/cam The lower upper bounds on Perron root of nonnegatve rreducble matrces Guang-Xn Huang a,, Feng Yn b,keguo a a College
More informationNonlinear Classifiers II
Nonlnear Classfers II Nonlnear Classfers: Introducton Classfers Supervsed Classfers Lnear Classfers Perceptron Least Squares Methods Lnear Support Vector Machne Nonlnear Classfers Part I: Mult Layer Neural
More informationNatural Images, Gaussian Mixtures and Dead Leaves Supplementary Material
Natural Images, Gaussan Mxtures and Dead Leaves Supplementary Materal Danel Zoran Interdscplnary Center for Neural Computaton Hebrew Unversty of Jerusalem Israel http://www.cs.huj.ac.l/ danez Yar Wess
More informationLINEAR REGRESSION ANALYSIS. MODULE IX Lecture Multicollinearity
LINEAR REGRESSION ANALYSIS MODULE IX Lecture - 30 Multcollnearty Dr. Shalabh Department of Mathematcs and Statstcs Indan Insttute of Technology Kanpur 2 Remedes for multcollnearty Varous technques have
More informationOrientation Model of Elite Education and Mass Education
Proceedngs of the 8th Internatonal Conference on Innovaton & Management 723 Orentaton Model of Elte Educaton and Mass Educaton Ye Peng Huanggang Normal Unversty, Huanggang, P.R.Chna, 438 (E-mal: yepeng@hgnc.edu.cn)
More information