An Iterative Modified Kernel for Support Vector Regression


Fengqing Han, Zhengxia Wang, Ming Lei and Zhixiang Zhou
School of Science, Chongqing Jiaotong University, Chongqing City, China

Abstract—In order to improve the performance of support vector regression, a new method for modifying the kernel function is proposed. In this method the information of the whole sample set is included in the kernel function through a conformal mapping, so the kernel function is data-dependent. Starting from random initial kernel parameters, the kernel is modified iteratively until a satisfactory effect is reached. Compared with the conventional model, the improved approach does not require careful selection of the kernel parameters. Simulation results show that the improved approach has better learning ability and forecasting precision than the traditional model.

Keywords—support vector regression, data-dependent kernel, iteration

I. INTRODUCTION

In recent years a new pattern classification algorithm called the Support Vector Machine (SVM), proposed by Vapnik [1], has become a powerful methodology for solving a wide variety of problems in nonlinear classification, function estimation, and density estimation. Unlike traditional neural network models, which minimize the empirical training error, the SVM implements the structural risk minimization principle, which seeks to minimize both the training error and a confidence interval term. This results in good generalization. Because of its good properties, such as a globally optimal solution and good learning ability on small samples, the SVM has received increasing attention in recent years [2-4]. Moreover, it has been successfully applied to support vector regression (SVR), especially for nonlinear time series [5-7].

The performance of the SVM largely depends on the kernel. Smola [8] elucidated the relation between the SVM kernel method and standard regularization theory. However, there is no theory concerning how to choose a good kernel function in a data-dependent way. In [9], the authors propose a method of modifying a kernel to improve the performance of a support vector machine classifier. It is based on the Riemannian geometrical structure induced by the kernel function. In order to increase the separability, the idea is to enlarge the spatial resolution around the separating boundary surface with a conformal mapping. In this method, a primary kernel is used to obtain the support vectors; then the kernel is modified conformally in a data-dependent way using the information of those support vectors, and the final classifier is trained with the modified kernel. Inspired by this idea, Liang Yan-chun [10] applied the modified kernel to SVR to forecast financial time series. These methods achieve better performance than the conventional SVM, but the parameters of the modified kernel must be chosen carefully.

The goal of this paper is to propose a new method to train the SVM which modifies the kernel function repeatedly. Unlike traditional methods, which select the kernel parameters carefully, the parameters of this algorithm are set randomly. Simulation experiments show that the new method is clearly superior to the traditional SVM in prediction precision.

The remainder of this paper is organized as follows. Section 2 briefly introduces the learning of SVR. Section 3 provides a background to the data-dependent kernel. Our method for training SVR by iterative learning is introduced in Section 4. Section 5 presents some computational results to show the effectiveness of our method. Section 6 concludes our work.

II. SUPPORT VECTOR REGRESSION

Let $\{(x_i, y_i)\}$, $i = 1, 2, \ldots, m$, be a given set of training data, where $(x_i, y_i) \in R^n \times R$.
The output of the SVR is

$f(x) = \langle w, \phi(x) \rangle + b$,  (1)

where $w$ is the weight vector, $b$ the bias, and $\phi(x)$ the nonlinear mapping from the input space $S$ to the high-dimensional feature space $F$; $\langle \cdot, \cdot \rangle$ denotes the inner product. The commonly used $\varepsilon$-insensitive loss function introduced by Vapnik is

$L_\varepsilon(x) = \begin{cases} |f(x) - y| - \varepsilon, & |f(x) - y| \ge \varepsilon \\ 0, & |f(x) - y| < \varepsilon \end{cases}$  (2)

In order to train $w$ and $b$, the following function is minimized:

$\min\ Z = \tfrac{1}{2}\|w\|^2 + C \sum_{i=1}^{m} (\xi_i + \xi_i^*)$
s.t. $y_i - \langle w, \phi(x_i) \rangle - b \le \varepsilon + \xi_i$,
$-y_i + \langle w, \phi(x_i) \rangle + b \le \varepsilon + \xi_i^*$,
$\xi_i, \xi_i^* \ge 0$, $i = 1, 2, \ldots, m$,  (3)

where $C$ is the regularization constant determining the trade-off between the empirical error and the regularization term. After the introduction of positive slack variables and Lagrange multipliers, (3) is equivalent to a standard quadratic programming (QP) problem, which can be solved with a QP solver. After (3) is optimized, (1) can be rewritten as

$f(x) = \sum_{x_i \in SV} (\alpha_i - \alpha_i^*) k(x_i, x) + b$,  (4)

where $\alpha_i$ and $\alpha_i^*$ are the optimal solution of the QP problem and $k(x, x_i)$ is the kernel function

$k(x, x_i) = \langle \phi(x), \phi(x_i) \rangle$.  (5)

It should be pointed out that not every function can be taken as a kernel function in the SVR; it has been proved that the kernel function must satisfy the conditions of Mercer's theorem. The kernel function plays an important role in the SVR and has a great effect on the prediction precision. In the next section the kernel function is modified by a conformal mapping, which makes it data-dependent.

III. DATA-DEPENDENT KERNEL

In the traditional SVM and SVR there is no theory concerning how to choose the kernel function in a data-dependent way, while some time series, such as financial time series and weather data, are inherently noisy, non-stationary and deterministically chaotic. In order to improve the forecasting precision, it is necessary to redefine the kernel function from the training data. In this section, a background to the data-dependent kernel is provided.

The kernel function is modified in a data-dependent way based on information geometry [9-12]. From the point of view of geometry, the nonlinear mapping defines an embedding of the input space $S$ into the feature space $F$ as a curved submanifold. Generally, $F$ is a reproducing kernel Hilbert space (RKHS), which is a subspace of a Hilbert space. So a Riemannian metric [9] can be induced in the input space, and it can be expressed in closed form in terms of the kernel:

$g_{ij}(x) = \langle \partial_i \phi(x), \partial_j \phi(x) \rangle = \left. \frac{\partial^2 k(x, x')}{\partial x_i \partial x'_j} \right|_{x'=x}$.  (6)

The volume form in the feature space is defined as

$dV = \sqrt{g(x)}\, dx_1 dx_2 \cdots dx_n$,  (7)

where $g(x) = \det g_{ij}(x)$ and $x_1, x_2, \ldots, x_n$ are the components of $x$; the factor $\sqrt{g(x)}$ represents how a local area is magnified in $F$ under the mapping $\phi(x)$ [9]. After a conformal mapping is introduced, the new kernel function is defined as

$\tilde{k}(x, x') = c(x)\, c(x')\, k(x, x')$,  (8)

where $c(x)$ is a positive conformal mapping function. It is easy to see that the new kernel function $\tilde{k}(x, x')$ satisfies the conditions of Mercer's theorem. The conformal mapping is taken as [9]

$c(x) = \sum_{x_i \in I} h \exp\!\left(-\|x - x_i\|^2 / (2\tau^2)\right)$,  (9)

where $I$ is the set of support vectors [9]. For the new kernel function $\tilde{k}(x, x')$, the Riemannian metric can be expressed as

$\tilde{g}_{ij}(x) = \frac{\partial c(x)}{\partial x_i} \frac{\partial c(x)}{\partial x_j} + c(x)^2 g_{ij}(x)$.  (10)

One of the typical kernel functions is the Gaussian RBF kernel

$k(x, x') = \exp\!\left(-\|x - x'\|^2 / (2\sigma^2)\right)$.  (11)

In this case we have

$g_{ij}(x) = \delta_{ij} / \sigma^2$,  (12)

where $\delta_{ij} = 1$ if $i = j$ and $0$ otherwise. In the neighborhood of a support vector $x_i$ we have

$\sqrt{\tilde{g}(x)} = (h^n/\sigma^n) \exp\!\left(-n r^2/(2\tau^2)\right) \sqrt{1 + \sigma^2 r^2/\tau^4}$,  (13)

where $r = \|x - x_i\|$ is the Euclidean distance between $x$ and $x_i$.
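To make the construction above concrete, the following is a minimal sketch (not the authors' code) of the conformally modified Gaussian kernel of (8), (9) and (11), written with NumPy. The helper names rbf_gram, conformal_factor and modified_gram, and the use of a single scalar coefficient h, are illustrative assumptions.

```python
# A minimal sketch of the data-dependent kernel of (8)-(9) with the Gaussian
# RBF kernel (11). Function names are illustrative, not from the paper.
import numpy as np

def rbf_gram(X1, X2, sigma):
    """Gaussian RBF kernel k(x, x') = exp(-||x - x'||^2 / (2 sigma^2)), eq. (11)."""
    d2 = np.sum(X1**2, axis=1)[:, None] + np.sum(X2**2, axis=1)[None, :] - 2.0 * X1 @ X2.T
    return np.exp(-d2 / (2.0 * sigma**2))

def conformal_factor(X, centers, h, tau):
    """Conformal mapping c(x) = sum_i h exp(-||x - x_i||^2 / (2 tau^2)), eq. (9)."""
    d2 = np.sum(X**2, axis=1)[:, None] + np.sum(centers**2, axis=1)[None, :] - 2.0 * X @ centers.T
    return h * np.exp(-d2 / (2.0 * tau**2)).sum(axis=1)

def modified_gram(X1, X2, centers, sigma, h, tau):
    """Modified kernel k~(x, x') = c(x) c(x') k(x, x'), eq. (8)."""
    c1 = conformal_factor(X1, centers, h, tau)
    c2 = conformal_factor(X2, centers, h, tau)
    return c1[:, None] * rbf_gram(X1, X2, sigma) * c2[None, :]
```

In the classification setting of [9] the centers would be the support vectors found by a first training pass; in the iterative method of Section 4 they are instead the whole training set.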

IV. ITERATIVE MODIFIED ALGORITHM

In order to improve the performance of the SVM classifier, in [9] the kernel function is modified such that the factor $\sqrt{g(x)}$ is enlarged around the support vectors. In SVR, however, the kernel function should be modified such that $\sqrt{g(x)}$ is compressed near the support vectors. In order to make sure the magnification is compressed, let

$c(x) = \sum_{x_i \in I} h \exp\!\left(-\|x - x_i\|^2/(2\tau^2)\right)$,  (14)

$k_s(x, x') = c^s(x)\, c^s(x')\, k(x, x')$,  (15)

$g_{s,ij}(x) = \left. \frac{\partial^2 k_s(x, x')}{\partial x_i \partial x'_j} \right|_{x'=x}$,  (16)

where $s$ is a positive integer and $I$ is now the whole training data set. The summation in (14) runs over all the training data so that the training procedure is simplified. In this paper the RBF kernel is adopted. For a point $x$ near the training point $x_i$ we have

$\sqrt{g_s(x)} = \sqrt{\det g_{s,ij}(x)} = (h^{ns}/\sigma^n) \exp\!\left(-n s r^2/(2\tau^2)\right) \sqrt{1 + s^2 \sigma^2 r^2/\tau^4}$.  (17)

It is easy to check the following conclusions:
1. When $h < \min\{\sigma, 1\}$, $\sqrt{g_s(x)}$ is compressed near the training points $x_i$ for every $s = 1, 2, \ldots$.
2. When $1 \le h < e$, $\sqrt{g_s(x)}$ is compressed near the training points $x_i$ for $s > M$, with $M$ sufficiently large.
3. $\sqrt{g_s(x)}$ is compressed for any $x$ which is far away from every training point $x_i$.

In summary, the training process of the new method is as follows (a code sketch of this loop is given at the end of Section 5.A).

Step 1. Give random values to $\sigma, \varepsilon, C, h, \tau$ and the error tolerance $error$;
Step 2. Let $k_0(x, x') = k(x, x')$, $j = 0$;
Step 3. Train the SVR with $k_j(x, x')$ and compute the figure of merit $\delta = \sum (y_i - f(x_i))^2 / \sum (y_i - \bar{y})^2$;
Step 4. If $\delta > error$, then set $k_{j+1}(x, x') = c(x)\, c(x')\, k_j(x, x')$, $j = j + 1$, and return to Step 3; else exit.

After the procedure finishes, the modified kernel is

$k_s(x, x') = c^s(x)\, c^s(x')\, k(x, x')$,  (18)

where $s$ is the final iteration number.

V. SIMULATION

In this section we perform some simulations to evaluate the performance of this method. We compare the iterative modified SVR and the traditional SVR on the same examples, with the same training parameters in the optimization problem (3). To assess the approximation results and the generalization ability, a figure of merit is defined as

$\delta = \sum (y_i - f(x_i))^2 / \sum (y_i - \bar{y})^2$,  (19)

where $\{(x_i, y_i),\ i = 1, 2, \ldots, m\}$ is the test set and $\bar{y} = \sum y_i / m$.

A. Approximation of a One-dimensional Function

For comparison we show the results with the iterative modified SVR and with the traditional SVR. The solid lines represent the original function and the dashed lines show the approximations. Fig. 1 shows the results for the approximation of the function $f(x) = \sin x + (\sin 3x)/3 - \sin(x/2)$, $x \in [0, 2\pi]$. Approximation results for three different values of $h$ and $\tau$ are given in Table I, where the parameters of the traditional SVR are $\sigma = 0.1$, $\varepsilon = 0.1$, $C = 0.05$, and its testing error $\delta$ is listed in Table I. The figure of merit for each approximation is computed on a uniformly sampled test set of 16 points. The input $x$ is drawn from the uniform distribution on the interval $[0, 2\pi]$; the training and test data consist of 63 points and 16 points respectively. Fig. 2 shows another approximation of the same function, where the parameters of the traditional SVR are $\sigma = 0.1$, $\varepsilon = 0.1$, $C = 0.1$, with the testing error again listed in Table I. This shows that the modified method possesses better generalization performance than the traditional SVR.
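Returning to the algorithm of Section 4, the following is a minimal sketch of the training loop of Steps 1–4, under stated assumptions: it reuses rbf_gram and conformal_factor from the sketch in Section 3, trains with scikit-learn's SVR on a precomputed Gram matrix, and evaluates the figure of merit (19) on a held-out set (the paper does not state on which set $\delta$ in Step 3 is computed). The helper names and the iteration cap max_iter are illustrative, not the authors' implementation.

```python
# Sketch of the iterative kernel modification (Steps 1-4); assumes rbf_gram and
# conformal_factor from the previous sketch are in scope.
import numpy as np
from sklearn.svm import SVR

def figure_of_merit(y_true, y_pred):
    """delta = sum (y_i - f(x_i))^2 / sum (y_i - y_bar)^2, eq. (19)."""
    return np.sum((y_true - y_pred)**2) / np.sum((y_true - np.mean(y_true))**2)

def train_iterative_svr(X, y, X_val, y_val, sigma, eps, C, h, tau,
                        error=0.05, max_iter=20):
    K = rbf_gram(X, X, sigma)                  # Step 2: k_0 = k
    K_val = rbf_gram(X_val, X, sigma)
    c_tr = conformal_factor(X, X, h, tau)      # c(x) built from the whole training set
    c_val = conformal_factor(X_val, X, h, tau)
    model, delta = None, np.inf
    for _ in range(max_iter):                  # Steps 3-4
        model = SVR(kernel="precomputed", C=C, epsilon=eps).fit(K, y)
        delta = figure_of_merit(y_val, model.predict(K_val))
        if delta <= error:                     # satisfactory effect reached
            break
        # k_{j+1}(x, x') = c(x) c(x') k_j(x, x')
        K = c_tr[:, None] * K * c_tr[None, :]
        K_val = c_val[:, None] * K_val * c_tr[None, :]
    return model, delta
```

Because the Gram matrices are multiplied by the conformal factors in place, after $s$ passes the model is effectively trained with the kernel $k_s$ of (18).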

B. Approximation of Multi-dimensional Functions

Table II shows the approximation results for an earthquake wave, which is treated as a time series. The training data set is $\{(x_i, y_i) \in R^3 \times R,\ i = 1, 2, \ldots, 50\}$, a historical lag of order 3 (a sketch of this lag construction is given after Table II below). The figure of merit for each approximation is computed on a uniformly sampled test set of 50 points. Fig. 3 illustrates part of the original data and its approximation by the traditional SVR and by the modified SVR. From the comparison we can see that the modified method possesses better generalization performance than the traditional SVR.

TABLE I. APPROXIMATION RESULTS AND PARAMETERS OF THE SINGLE-VARIABLE FUNCTION
Columns: parameters and testing error of the traditional SVR; parameters of the modified SVR; testing errors of the first, second and tenth modified SVR.
Row group 1 (Fig. 1): traditional SVR with $\sigma = 0.1$, $\varepsilon = 0.1$, $C = 0.05$, $\delta =$ ; modified SVR with $h = 0.1$, $\tau =$ ; $h = 0.1$, $\tau =$ ; $h = 0.08$, $\tau =$ .
Row group 2 (Fig. 2): traditional SVR with $\sigma = 0.1$, $\varepsilon = 0.1$, $C = 0.1$, $\delta =$ ; modified SVR with $h = 0.8$, $\tau =$ .

a. The approximation result with the traditional SVR.
b. The approximation results with the first and second modified SVR.

c. The approximation result with the tenth modified SVR.

Figure 1. Approximation results for the one-dimensional function.

Figure 2. Approximation results with the traditional and the first modified SVR.

TABLE II. APPROXIMATION RESULTS AND PARAMETERS OF THE MULTI-DIMENSIONAL FUNCTION
Columns: parameters of the traditional SVR; parameters of the modified SVR; testing errors of the traditional, first modified and second modified SVR.
Parameters of the traditional SVR: $\sigma = 0.04$, $\varepsilon = 0.01$, $C =$ . Parameters of the modified SVR: $h = 4$, $\tau =$ .
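As referenced in Section 5.B, the following is an illustrative sketch of turning a scalar series into the lag-3 regression data set $\{(x_i, y_i) \in R^3 \times R\}$. The exact lag convention is an assumption, since the paper only states "a historical lag with order 3"; the helper name lag_embed is hypothetical.

```python
# Illustrative lag embedding: x_i = (z_i, z_{i+1}, z_{i+2}), y_i = z_{i+3}.
import numpy as np

def lag_embed(series, order=3):
    z = np.asarray(series, dtype=float)
    X = np.column_stack([z[k:len(z) - order + k] for k in range(order)])
    y = z[order:]
    return X, y
```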

Figure 3. Approximation results for the multi-dimensional function.

VI. CONCLUSION

In this paper we present an iterative modified kernel to improve the performance of SVR. It is based on the conformal mapping of information geometry, which results in a data-dependent kernel. The whole training data set is used in the construction of the conformal mapping instead of only the support vectors. The idea is to compress the spatial resolution so that the forecasting performance is improved. Importantly, the kernel is modified by iteration and the kernel parameters can be set to random values. Simulation results show the effectiveness and generalization ability of this method.

ACKNOWLEDGMENT

This work is supported by NSF Grant # , the Natural Science Foundation Project of CQ CSTC 2007BB396, and KJ .

REFERENCES

[1] V. Vapnik, The Nature of Statistical Learning Theory, translated by Zhang Xue-gong. Beijing: Tsinghua University Press, 2000.
[2] B. Scholkopf, K. Sung, C. Burges, "Comparing support vector machines with Gaussian kernels to radial basis function classifiers," IEEE Trans. Signal Processing, 1997, vol. 45.
[3] F. Perez-Cruz, A. Navia-Vazquez, A. R. Figueiras-Vidal, "Empirical risk minimization for support vector classifiers," IEEE Trans. on Neural Networks, 2003, vol. 14.
[4] P. N. Belhumeur, J. P. Hespanha, D. J. Kriegman, "Eigenfaces vs. Fisherfaces: recognition using class specific linear projection," IEEE Trans. on Pattern Analysis and Machine Intelligence, 1997, vol. 19.
[5] Cao Li-juan, Tay Francis E. H., "Financial forecasting using support vector machines," Neural Computing & Applications, 2001, vol. 10.
[6] Tay Francis E. H., Cao Li-juan, "ε-descending support vector machines for financial time series forecasting," Neural Processing Letters, 2002, vol. 15.
[7] Yang Hai-qin, Chan Lai-wan, King Irwin, "Support vector machine regression for volatile stock market prediction," Proceedings of Intelligent Data Engineering and Automated Learning. Berlin: Springer-Verlag, 2002.
[8] A. J. Smola, B. Schölkopf, K.-R. Müller, "The connection between regularization operators and support vector kernels," Neural Networks, 1998, vol. 11.
[9] S. Amari, S. Wu, "Improving support vector machine classifiers by modifying kernel functions," Neural Networks, 1999, vol. 12.
[10] Liang Yan-chun, Sun Yan-feng, "An improved method of support vector machine and its applications to financial time series forecasting," Progress in Natural Science, 2003, vol. 13.
[11] C. Campbell, "Kernel methods: a survey of current techniques," Neurocomputing, 2002, vol. 48.
[12] S. Amari, S. Wu, "An information-geometrical method for improving the performance of support vector machine classifiers," Proceedings of the International Conference on Artificial Neural Networks 99. London: IEE Conference Publication No. 470, 1999.
