Robust Feature Induction for Support Vector Machines


Robust Feature Induction for Support Vector Machines

Rong Jin, Department of Computer Science and Engineering, Michigan State University, East Lansing, MI 48824
Huan Liu, Department of Computer Science and Engineering, Arizona State University, Tempe, AZ

Abstract

The goal of feature induction is to automatically create nonlinear combinations of existing features as additional input features to improve classification accuracy. Typically, nonlinear features are introduced into a support vector machine (SVM) through a nonlinear kernel function. One disadvantage of such an approach is that the feature space induced by a kernel function is usually of high dimension and therefore substantially increases the chance of over-fitting the training data. Another disadvantage is that nonlinear features are induced implicitly, which makes it difficult to tell which induced features are critical to the classification performance. In this paper, we propose a boosting-style algorithm that explicitly induces important nonlinear features for SVMs. We present empirical studies with discussion to show that this approach is effective in improving classification accuracy for SVMs. The comparison with an SVM model using nonlinear kernels also indicates that this approach is effective and robust, particularly when the number of training examples is small.

Appearing in Proceedings of the 21st International Conference on Machine Learning, Banff, Canada, 2004. Copyright 2004 by the authors.

1. Introduction

For the last couple of years, support vector machines (SVMs) have been successfully applied to many applications, such as text classification (Joachims, 2001), image classification (Teytaud & Sarrut, 2001), and face recognition (Phillips, 1998). Compared to other learning machines, SVMs have two advantages: 1) controlling model complexity through a regularization term, and 2) inducing nonlinear features through a nonlinear kernel function. The first advantage makes SVMs more robust to noise and less likely to over-fit the training data. The second advantage increases the expressive power of SVMs and makes them suitable for complicated classification tasks. Many empirical studies have shown that an appropriate choice of nonlinear kernel can significantly improve classification accuracy (Teytaud & Sarrut, 2001). However, there are also some shortcomings of using kernel-induced features for an SVM model:

1) Non-sparse weights. The feature space induced by a nonlinear kernel function is usually of large dimensionality and can be infinite in some cases. When the data can be separated with a few nonlinear features, a kernel-induced feature scheme may have to assign non-zero weights to many training examples in order to concentrate on those important induced features. This sparseness in the induced feature space can lead to a non-sparse weight distribution over the training examples.

2) Inappropriate regularization. In practice, data normalization (i.e., subtracting the mean and dividing by the standard deviation) usually improves the performance of an SVM. This is because an SVM uses the l2 norm of the vector w (i.e., ||w||^2) as its penalty term, in which the weights of different features are added together on the same scale. Thus, the penalty term in an SVM requires the values of different input features to be comparable. However, it is rather difficult to normalize kernel-induced nonlinear features because they usually do not have explicit expressions. As a result, high-order nonlinear features may suffer more from the penalty term ||w||^2, since their values are rather small and need large weights to have a noticeable impact on decision boundaries.

3) Implicit features. It is usually difficult to derive explicit expressions for kernel-induced features. This makes it hard to infer critical nonlinear features from SVM models learned with nonlinear kernel functions, and it increases the difficulty for users to comprehend the learned results.

In this paper, we propose a feature induction algorithm that explicitly introduces nonlinear features into an SVM to boost its classification performance. In particular, this new algorithm addresses the above three shortcomings of kernel-induced features. The explicit introduction of nonlinear features allows the proposed algorithm to concentrate on the subspace of important features. Data normalization can be performed on the induced features in the same fashion as on the existing features. It is also easier to interpret an SVM model using explicitly induced features than an SVM model whose features are implicitly induced by kernel functions. The proposed algorithm works as follows: at each iteration, the algorithm examines the characteristics of the misclassified training examples and generates a new feature aimed at correcting the mistakes. The algorithm is similar to AdaBoost (Schapire & Singer, 1999) in that both boost the classification performance by utilizing the misclassified training examples. The most important distinction between them is that the proposed algorithm creates new features while the AdaBoost algorithm generates new instances of weak classifiers. If we view each instance of a weak classifier as a feature generation function, the AdaBoost algorithm may also be treated as a feature induction method. However, the proposed algorithm differs from AdaBoost in the following three important aspects: 1) The proposed algorithm only needs to identify the important nonlinear features, while the AdaBoost algorithm has to create new weak classifiers as well as compute the optimal weights that combine multiple weak classifiers. 2) The proposed algorithm can take advantage of the regularization mechanism within an SVM model to avoid the over-fitting problem. In contrast, AdaBoost is a greedy algorithm, and empirical studies have shown that AdaBoost tends to over-fit the training examples when the data is noisy (Opitz & Maclin, 1999; Ratsch et al., 2000; Grove & Schuurmans, 1998; Dietterich, 2000; Jin et al., 2003). 3) Unlike AdaBoost, where the induced weak classifiers are linearly combined, the new features induced by the proposed algorithm can be combined nonlinearly through kernel functions.

The rest of the paper is arranged as follows: Section 2 discusses the related work; Section 3 describes the details of the proposed feature induction algorithm for SVMs; Section 4 presents an empirical study of the proposed algorithm; and Section 5 concludes this work.

2. Related Work

We will first discuss previous work on feature induction, then review the SVM model and the related kernel method, because this work targets the feature induction problem for SVMs, and finally describe the core idea of the AdaBoost algorithm, since the proposed algorithm utilizes the idea of boosting.

Previous work on feature induction. Della Pietra et al. (1997) proposed an efficient algorithm to search for features that can effectively increase the likelihood of the training data (i.e., the sum over the training set of log p(y_i | x_i)). This idea was applied to the whole-sentence language model (Rosenfeld et al., 1999), in which a sentence is summarized into a set of features and described by a conditional exponential model. It was later extended to the conditional random field (McCallum, 2003) to allow for more tuning of the algorithm in order to further improve the learning efficiency. Similarly, the proposed feature induction algorithm searches for features that can effectively decrease the objective function of an SVM. The difficulty of inducing effective nonlinear features for SVMs arises from the fact that SVMs solve a constrained optimization problem in which the features only appear in the constraints. As a result, changes in the features cannot directly influence the objective function of an SVM model. This is in contrast to feature induction for a conditional exponential model, in which the objective function is expressed explicitly in terms of the input features.
Rennie & Jaakkola's work (2003) also involved feature induction for classifying spam emails. They propose a compression algorithm for text categorization and introduce effective features to achieve the maximum compression rate. It differs from the aforementioned work on feature induction in that it does not have an analytic objective function that is related to the features. Therefore, the search for new features is rather heuristic and ad hoc.

Support Vector Machines. Here we only review the SVM model for binary classification (Burges, 1998). Let D denote the training data, D = {<x_i, y_i>}_{i=1}^{N}. Each pair in the training data consists of an input vector x_i and an output label y_i (either +1 or -1). An SVM model with the linear kernel is written as the following optimization problem:

    min_{w, b, ξ_1, ..., ξ_N}  ||w||^2 + c Σ_{i=1}^{N} ξ_i
    subject to  y_i(x_i · w - b) >= 1 - ξ_i,  ξ_i >= 0,  i ∈ [1...N]        (1)

where c is a constant that controls the amount of admissible errors.
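As a concrete, hedged illustration (not part of the original paper), the linear SVM in Equation (1) can be fit with an off-the-shelf solver. The minimal sketch below assumes scikit-learn is available; its parameter C plays the role of the constant c above, and the toy data is purely for demonstration.

# Minimal sketch (illustration only): fitting the linear SVM of Eq. (1) with scikit-learn.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))                        # toy input vectors x_i
y = np.where(X[:, 0] + 0.5 * X[:, 1] > 0, 1, -1)     # toy labels y_i in {+1, -1}

svm = SVC(kernel="linear", C=1.0)                    # C corresponds to the constant c
svm.fit(X, y)

w = svm.coef_.ravel()                                # weight vector w
b = -svm.intercept_[0]                               # sklearn uses x.w + intercept; the paper uses x.w - b
margins = y * (X @ w - b)                            # y_i (x_i . w - b)

The margins computed in the last line are exactly the quantities that appear in the constraints of Equation (1) and that are reused throughout Section 3.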

The dual form of the above optimization problem is:

    max_{α_1, ..., α_N}  Σ_{i=1}^{N} α_i - (1/2) Σ_{i=1}^{N} Σ_{j=1}^{N} α_i α_j y_i y_j (x_i · x_j)        (2)
    subject to  0 <= α_i <= c,  i ∈ [1...N],  Σ_{i=1}^{N} α_i y_i = 0

By replacing the dot product x_i · x_j in the above equation with a kernel function K(x_i, x_j), we can implicitly introduce nonlinear features into the SVM model. As indicated by Equation (1), the input features do not appear in the objective function. Instead, they show up in the constraints y_i(x_i · w - b) >= 1 - ξ_i and can only influence the objective function through the constraints. This indirectness makes the induction of effective features rather difficult. To directly relate the input features to the objective function, we can rewrite the optimization problem in Equation (1) in the following form:

    min_{w, b}  ||w||^2 + c Σ_{i=1}^{N} max(0, 1 - y_i(x_i · w - b))        (3)

We can observe in the above equation that the constraints are removed by replacing each slack variable ξ_i with the term max(0, 1 - y_i(x_i · w - b)) in the objective function. Optimizing Equation (3) is still difficult because it involves the discontinuous function 'max'. To solve this problem, we follow the idea in (Zhang et al., 2003) and approximate the function max(0, 1 - y_i(x_i · w - b)) with the log-linear function (1/r) log{1 + exp(r [1 - y_i(x_i · w - b)])}, where r > 0 is a constant. It is proved by Zhang et al. (2003) that this log-linear function forms an upper bound on the function max(0, 1 - y_i(x_i · w - b)) and converges uniformly to the max function when r → +∞. The proposed feature induction algorithm is based on this particular approximate form of an SVM model.

AdaBoost. As mentioned earlier, the proposed feature induction algorithm is similar to AdaBoost in that both are iterative algorithms and create a new entity at each iteration to correct the mistakes made by the current model. The main idea of AdaBoost is to introduce a new instance of a weak classifier at each step to greedily reduce the classification error. Specifically, let H_0(x) be the classification function of the current iteration; the goal is to generate a new weak classifier h(x) such that the combination of H_0(x) with h(x), i.e., H(x) = H_0(x) + α h(x), has a smaller classification error. In AdaBoost, a weak classifier is trained over weighted examples, with a weight exp(-H_0(x) y) for each example <x, y>. The final classifier H_f(x) is the linear combination of all the weak classifiers generated in every iteration, i.e., H_f(x) = Σ_t α_t h_t(x). If each weak classifier h_t(x) is viewed as a feature function, the final classifier H_f(x) is simply a linear model over the induced features {h_t(x)}. In this sense, we can treat AdaBoost as a special feature induction algorithm. However, unlike other feature induction algorithms that are only responsible for generating new features, AdaBoost needs not only to generate new features (i.e., weak classifiers) but also to learn the linear model (i.e., the combination weights α_t). Thus, feature induction algorithms have an advantage over AdaBoost in that they can rely on the classification model to appropriately combine the induced features with the existing features. In particular, a feature induction algorithm for SVMs can benefit greatly from the nice properties of SVMs: 1) Unlike AdaBoost, which may over-fit noisy data, the feature induction algorithm for SVMs can avoid the over-fitting problem because of the regularization mechanism inside the SVM model; 2) Unlike AdaBoost, where the induced weak classifiers are linearly combined, the features induced for the SVM model are able to interact with each other nonlinearly through a kernel function.

3. Feature Induction Algorithm for SVMs

As discussed in Section 2, in order to find effective features for an SVM, we use the special form of the SVM in Equation (3) and approximate the function max(0, 1 - y_i(x_i · w - b)) with the log-linear function (1/r) log{1 + exp(r [1 - y_i(x_i · w - b)])}.
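As a quick numerical check of this approximation (an illustration added here, not taken from the paper), the sketch below compares the hinge term max(0, 1 - m) with the bound (1/r) log(1 + exp(r(1 - m))) for several values of r, where m stands for the margin y_i(x_i · w - b):

# Illustration only: the log-linear function upper-bounds the hinge loss
# and approaches it as the constant r grows (the largest gap is log(2)/r, at m = 1).
import numpy as np

def hinge(m):
    return np.maximum(0.0, 1.0 - m)

def log_linear(m, r):
    # (1/r) * log(1 + exp(r * (1 - m))), computed stably with logaddexp
    return np.logaddexp(0.0, r * (1.0 - m)) / r

m = np.linspace(-2.0, 3.0, 501)              # a range of margins y_i (x_i . w - b)
for r in (1.0, 5.0, 50.0):
    gap = log_linear(m, r) - hinge(m)
    print(f"r = {r:5.1f}   largest gap to the hinge loss = {gap.max():.4f}")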
Putting these together, the objective function of the original SVM model is approximated as follows:

    l_SVM = ||w||^2 / c + Σ_{i=1}^{N} max(0, 1 - y_i(x_i · w - b))        (4)

    l_app = Σ_{i=1}^{N} (1/r) log{1 + exp(r [1 - y_i(x_i · w - b)])}        (5)

Note that the regularization term ||w||^2 / c is removed from the approximated objective function l_app. This is because

    (1/r) log{1 + exp(r [1 - y_i(x_i · w - b)])}  >=  max(0, 1 - y_i(x_i · w - b))

where the equality holds only when r → +∞. Therefore, we can always choose a constant r > 0 to make the approximated objective function l_app equal to the true objective l_SVM.

In effect, the constant r plays a role similar to the regularization term, and an appropriate choice of r will lead to a smaller generalization error. This point is discussed further below.

Let x be the current feature vector and g(x) be the new feature that is introduced into the current SVM model. Let l_app be the approximated objective function that only uses the old features x, and l'_app({w, α}, b') be the approximated objective function that uses both the old features x and the new feature g(x). Then l'_app({w, α}, b') can be expressed as follows:

    l'_app({w, α}, b') = Σ_{i=1}^{N} (1/r) log{1 + exp(r [1 - y_i(x_i · w + α g(x_i) - b')])}        (6)

where α stands for the weight of the new feature and b' stands for the new threshold value. In the above expression for the new objective function, we assume that both the weights for the old features, i.e., w, and the threshold value b are almost unchanged by the inclusion of the new feature g(x). This is a reasonable assumption when the classification error of an SVM using the old features x is low and the induced feature does not result in a large change in the SVM model. To find an effective new feature g(x), we consider the difference between the two objective functions:

    l'_app({w, α}, b') - l_app
      = Σ_{i=1}^{N} (1/r) log [ (1 + exp(r [1 - y_i(x_i · w + α g(x_i) - b')])) / (1 + exp(r [1 - y_i(x_i · w - b)])) ]
      = Σ_{i=1}^{N} (1/r) log [ 1 + (exp(-r α y_i g(x_i)) - 1) / (1 + exp(r H(x_i))) ]        (7)

where H(x_i) is defined as H(x_i) = y_i(x_i · w - b) - 1. Using the inequality log(1 + x) <= x, we obtain an upper bound on the difference of the two objective functions:

    l'_app({w, α}, b') - l_app  <=  Σ_{i=1}^{N} (1/r) [exp(-r α y_i g(x_i)) - 1] / (1 + exp(r H(x_i)))        (8)

To remove the interaction between the induced feature g(x) and the weight α, we further assume that the outputs of g(x) fall into the interval [-1, 1]. Based on the inequality e^{λx} <= ((1 + x)/2) e^{λ} + ((1 - x)/2) e^{-λ} for x ∈ [-1, 1], the upper bound in Equation (8) can be further simplified as:

    l'_app({w, α}, b') - l_app
      <=  ((e^{rα} + e^{-rα}) / 2r) Σ_{i=1}^{N} 1 / (1 + exp(r H(x_i)))
        - (1/r) Σ_{i=1}^{N} 1 / (1 + exp(r H(x_i)))
        - ((e^{rα} - e^{-rα}) / 2r) Σ_{i=1}^{N} y_i g(x_i) / (1 + exp(r H(x_i)))        (8')

The first two terms in the above bound are not related to the induced feature and can therefore be ignored. The third term is a summation over all the training examples, with each example contributing y_i g(x_i) / (1 + exp(r H(x_i))). Therefore, a good feature function g(x) that minimizes the quantity in Equation (8') can be obtained by learning a classification model over the training examples weighted by the factor 1 / (1 + exp(r H(x_i))). This factor gives high weights to misclassified examples, for which H(x_i) is negative, and low weights to correctly classified examples, for which H(x_i) is positive. This is similar to the weight distribution exp(-H(x_i)) used in AdaBoost, but with two important distinctions: 1) The weights used by the proposed feature induction algorithm are bounded, i.e., 1 / (1 + exp(r H(x_i))) < 1, while the weights used by AdaBoost are unbounded. This bounded nature allows the proposed algorithm to avoid over-emphasizing the misclassified examples. 2) In the proposed weights for the training examples, the constant r controls the trade-off between the training error and the model complexity. With a large value of r, more weight is assigned to the misclassified examples and therefore the training errors can be reduced more efficiently. On the other hand, a small r results in less weight being assigned to the misclassified examples and less chance of over-fitting the training data. Therefore, the constant r has a similar impact on the generalization error as the regularization term in an SVM model.
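As a small illustration of this weighting scheme (mine, not the authors'), the per-example weights can be computed directly from the decision values of the current SVM. The sketch assumes scipy is available and labels y_i are in {+1, -1}:

# Illustration only: the example weights 1 / (1 + exp(r * H(x_i)))
# with H(x_i) = y_i (x_i . w - b) - 1. Misclassified examples get weights
# close to 1, confidently correct examples get weights close to 0.
import numpy as np
from scipy.special import expit   # expit(z) = 1 / (1 + exp(-z)), numerically stable

def example_weights(decision_values, y, r):
    H = y * decision_values - 1.0          # H(x_i) for each training example
    return expit(-r * H)                   # equals 1 / (1 + exp(r * H))

# e.g., with the SVC from the earlier sketch:
# W = example_weights(svm.decision_function(X), y, r=10.0)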

The remaining question is how to determine an appropriate value for the constant r, which has been shown to be an important factor in the above discussion. Since r is introduced to approximate the original objective function of the SVM model in Equation (3), the optimal r can be found by minimizing the difference between the two objective functions l_app(w, b) and l_SVM, i.e.,

    r* = argmin_{r > 0}  ( l_app(w, b) - l_SVM )^2        (9)

It can be seen that the difference (l_app - l_SVM) is a convex function with regard to r, which can be verified from its second-order derivative. Therefore, the problem in Equation (9) has a global minimum solution for r, and standard optimization algorithms such as conjugate gradient and the Newton method can be applied to solve it.

Figure 1 summarizes the proposed feature induction algorithm for SVMs, which alternately applies the procedure of computing the example weights and the procedure of training a new classification model. A minimal code sketch of this loop, under the assumptions stated there, is given after the experimental setup below.

    Feature Induction for SVM
    F: a set of features, F = {f_1, f_2, ..., f_m}
    W: weights for the training examples, W = {W_1, W_2, ..., W_N}
    D: training examples, D = {<x_i, y_i>}_{i=1}^{N}
    Initialization: W = {1, 1, ..., 1}; F = {all the original features}
    Repeat until the desired performance is reached:
        Train an SVM model using the feature set F
        Compute W using W_i = 1 / [1 + exp(r H(x_i))]
        Sample examples according to the distribution W_i / Σ_j W_j
        Train a classification model g(x): X → [-1, 1] using the sampled examples
        F ← {F, g(x)}

Figure 1: Description of the feature induction algorithm for SVMs.

4. Experiments

In the experiments, we examine the effectiveness of the proposed feature induction algorithm for SVMs. In particular, we address the following three questions: 1) How effective and robust is the proposed feature induction algorithm in improving the classification accuracy of SVMs? 50 different nonlinear features are generated for each experiment, and the change of testing errors of the linear SVM model with respect to the number of induced features is examined. 2) How effective is the proposed feature induction algorithm compared to SVMs using nonlinear kernels? SVMs with polynomial kernels and an RBF kernel are tested and the results are compared with those of the feature induction algorithm. 3) How effective is the proposed feature induction algorithm compared to AdaBoost? The AdaBoost algorithm using a linear SVM as its base classifier is run over the same set of datasets and its results are compared to the feature induction algorithm.

    Data Set         # Examples    # Features
    ionosphere
    breast_cancer
    spam
    cmc
    people
    outdoor

Table 1: Statistics of the experimental datasets.

Six different datasets are used in the experiments: four of them are from the UCI machine learning repository and the other two are from real-world applications. The statistics of all six datasets are listed in Table 1. People and Outdoor are the two datasets from image classification tasks: dataset People is used for the task of identifying scenes that contain more than two people, and dataset Outdoor is used for the task of identifying outdoor scenes. Both are data samples used for the TREC video retrieval evaluation (TRECVID, 2003). In the experiments, we choose the decision tree algorithm (C4.5, Quinlan, 1993) as the classification model for the proposed feature induction algorithm. This is because the decision tree algorithm can introduce nonlinear interactions between multiple features through conjunctions of Boolean literals. A Newton method is used to find the r that optimizes Equation (9). To further prevent over-fitting of the training data, we set the upper bound for r to be 50 in the experiments. The WEKA SVM implementation (Witten & Frank, 1999) is used for the experiments. The kernel functions used in the comparison are an RBF kernel and polynomial kernels with degrees ranging from 1 to 5. The maximum number of features induced by the proposed algorithm is 50. Unless otherwise specified, all experiments are conducted using 10-fold cross validation, with 90% of the data used for training and the remaining 10% for testing.
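The following is a minimal sketch of the loop in Figure 1 (my paraphrase, not the authors' code). It assumes scikit-learn's SVC with a linear kernel in place of the WEKA SVM, DecisionTreeClassifier in place of C4.5, and a bounded scalar search over (0, 50] for r in place of the Newton method; all of these component choices, and the max_depth setting, are assumptions of the illustration.

# Sketch of the feature induction loop of Figure 1 (illustration only).
import numpy as np
from scipy.optimize import minimize_scalar
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

def objectives(margins, w, c, r):
    """l_SVM of Eq. (4) and l_app of Eq. (5), given the margins y_i (x_i . w - b)."""
    l_svm = np.dot(w, w) / c + np.maximum(0.0, 1.0 - margins).sum()
    l_app = (np.logaddexp(0.0, r * (1.0 - margins)) / r).sum()
    return l_svm, l_app

def choose_r(margins, w, c, r_max=50.0):
    """Eq. (9): pick r minimizing (l_app - l_SVM)^2, capped at r_max = 50."""
    def gap(r):
        l_svm, l_app = objectives(margins, w, c, r)
        return (l_app - l_svm) ** 2
    return minimize_scalar(gap, bounds=(1e-3, r_max), method="bounded").x

def induce_features(X, y, n_new=50, c=1.0, seed=0):
    """X: original features; y: labels in {+1, -1}. Returns X with induced columns appended."""
    rng = np.random.default_rng(seed)
    y = np.asarray(y)
    F = np.asarray(X, dtype=float).copy()          # current feature set F
    for _ in range(n_new):
        svm = SVC(kernel="linear", C=c).fit(F, y)  # train the SVM on F
        w = svm.coef_.ravel()
        margins = y * svm.decision_function(F)     # y_i (x_i . w - b)
        r = choose_r(margins, w, c)
        H = margins - 1.0                          # H(x_i) = y_i (x_i . w - b) - 1
        W = 1.0 / (1.0 + np.exp(np.clip(r * H, -500.0, 500.0)))
        idx = rng.choice(len(y), size=len(y), p=W / W.sum())   # sample by weight
        g = DecisionTreeClassifier(max_depth=3, random_state=seed).fit(F[idx], y[idx])
        F = np.hstack([F, g.predict(F)[:, None].astype(float)])  # append g(x) as a new feature
    return F

In the paper, the new feature g(x) maps into [-1, 1]; here the tree's hard predictions in {-1, +1} are used for simplicity.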

Figure 2: Training and testing errors of a linear SVM model using the proposed feature induction algorithm for six different datasets (breast_cancer, ionosphere, cmc, people, spam, outdoor), plotted against the number of induced features. Each solid line represents the change of testing errors with respect to the number of induced features and each dash-dot line represents the corresponding change in training errors.

4.1 Experiment I: Effectiveness and Robustness of the Feature Induction Algorithm

Figure 2 displays both the training errors and the testing errors of a linear SVM model that uses the proposed feature induction algorithm on six different datasets. Each solid line represents the change of training errors with respect to the number of induced features, and each dash-dot line represents the change of the corresponding testing errors. First, for all datasets, the induced features are able to significantly reduce the training error of a linear SVM model. The most noticeable case is the dataset ionosphere, in which the training error is reduced from 5.6% to zero. This indicates that the approximation used in the previous analysis is valid and reasonably accurate. Second, despite minor fluctuations in the testing errors, the overall trend for all datasets is that the more features are induced, the smaller the classification error. Notice that although datasets such as people, cmc, and breast_cancer have no more than 10 features, by introducing 50 new nonlinear features we are still able to reduce the testing error of a linear SVM model without significantly over-fitting the training data. The most noticeable case is the dataset breast_cancer, in which a significant drop in the testing error is observed when the number of induced features is between 7 and 9. However, with more induced features, we see that the testing error for breast_cancer keeps decreasing. This trend is consistent with the analysis presented in the introduction: explicitly induced nonlinear features can be normalized to the same scale as the original features. As a result, the regularization mechanism in the SVM model is effective in preventing the over-fitting of the training data that could potentially be caused by the large number of induced features.

In contrast, applying nonlinear kernels to induce nonlinear features can lead to a significant over-fitting problem. Table 2 presents the testing errors of the SVM model using polynomial kernels of different degrees. Notice that, for many datasets, a high-degree polynomial kernel can significantly degrade the SVM performance. For example, compared to a linear SVM model (polynomial degree equal to 1), using a polynomial kernel of degree 5 results in a substantially worse classification error for almost all datasets except outdoor, as shown in Table 2. The most noticeable case is the dataset spam, whose testing error is only 6.67% for a linear SVM model. However, the testing error shoots up to 37.7% when a polynomial kernel of degree 5 is used. As discussed in the introduction, this phenomenon can be attributed to the fact that nonlinear features are introduced implicitly through kernel functions and are therefore difficult to normalize to the same scale as the original features. As a result, the regularization mechanism in the SVM may not work appropriately to prevent over-fitting.

    Poly. Degree   breast_cancer   ionosphere   cmc      spam     people   outdoor
    1              3.97%           9.7%         33.40%   6.67%    4.98%    7.3%
    2              .06%            0.9%         30.48%   8.07%    45.35%   .%
    3              3.79%           6.57%        30.75%   8.6%     45.54%   .%
    4              4.65%           8.57%        30.54%   36.43%   45.5%    .5%
    5              3.8%            8.9%         33.3%    37.7%    45.5%    .7%

Table 2: Testing errors for the SVM model using polynomial kernels with degrees ranging from 1 to 5.

The trend seen in Figure 2 shows that an increased number of induced features can often help improve testing accuracy. However, the increased number of induced features comes with an increased computing cost. The diminishing gain in accuracy as the number of induced features increases suggests a heuristic stopping criterion: stop when the gain between two subsequent runs is insignificant.

4.2 Experiment II: Comparison with Nonlinear Kernels

We apply both polynomial kernels and the RBF kernel to introduce nonlinear features for SVMs. Both the degree of the polynomial kernels and the variance of the RBF kernel are determined through an internal cross-validation procedure using 10% of the training data randomly selected from the original training pool. The testing errors for the polynomial kernels and the RBF kernel are presented in Table 3, together with the results for the proposed feature induction algorithm. For the purpose of comparison, we also include the testing errors for an SVM model using the linear kernel, which is referred to as the baseline model in Table 3. A boldfaced number indicates that its testing error is substantially better than the baseline model, and an italic font is applied when it is substantially worse than the baseline model.

Figure 3: Training errors of AdaBoost and a linear SVM model (referred to as the baseline) on the six datasets.

First, between the two kernel functions, the RBF kernel appears to perform significantly better than the polynomial kernels for almost all datasets. It manages to improve the performance substantially for three out of six datasets and achieves the same performance as the baseline model for the rest. In contrast, the polynomial kernels improve classification accuracy for only two out of six datasets while degrading the performance for three others. Second, the proposed feature induction algorithm consistently performs better than the baseline model for all datasets. In particular, it achieves substantial reductions in testing errors for five out of six datasets. This is noticeably better than both the RBF and the polynomial kernels. Based on the above observations, we conclude that the proposed algorithm is more robust and effective than the nonlinear kernels in terms of inducing nonlinear features for the SVM model.
    Method              breast_cancer   ionosphere   cmc      spam     people   outdoor
    baseline            3.97%           9.7%         33.40%   6.67%    4.98%    7.3%
    polynomial          .69%            3.43%        34.90%   9.50%    45.40%   .03%
    RBF                 .79%            8.97%        30.30%   6.8%     4.4%     0.50%
    Feature Induction   .75%            5.4%         9.80%    6.35%    39.3%    4.33%
    AdaBoost            3.0%            .8%          33.00%   9.37%    4.40%    6.5%

Table 3: Testing errors for the different methods. The baseline model refers to a linear SVM model. A bold font is applied when a testing error is substantially better than the baseline model, and an italic font is applied when it is substantially worse than the baseline model.

4.3 Experiment III: Comparison with AdaBoost

We compare the proposed feature induction algorithm for SVMs to the AdaBoost algorithm that uses an SVM model as its base classifier. For both algorithms, a linear kernel is used for the SVM model, and AdaBoost is run for a fixed maximum number of iterations. The testing errors for AdaBoost are presented in Table 3. Compared to the proposed feature induction algorithm, AdaBoost appears to be less effective in improving the performance of an SVM model. For most datasets, it only achieves the same performance as the original SVM model. A further examination of the training errors of AdaBoost in Figure 3 indicates that the AdaBoost algorithm cannot reduce the training errors significantly for any dataset except people. This can be explained by the fact that both a linear SVM model and the AdaBoost algorithm use a linear combination model. As a result, the introduction of AdaBoost does not change the linear nature of a linear SVM model and therefore has little impact on the classification accuracy.

5. Conclusions

Induced features can reduce classification errors. In this paper, we propose a new feature induction algorithm for SVMs. Unlike the kernel method, where the number of induced features is usually large (even infinite), the proposed algorithm only creates nonlinear features that can effectively reduce the training errors. Specifically, new features are induced iteratively: at each step, the algorithm weights the training examples according to the outputs of the current SVM model, and the weighted training examples are then used to train a classification model that becomes a new feature for the SVM model. Empirical studies over both UCI datasets and datasets from real applications indicate that the proposed feature induction algorithm is not only effective in improving the performance of a linear SVM model but also robust, even when the number of induced features is substantially larger than the number of original features.

References

Burges, C. J. C. (1998). A Tutorial on Support Vector Machines for Pattern Recognition. Knowledge Discovery and Data Mining, 2(2).

Della Pietra, S., Della Pietra, V. J., & Lafferty, J. D. (1997). Inducing Features of Random Fields. IEEE Transactions on Pattern Analysis and Machine Intelligence, 19.

Dietterich, T. G. (2000). An Experimental Comparison of Three Methods for Constructing Ensembles of Decision Trees: Bagging, Boosting, and Randomization. Machine Learning, 40.

Grove, A. J., & Schuurmans, D. (1998). Boosting in the Limit: Maximizing the Margin of Learned Ensembles. In Proceedings of the Fifteenth National Conference on Artificial Intelligence.

Jin, R., Liu, Y., Si, L., Carbonell, J., & Hauptmann, A. G. (2003). A New Boosting Algorithm Using Input-Dependent Regularizer. In Proceedings of the 20th International Conference on Machine Learning.

Joachims, T. (2001). A Statistical Learning Model of Text Classification for SVMs. In Proceedings of the 24th ACM International Conference on Research and Development in Information Retrieval.

McCallum, A. (2003). Efficiently Inducing Features of Conditional Random Fields. In Proceedings of the 19th Conference on Uncertainty in Artificial Intelligence (UAI 2003).

Opitz, D., & Maclin, R. (1999). Popular Ensemble Methods: An Empirical Study. Journal of Artificial Intelligence Research.

Phillips, P. J. (1998). Support Vector Machines Applied to Face Recognition. In Advances in Neural Information Processing Systems, page 803.

Quinlan, R. (1993). C4.5: Programs for Machine Learning. Morgan Kaufmann Publishers, San Mateo, CA.

Ratsch, G., Onoda, T., & Muller, K. (2000). Soft Margins for AdaBoost. Machine Learning, 42.

Rennie, J. D. M., & Jaakkola, T. (2003). Automatic Feature Induction for Text Classification. MIT Artificial Intelligence Laboratory Abstract Book.

Rosenfeld, R., Wasserman, L., Cai, C., & Zhu, X. J. (1999). Interactive Feature Induction and Logistic Regression for Whole Sentence Exponential Language Models. In Proceedings of the IEEE Workshop on Automatic Speech Recognition and Understanding, Keystone, Colorado.

Schapire, R. E., & Singer, Y. (1999). Improved Boosting Algorithms Using Confidence-rated Predictions. Machine Learning, 37(3).

Teytaud, O., & Sarrut, O. (2001). Kernel-based Image Classification. Lecture Notes in Computer Science.

TRECVID (2003). TREC Video Retrieval Evaluation.

Witten, I. H., & Frank, E. (1999). Data Mining: Practical Machine Learning Tools and Techniques with Java Implementations. Morgan Kaufmann.

Zhang, J., Jin, R., Yang, Y., & Hauptmann, A. G. (2003). Modified Logistic Regression: An Approximation to SVM and its Applications in Large-Scale Text Categorization. In Proceedings of the International Conference on Machine Learning (ICML 2003).


More information

INTRODUCTION. consider the statements : I there exists x X. f x, such that. II there exists y Y. such that g y

INTRODUCTION. consider the statements : I there exists x X. f x, such that. II there exists y Y. such that g y INRODUCION hs dssetaton s the eadng of efeences [1], [] and [3]. Faas lemma s one of the theoems of the altenatve. hese theoems chaacteze the optmalt condtons of seveal mnmzaton poblems. It s nown that

More information

Monte Carlo comparison of back-propagation, conjugate-gradient, and finite-difference training algorithms for multilayer perceptrons

Monte Carlo comparison of back-propagation, conjugate-gradient, and finite-difference training algorithms for multilayer perceptrons Rocheste Insttute of Technology RIT Schola Woks Theses Thess/Dssetaton Collectons 20 Monte Calo compason of back-popagaton, conugate-gadent, and fnte-dffeence tanng algothms fo multlaye peceptons Stephen

More information

Observer Design for Takagi-Sugeno Descriptor System with Lipschitz Constraints

Observer Design for Takagi-Sugeno Descriptor System with Lipschitz Constraints Intenatonal Jounal of Instumentaton and Contol Systems (IJICS) Vol., No., Apl Obseve Desgn fo akag-sugeno Descpto System wth Lpschtz Constants Klan Ilhem,Jab Dalel, Bel Hadj Al Saloua and Abdelkm Mohamed

More information

Optimal System for Warm Standby Components in the Presence of Standby Switching Failures, Two Types of Failures and General Repair Time

Optimal System for Warm Standby Components in the Presence of Standby Switching Failures, Two Types of Failures and General Repair Time Intenatonal Jounal of ompute Applcatons (5 ) Volume 44 No, Apl Optmal System fo Wam Standby omponents n the esence of Standby Swtchng Falues, Two Types of Falues and Geneal Repa Tme Mohamed Salah EL-Shebeny

More information

SURVEY OF APPROXIMATION ALGORITHMS FOR SET COVER PROBLEM. Himanshu Shekhar Dutta. Thesis Prepared for the Degree of MASTER OF SCIENCE

SURVEY OF APPROXIMATION ALGORITHMS FOR SET COVER PROBLEM. Himanshu Shekhar Dutta. Thesis Prepared for the Degree of MASTER OF SCIENCE SURVEY OF APPROXIMATION ALGORITHMS FOR SET COVER PROBLEM Hmanshu Shekha Dutta Thess Pepaed fo the Degee of MASTER OF SCIENCE UNIVERSITY OF NORTH TEXAS Decembe 2009 APPROVED: Fahad Shahokh, Mao Pofesso

More information

Support Vector Machines CS434

Support Vector Machines CS434 Support Vector Machnes CS434 Lnear Separators Many lnear separators exst that perfectly classfy all tranng examples Whch of the lnear separators s the best? Intuton of Margn Consder ponts A, B, and C We

More information

CSJM University Class: B.Sc.-II Sub:Physics Paper-II Title: Electromagnetics Unit-1: Electrostatics Lecture: 1 to 4

CSJM University Class: B.Sc.-II Sub:Physics Paper-II Title: Electromagnetics Unit-1: Electrostatics Lecture: 1 to 4 CSJM Unvesty Class: B.Sc.-II Sub:Physcs Pape-II Ttle: Electomagnetcs Unt-: Electostatcs Lectue: to 4 Electostatcs: It deals the study of behavo of statc o statonay Chages. Electc Chage: It s popety by

More information

Generalized Linear Methods

Generalized Linear Methods Generalzed Lnear Methods 1 Introducton In the Ensemble Methods the general dea s that usng a combnaton of several weak learner one could make a better learner. More formally, assume that we have a set

More information

Physics 11b Lecture #2. Electric Field Electric Flux Gauss s Law

Physics 11b Lecture #2. Electric Field Electric Flux Gauss s Law Physcs 11b Lectue # Electc Feld Electc Flux Gauss s Law What We Dd Last Tme Electc chage = How object esponds to electc foce Comes n postve and negatve flavos Conseved Electc foce Coulomb s Law F Same

More information

A STUDY OF SOME METHODS FOR FINDING SMALL ZEROS OF POLYNOMIAL CONGRUENCES APPLIED TO RSA

A STUDY OF SOME METHODS FOR FINDING SMALL ZEROS OF POLYNOMIAL CONGRUENCES APPLIED TO RSA Jounal of Mathematcal Scences: Advances and Applcatons Volume 38, 06, Pages -48 Avalable at http://scentfcadvances.co.n DOI: http://dx.do.og/0.864/jmsaa_700630 A STUDY OF SOME METHODS FOR FINDING SMALL

More information

Advanced Robust PDC Fuzzy Control of Nonlinear Systems

Advanced Robust PDC Fuzzy Control of Nonlinear Systems Advanced obust PDC Fuzzy Contol of Nonlnea Systems M Polanský Abstact hs pape ntoduces a new method called APDC (Advanced obust Paallel Dstbuted Compensaton) fo automatc contol of nonlnea systems hs method

More information

Engineering Mechanics. Force resultants, Torques, Scalar Products, Equivalent Force systems

Engineering Mechanics. Force resultants, Torques, Scalar Products, Equivalent Force systems Engneeng echancs oce esultants, Toques, Scala oducts, Equvalent oce sstems Tata cgaw-hll Companes, 008 Resultant of Two oces foce: acton of one bod on anothe; chaacteed b ts pont of applcaton, magntude,

More information