Improved Classification Based on Predictive Association Rules

Proceedings of the 2009 IEEE International Conference on Systems, Man, and Cybernetics, San Antonio, TX, USA, October 2009

Zhixin Hao, Xuan Wang, Lin Yao, Yaoyun Zhang
Intelligence Computing Research Center, Shenzhen Graduate School, Harbin Institute of Technology, Shenzhen, China
hzx984@gmail.com, wangxuan@insun.hit.edu.cn, yaolin@hit.edu.cn, xaon5@gmail.com

Abstract
Classification based on predictive association rules (CPAR) is an associative classification method that combines the advantages of both associative classification and traditional rule-based classification. For rule generation, CPAR is more efficient than traditional rule-based classification because much repeated calculation is avoided and multiple literals can be selected to generate multiple rules simultaneously. Despite these advantages in rule generation, the prediction process suffers from an imbalanced distribution of class rules and from interference by rules of incorrect classes; furthermore, it is useless for instances satisfying no rules. To tackle these problems, this paper presents Class Weighting Adjustment, Center Vector-based Pre-classification and Post-processing with a Support Vector Machine. Experiments on the Chinese text classification corpus TanCorp show that our algorithm achieves an average improvement of 5.9% in F1 score compared with CPAR.

Keywords: CPAR, Class Weighting Adjustment, Center Vector-based Pre-classification, Support Vector Machine

I. INTRODUCTION

Classification methods based on association rules have attracted increasing attention from researchers. According to several reports, they achieve higher classification accuracy than traditional approaches such as C4.5, FOIL and RIPPER [1]. They predict the labels of unseen examples by mining association rules from a training corpus. In general, association rule classification consists of three steps [2]: (1) Rule Generation: extract candidate rules that satisfy a predefined minimum support from the training corpus with a data mining algorithm. (2) Rule Selection: evaluate all candidate rules and retain only those satisfying a predefined confidence. (3) Classification: choose the best rule from the classifier for prediction.

Most traditional associative classification algorithms are only concerned with selecting literals from frequent items. The attributes of a rule depend only on the predefined minimum support, and the classification abilities of the rules are ignored, so the minimum support threshold plays a very important role. These approaches are time-consuming, and it is difficult to select high-quality rules with them.

CPAR combines the advantages of both associative classification and traditional rule-based classification. It employs the Predictive Rule Mining algorithm to generate rules directly from the training corpus. In this way, CPAR generates a much smaller set of high-quality predictive rules, and each rule is generated with the previously generated rules taken into account, which avoids redundant rule generation. When selecting literals, instead of selecting only the best literal for a rule, CPAR selects all literals close to the best one and thus generates more useful rules [3]. This makes the rule generation of CPAR an efficient associative classification procedure. However, there are three main flaws in the rule evaluation and classification processes; for each of them a solution is proposed as follows:

(1) The number of rules per class is imbalanced; it may range from several dozen to several thousand, so an example is more easily predicted into a class with many rules than into one with few. Class Weighting Adjustment is therefore proposed, which balances the intensity of the classification rules by iteratively adjusting a weight factor, so that the effects of strong classes and weak ones become more balanced.

(2) In the classification stage, every class is treated uniformly for each example, which increases the probability of misclassification. Before classification, we therefore compute the similarity between the example and the center vector of each class, and only load the rules of the classes whose center-vector similarity to the example is greater than the average similarity. The similarity computation is based on the vector space model.

(3) CPAR is useless for instances satisfying no rules. Experimental data on TanCorp show that about 4% of the testing instances satisfy no rules. Post-processing with a Support Vector Machine (SVM) is proposed to predict these examples [4].

The rest of the paper is organized as follows. Section II describes the general idea of classification based on predictive association rules. Section III discusses our Improved CPAR (ICPAR) algorithm in detail. Section IV presents the experimental results of ICPAR and compares it with CPAR, KNN and SVM. Concluding remarks are given in Section V.

II. CLASSIFICATION BASED ON PREDICTIVE ASSOCIATION RULES

CPAR is an associative classification algorithm that combines the advantages of both associative classification and traditional rule-based classification. A greedy algorithm, Predictive Rule Mining, is used to generate rules directly from the training data. To avoid overfitting, CPAR uses expected accuracy to evaluate rules, and the best k rules of each class are used for prediction. The three main steps of CPAR are as follows [3]:

A. Rule Generation

Some definitions are given first.

Literal: a literal p is an attribute-value pair of the form (A_i, v), where A_i is an attribute and v is a value. A tuple t satisfies literal p if and only if t_i = v, where t_i is the value of the i-th attribute of t. In the rule generation algorithm of CPAR every document is represented by a set of terms and the terms carry no weights, so we make no distinction between literal and term in this paper.

Rule: a rule r has the form p1 ∧ p2 ∧ ... ∧ pl → c, where each p_i is a literal and c is a class label. When a tuple satisfies all the literals of rule r, we say that the tuple satisfies r and it can be predicted as class c.

Given a rule r: P_r → C_r, positive examples are the examples that satisfy P_r and belong to class C_r, and negative examples are the examples that satisfy P_r but do not belong to C_r. The gain of a literal p is defined in (1), where P and N are the sets of positive and negative examples and |P| and |N| are the numbers of examples in those sets. After literal p is added to the current rule r, there are |P*| positive and |N*| negative examples satisfying the new rule's body.

gain(p) = |P*| · ( log(|P*| / (|P*| + |N*|)) − log(|P| / (|P| + |N|)) )    (1)

In each iteration of CPAR's rule generation, the literal with the best gain is appended to the current rule r; a rule is finished once the gain of the best literal falls below a predefined threshold. The most time-consuming part of the traditional FOIL algorithm is computing the gain of every literal when selecting the best one [5]; CPAR uses a PNArray to make these computations more efficient. If an example is correctly covered by a rule, CPAR decreases its weight instead of removing it as FOIL does, so CPAR obtains more rules. Moreover, CPAR selects not only the best literal but also the literals whose gain is close to the best one, so it can generate multiple rules simultaneously. The concrete rule generation algorithm can be found in [3].

B. Rule Evaluation

Laplace expected error estimation is used in CPAR to estimate the expected accuracy of a rule whose consequent is class c:

LaplaceAccuracy = (n_c + 1) / (n_tot + k)    (2)

where k is the total number of classes, n_tot is the number of examples that satisfy the rule's body, and n_c of those examples belong to class c.

C. Classification

For each example, CPAR selects the best k rules of each class that the example satisfies, computes the average expected accuracy of these k rules for each class, and predicts the example as the class with the highest average accuracy.
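As a concrete illustration of the two scores above, the following minimal Python sketch (the function and variable names are ours, not code from the paper) computes the literal gain of (1) and the Laplace expected accuracy of (2):

import math

def literal_gain(p_cur, n_cur, p_new, n_new):
    # FOIL-style gain of adding a literal (eq. 1).
    # p_cur, n_cur: positive/negative examples satisfying the current rule body
    # p_new, n_new: positive/negative examples satisfying the body after the
    #               candidate literal is added
    if p_new == 0:
        return 0.0
    return p_new * (math.log(p_new / (p_new + n_new)) -
                    math.log(p_cur / (p_cur + n_cur)))

def laplace_accuracy(n_c, n_tot, k):
    # Laplace expected accuracy of a rule predicting class c (eq. 2):
    # n_tot examples satisfy the rule body, n_c of them belong to class c,
    # and k is the total number of classes.
    return (n_c + 1) / (n_tot + k)

# Toy example: 40 of 50 covered examples are positive before the literal is
# added, 30 of 32 afterwards; 12 classes as in TanCorp.
print(literal_gain(40, 10, 30, 2))
print(laplace_accuracy(30, 32, 12))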
III. IMPROVED CPAR

To overcome the weaknesses of CPAR described in Section I, Class Weighting Adjustment, Center Vector-based Pre-classification and Post-processing with SVM are proposed. The rest of this section discusses each approach in detail.

A. Class Weighting Adjustment

Rule evaluation in CPAR considers only the expected accuracy of each rule and takes no account of the weight of each class. Table I lists the number of rules of each class generated by the rule generation algorithm of CPAR on the TanCorp training corpus; the rule counts are clearly unbalanced, so an example is easily classified into a class with many rules.

TABLE I. Rule number of each class (classes: Recruitment, Sports, Health, Region, Entertainment, Estate, Education, Automobile, Computer, Technique, Art, Economic).

To avoid this, the Class Weighting Adjustment algorithm presented in [6] is used. Some definitions are given as follows.

Class Rule Intensity: the intensity of class c_i is defined in (3), where ε1 is the probability of examples being wrongly classified into c_i and ε2 is the probability of examples being correctly classified into c_i. The stronger the rules of a class are, the larger the ρ value of the class.

ρ(c_i) = ε1(c_i) + ε2(c_i),  0 ≤ ρ ≤ 2    (3)

Class Weight Vector: W^t = [w_1^t, w_2^t, ..., w_m^t], where Σ_{i=1..m} (w_i^t · n_i) = n. W^t is the weight vector before the t-th weighting adjustment, w_i^t is the weight of class c_i, n_i is the number of rules of class c_i, and n is the total number of rules. Initially the weight vector is set to [1, 1, ..., 1].

In a single iteration, the weighting factor of every class is defined in (4) and the new class weight w_i^(t+1) of class c_i is computed as (5):

α(c_i) = 2 − ρ(c_i)    (4)

w_i^(t+1) = w_i^t · α(c_i)    (5)

Because the new class weights must satisfy the condition Σ_{i=1..m} (w_i^(t+1) · n_i) = n, the normalized expression of w_i^(t+1) is computed as (6), with the normalizer Z given by (7):

w_i^(t+1) = w_i^t · α(c_i) · n / Z    (6)

Z = Σ_{i=1..m} w_i^t · α(c_i) · n_i    (7)

The algorithm repeats these iterations until the rule intensities of all classes are equal or the predetermined number of iterations is reached. The Class Weighting Adjustment algorithm is presented in Fig. 1.

Input: the rules of each class and the training set T
Output: the class weight of each class
Procedure Class Weighting Adjustment Algorithm
1   set the weight of each class to 1 in classweight
2   n ← total rule number of all classes
3   iteration count t ← 0
4   while ++t < predefined iteration count
5     make predictions with the classification algorithm of
6       CPAR for T and record ε1, ε2 of each class
7     Z ← 0
8     for each class c in all classes
9       ρ(c) ← ε1(c) + ε2(c)
10      α(c) ← 2 − ρ(c)
11      classweight(c) *= α(c)
12      Z += classweight(c) * class_rule_number(c)
13    end
14    for each class c in all classes
15      classweight(c) = classweight(c) * n / Z
16    end
17  end
18  return classweight
Figure 1. Class Weighting Adjustment algorithm
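A runnable sketch of one weighting iteration is given below, assuming the per-class rates ε1, ε2 from a CPAR prediction pass over the training set are already available; the names and toy data are illustrative, not taken from the paper.

def adjust_class_weights(weights, rule_counts, eps1, eps2):
    # One iteration of Class Weighting Adjustment (eqs. 3-7).
    # weights:     dict class -> current weight w_i
    # rule_counts: dict class -> number of rules n_i of the class
    # eps1, eps2:  dict class -> probability of examples wrongly / correctly
    #              classified into the class during the last prediction pass
    n = sum(rule_counts.values())                        # total number of rules
    new_w = {}
    for c in weights:
        rho = eps1[c] + eps2[c]                          # class rule intensity (3)
        alpha = 2.0 - rho                                # weighting factor (4)
        new_w[c] = weights[c] * alpha                    # un-normalized update (5)
    z = sum(new_w[c] * rule_counts[c] for c in new_w)    # normalizer (7)
    return {c: new_w[c] * n / z for c in new_w}          # normalized weights (6)

# Toy example with three classes; iterate until the intensities even out.
w = {"Sports": 1.0, "Health": 1.0, "Computer": 1.0}
n_rules = {"Sports": 300, "Health": 120, "Computer": 900}
e1 = {"Sports": 0.05, "Health": 0.02, "Computer": 0.20}
e2 = {"Sports": 0.90, "Health": 0.70, "Computer": 0.95}
w = adjust_class_weights(w, n_rules, e1, e2)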

B. Center Vector-Based Pre-classification

For prediction, the classification procedure of CPAR directly loads the rules of every class, so the result is easily affected by rules of incorrect classes. For an example belonging to a certain class we can test the following inference: the similarity between the example and its class is larger than the average similarity of the example over all classes. 97.43% of the documents in TanCorp satisfy this inference. Therefore, before loading class rules, we compute the similarity between the example and each class as well as the average similarity over all classes, and only the rules of the classes whose similarity with the example is larger than the average are used. This reduces the probability of interference from incorrect classes.

For the similarity computation, the Vector Space Model [7] is used for document representation: a document d_i is made up of a set of feature terms (t_1, t_2, ..., t_m) and the corresponding weights (w_i1, w_i2, ..., w_im). Several weighting schemes exist; TF-IDF is used here. In (8), TF_ik is the frequency of term t_k in document d_i, N is the number of documents in the training corpus, and N_k is the number of documents that contain feature term t_k.

w_ik = TFIDF(t_k) = TF_ik · log(N / N_k)    (8)

Feature selection is a technique for reducing the feature dimensionality; it simplifies computation and helps avoid overfitting [8]. Chi-square statistics are used here to measure the degree of correlation between a term t and a class c; the correlation is assumed to follow a χ² distribution, and the larger the χ² value, the higher the correlation between t and c. The formula is given in (9), where N is the total number of documents in the training corpus, A is the number of documents that belong to class c and contain t, B the number that do not belong to c but contain t, C the number that belong to c but do not contain t, and D the number that neither belong to c nor contain t.

χ²(t, c) = N · (A·D − C·B)² / ((A + C)(B + D)(A + B)(C + D))    (9)

The chi-square value of a term t is then computed as in (10), where M is the total number of classes. After feature selection, terms whose chi-square values are smaller than a predefined threshold are removed, and the remaining terms are retained as document features.

χ²_max(t) = max_{i=1..M} χ²(t, c_i)    (10)

With this representation, for every class we sum the VSM vectors of all documents belonging to it and compute the average center vector of the class. The similarity between a document d_i and the center vector c_j of class j is computed as in (11).

sim(d_i, c_j) = ( Σ_k w_ik · w_jk ) / ( sqrt(Σ_k w_ik²) · sqrt(Σ_k w_jk²) )    (11)
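The pre-classification step can be sketched as follows, assuming documents have already been reduced to chi-square-selected terms and weighted with TF-IDF as in (8)-(11); the function names and data structures below are ours, not the authors' code:

import math
from collections import defaultdict

def cosine(u, v):
    # Cosine similarity between two sparse {term: weight} dicts (eq. 11).
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def class_center_vectors(train_docs):
    # Average TF-IDF vector of each class; train_docs is a list of
    # (label, {term: tfidf_weight}) pairs.
    sums = defaultdict(lambda: defaultdict(float))
    counts = defaultdict(int)
    for label, vec in train_docs:
        counts[label] += 1
        for t, w in vec.items():
            sums[label][t] += w
    return {c: {t: w / counts[c] for t, w in vec.items()}
            for c, vec in sums.items()}

def candidate_classes(doc_vec, centers):
    # Keep only the classes whose center vector is more similar to the document
    # than the average similarity over all classes; only their rules are loaded
    # in the subsequent rule-based prediction step.
    sims = {c: cosine(doc_vec, v) for c, v in centers.items()}
    avg = sum(sims.values()) / len(sims)
    return [c for c, s in sims.items() if s > avg]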
C. Post-Processing with SVM

After classification with the class rules, there may still be test examples that satisfy no rule. Table II shows the proportion of such documents in the test corpus of TanCorp. CPAR is not applicable to these documents, so Post-processing with SVM is proposed to classify them.

TABLE II. Probability of examples belonging to each class with no rule satisfying (per-class percentages over the 12 TanCorp classes).

SVM is a pattern recognition approach based on statistical learning theory, first proposed by Vapnik et al. in 1995 [9]. It is a universal learning approach developed from the theory of the VC dimension and the idea of structural risk minimization. To obtain the greatest generalization ability, SVM finds an optimal trade-off between the complexity of the model and its learning ability, given limited sample information. The basic idea of SVM is as follows. Suppose some given data points each belong to one of two classes, and we have to decide which class a new data point belongs to. In SVM a data point is viewed as a p-dimensional vector, and the question is whether there is a (p−1)-dimensional hyperplane that separates the two classes such that the distance between the nearest points of the two classes and the separating hyperplane is maximized; this hyperplane is known as the maximum-margin hyperplane.

Suppose there are N data points (x_i, y_i), i = 1, 2, ..., N, with x_i ∈ R^n and y_i ∈ {−1, 1}. Any hyperplane can be written as w·x − b = 0. In SVM model training, the parameters w and b are chosen to maximize the margin subject to the constraints w·x_i + b ≥ 1 for y_i = 1 and w·x_i + b ≤ −1 for y_i = −1, which can be combined as (12):

y_i · (w·x_i + b) ≥ 1,  i = 1, 2, ..., N    (12)

By geometry, the distance between the two bounding hyperplanes is 2/||w||, so maximizing the margin is equivalent to minimizing ||w||²/2. The separating hyperplane that satisfies (12) and minimizes ||w||²/2 is called the optimal separating hyperplane. Using Lagrange multipliers, one obtains the equivalent dual problem of maximizing (13). An appropriate kernel function K(x_i, x_j) can be selected so that linear classification is performed after a non-linear transformation. The corresponding classification function is given in (14), where b* is the classification threshold: if f(x) > 0, x is predicted to belong to the class, otherwise it is not.

L_D = Σ_i α_i − (1/2) · Σ_{i,j} α_i · α_j · y_i · y_j · K(x_i, x_j)    (13)

f(x) = sgn( Σ_i α_i* · y_i · K(x_i, x) + b* )    (14)

SVM is known to work well for text classification. There are four main reasons for using SVM for the post-processing of CPAR [10]: (1) SVM does not depend on the number of features because of its protection against overfitting, so large feature spaces can be handled. (2) There are only a few irrelevant features in text classification; feature selection may cause a loss of information, and SVM can combine the many features that are useful for classification. (3) The document vectors in text classification contain only a few non-zero entries, and SVM is well suited to problems with sparse document vectors. (4) Most text classification problems are linearly separable, and SVM is suitable for finding such linear separators.

The complete ICPAR algorithm is defined in Fig. 2.

Input: the rule set generated by the rule generation algorithm of CPAR,
       the center vector of each class,
       the SVM model trained on the training corpus,
       the testing corpus
Output: the class label predicted for each example in the testing corpus
Procedure ICPAR Algorithm
1   compute the weight of each class using Class Weighting Adjustment
2   for each example t in the testing corpus
3     v ← VSM representation of t
4     avg_sim ← average similarity of v with all the classes' center vectors
5     max_expect ← 0; max_expect_label ← −1
6     for each class c in all classes
7       if the similarity of c's center vector with v is larger than avg_sim
8         rulec ← the rules of c
9         satisfy_rulec ← the rules in rulec satisfied by t
10        sumc ← sum of the expected accuracies of the best k rules in satisfy_rulec
11        expectc ← sumc * class weight of c
12        if expectc > max_expect
13          max_expect ← expectc
14          max_expect_label ← c
15    end
16    if max_expect > 0
17      predict t as class max_expect_label
18    else
19      predict t with the post-processing SVM model
20  end
Figure 2. Algorithm of ICPAR
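The paper trains the post-processing model invoked in the last step of Fig. 2 with LibSVM [12]. A rough scikit-learn based equivalent (our substitution for illustration, not the authors' code) that trains a linear SVM on TF-IDF features and applies it only to the documents that fire no rule could look like this:

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import SVC

def train_fallback_svm(train_texts, train_labels):
    # Train the post-processing SVM on the whole training corpus.
    vectorizer = TfidfVectorizer()
    X = vectorizer.fit_transform(train_texts)
    svm = SVC(kernel="linear")   # text data is usually close to linearly separable
    svm.fit(X, train_labels)
    return vectorizer, svm

def predict_no_rule_docs(vectorizer, svm, no_rule_texts):
    # Classify only the test documents that satisfied no CPAR rule.
    return svm.predict(vectorizer.transform(no_rule_texts))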
IV. EXPERIMENTAL RESULTS

Our experiments are conducted on a PC running Windows XP with a 3.0 GHz CPU. The Chinese text classification corpus TanCorp V1.0, supplied by the Institute of Computing Technology, Chinese Academy of Sciences, is used [11]. TanCorp contains 12 classes, and the number of documents in each class is listed in Table III. To avoid overfitting, 5-fold cross-validation is employed.

TABLE III. Document number of each class (classes: Recruitment, Sports, Health, Region, Entertainment, Estate, Education, Automobile, Computer, Technique, Art, Economic).

In the rule generation of CPAR, we set the threshold of the positive weight factor δ to 0.04, the threshold of literal gain to 0.6, the weight decay factor α to 2/3, and min_gain_ratio (a literal is selected when the ratio of its gain to the best literal's gain is larger than this threshold) to 0.8. CPAR selects the best k rules of each class for classification; Fig. 3 shows the accuracy for k ranging from 1 to 10. The highest accuracy is obtained when k equals 5, so the best 5 rules are used in CPAR. In the Class Weighting Adjustment algorithm, the number of iterations is set to 5. Our SVM implementation is based on LibSVM [12].

Figure 3. Accuracy for different values of k

Precision and Recall are widely used to evaluate statistical classification. For a certain class, let A denote the number of examples correctly classified into the class, B the number of examples misclassified into the class, and C the number of examples that belong to the class but are classified into another class. Then

Precision = A / (A + B) if (A + B) > 0, otherwise Precision = 1
Recall = A / (A + C) if (A + C) > 0, otherwise Recall = 1

A popular measure that combines Precision and Recall is the F-score given in (15), where β is a parameter that adjusts the relative weight of precision and recall; when β = 1 it is known as the F1 measure (16). The overall performance on the whole data set is evaluated by the macro average, i.e., the arithmetic mean of the per-class indices.

F_β = (1 + β²) · precision · recall / (β² · precision + recall)    (15)

F1 = 2 · precision · recall / (precision + recall)    (16)
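As a small illustration of this evaluation protocol, the macro-averaged precision, recall and F1 can be computed from the per-class counts A, B, C as follows (a sketch with our own names, not code from the paper):

def macro_scores(per_class_counts):
    # per_class_counts: dict class -> (A, B, C), with A = correctly classified
    # into the class, B = misclassified into the class, C = belonging to the
    # class but classified elsewhere.
    precisions, recalls, f1s = [], [], []
    for a, b, c in per_class_counts.values():
        p = a / (a + b) if (a + b) > 0 else 1.0
        r = a / (a + c) if (a + c) > 0 else 1.0
        f1 = 2 * p * r / (p + r) if (p + r) > 0 else 0.0
        precisions.append(p)
        recalls.append(r)
        f1s.append(f1)
    n = len(per_class_counts)
    return sum(precisions) / n, sum(recalls) / n, sum(f1s) / n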
Table IV shows that the macro-averaged F1 measure increases as Class Weighting Adjustment (CWA), Center Vector-based Pre-classification (CVP) and Post-processing with SVM (PSVM) are added to CPAR in sequence.

TABLE IV. Macro-average performance of the different improvements
                   CPAR     CWA added   CVP added   PSVM added
Macro_precision    86.53%   86.93%      86.79%      9.85%
Macro_recall       85.30%   87.5%       89.3%       9.6%
Macro_F1           85.%     86.58%      87.6%       9.3%

From Fig. 4 it can be seen that each of CWA, CVP and PSVM increases the F1 value of most classes. Data analysis shows that, with each improvement and for most classes, the classification accuracy of examples belonging to the class increases while the probability of examples of other classes being misclassified into it decreases, so both recall and precision, and therefore F1, increase. But exceptions exist. Class label 0 represents the class Recruitment, which has the largest number of rules, so examples of other classes are easily misclassified into it; this is why the precision of Recruitment is the lowest of all classes. Although CWA reduces this influence, it also reduces the recall of the class, because the more rules a class has, the lower its weight becomes; the classification intensity of the class is thus weakened by CWA, and some examples of Recruitment are misclassified into other classes. Therefore, as Fig. 4 shows, the F1 value of Recruitment drops when CWA is added and is the lowest among all classes. Error analysis shows that about 0.4% of the Recruitment examples are misclassified into the class Computer, and some others into Education or Economic. This is a flaw of classification algorithms based on rules. For instance, it is reasonable for an example of Recruitment to contain the words internet, electronic and company, but these three words can just as well form a rule of class Computer, i.e., internet ∧ electronic ∧ company → Computer. When there are enough rules of this kind, the example is misclassified into class Computer, which also reduces the precision of class Computer. Clearly, examples of Recruitment are correlated with examples of Computer, Education and Economic, so misclassification is unavoidable in this situation. A modified rule generation algorithm is needed to further improve classification performance, which is one of our future research directions.

Figure 4. F1 comparison on each class when adding each improvement to CPAR

The text classification approaches KNN [13], SVM [12] and CPAR [3] are compared with ICPAR. Figs. 5, 6 and 7 show the comparisons of Precision, Recall and F1 of the four approaches on each class, respectively. ICPAR gives the highest average performance in each comparison.

Figure 5. Precision comparison of the different approaches on each class

Figure 6. Recall comparison of the different approaches on each class

Figure 7. F1 comparison of the different approaches on each class

Table V shows the results of KNN, SVM, CPAR and ICPAR; ICPAR has the highest classification performance of the four approaches. We can also conclude that combining CPAR with SVM is more effective than CPAR or SVM alone. Because of the weakness of rule-based classification, CPAR is inadequate for examples that satisfy no rule, whereas SVM, based on statistical learning theory, can weight every single feature term in the process of model training; in this regard SVM is superior to CPAR in generalization ability for new instances. Integrating CPAR and SVM in different ways to obtain better classification performance is another direction of our future research.

TABLE V. Comparison of the macro-average performance of the different approaches
                   KNN      SVM      CPAR     ICPAR
Macro_precision    8.5%    87.56%   86.53%   9.85%
Macro_recall       74.%    84.4%    85.30%   9.6%
Macro_F1           74.80%   85.7%    85.%     9.3%

V. CONCLUSIONS

In this paper we have shown that ICPAR achieves higher accuracy than CPAR. The proposed ICPAR has the following distinguishing features: (1) it adjusts the weight of each class with the Class Weighting Adjustment algorithm, which balances the classifying ability of the classes; (2) instead of loading the rules of all classes, Center Vector-based Pre-classification selects and loads only the rules of the classes that are closely related to the example, which reduces the probability of misclassifying the example; (3) Post-processing with SVM is applied to examples satisfying no rules, which CPAR has no way to classify. The proposed algorithm has been shown to be effective. Improving the quality of the extracted rules by exploiting the characteristics of Chinese text, as well as eliminating the rules that interfere with the correct classification result, are directions of our future research.

ACKNOWLEDGMENT

This work is supported by the National High-tech R&D Program of China (863 Program, No. 2007AA01Z194). We also thank the Institute of Computing Technology, Chinese Academy of Sciences, for the Chinese text classification corpus.

REFERENCES

[1] W. Li, J. Han, and J. Pei. CMAR: Accurate and efficient classification based on multiple class-association rules. In Proc. ICDM'01, pp. 369-376, San Jose, CA, Nov. 2001.
[2] B. Liu, W. Hsu, and Y. Ma. Integrating classification and association rule mining. In Proc. KDD'98, New York, NY, Aug. 1998.
[3] X. Yin and J. Han. CPAR: Classification based on predictive association rules. In Proc. SIAM Int. Conf. on Data Mining (SDM'03), 2003.
[4] J. D. M. Rennie and R. Rifkin. Improving multiclass text classification with the support vector machine. MIT AIM-2001-026, October 2001.
[5] J. R. Quinlan and R. M. Cameron-Jones. FOIL: A midterm report. In Proc. 1993 European Conf. on Machine Learning, pp. 3-20, Vienna, Austria, 1993.
[6] Chen and Hu. Text association categorization based on self-adaptive weighting. Journal of Chinese Computer Systems, Vol. 8.
[7] Liu Shao-hui et al. An approach of multi-hierarchy text classification based on vector space model. Journal of Chinese Information Processing, Vol. 6, No. 3.
[8] Y. Yang and J. Pedersen. A comparative study on feature selection in text categorization. In Proc. 14th International Conference on Machine Learning (ICML'97), San Francisco, USA: Morgan Kaufmann Publishers, 1997.
[9] C. Cortes and V. Vapnik. Support-vector networks. Machine Learning, 20(3), 1995.
[10] T. Joachims. Text categorization with support vector machines. In Proc. European Conference on Machine Learning (ECML), 1998.
[11] Songbo Tan et al. A novel refinement approach for text categorization. In Proc. ACM CIKM, 2005.
[12] C.-C. Chang and C.-J. Lin. LIBSVM: A library for support vector machines, 2001. Software available at http://www.csie.ntu.edu.tw/~cjlin/libsvm
[13] Oh-Woog Kwon and Jong-Hyeok Lee. Web page classification based on k-nearest neighbor approach.
