Confidence Estimation Using the Incremental Learning Algorithm, Learn++

Jeffrey Byorick and Robi Polikar

Electrical and Computer Engineering, Rowan University, 136 Rowan Hall, Glassboro, NJ 08028, USA

Abstract. Pattern recognition problems span a broad range of applications, where each application has its own tolerance on classification error. The varying levels of risk associated with many pattern recognition applications indicate the need for an algorithm with the ability to measure its own confidence. In this work, the supervised incremental learning algorithm Learn++ [1], which exploits the synergistic power of an ensemble of classifiers, is further developed to add the capability of assessing its own confidence using a weighted exponential majority voting technique.

1 Introduction

1.1 Incremental Learning

It is widely recognized that the recognition accuracy of a classifier depends heavily on the availability of an adequate and representative training dataset. Acquiring such data is often tedious, time-consuming, and expensive. In practice, it is not uncommon for such data to be acquired in small batches over a period of time. A typical approach in such cases is to combine the new data with all previous data and train a new classifier from scratch. This approach, however, results in the loss of all previously learned knowledge, a phenomenon known as catastrophic forgetting. Furthermore, combining old and new datasets is not even always a viable option if previous datasets are lost, discarded, corrupted, inaccessible, or otherwise unavailable. Incremental learning is the solution to such scenarios: it can be defined as the process of extracting new information from an additional dataset that later becomes available, without losing prior knowledge. Various definitions and interpretations of incremental learning can be found in the literature, including online learning [2,3], relearning of previously misclassified instances [4,5], and growing and pruning of classifier architectures [6,7]. For the purposes of this work, an algorithm possesses incremental learning capabilities if it meets the following criteria: (1) ability to acquire additional knowledge when new datasets are introduced; (2) ability to retain previously learned information; (3) ability to learn new classes if introduced by new data.

O. Kaynak et al. (Eds.): ICANN/ICONIP 2003, LNCS 2714, pp. 181-188, Springer-Verlag Berlin Heidelberg 2003

1.2 Ensemble of Classifiers

Ensemble systems have attracted a great deal of attention over the last decade due to their empirical success over single-classifier systems on a variety of applications. Such systems combine an ensemble of generally weak classifiers to take advantage of the so-called instability of the weak classifier, which causes the classifiers to construct sufficiently different decision boundaries for minor modifications of their training parameters, so that each classifier makes different errors on any given instance. A strategic combination of these classifiers, such as weighted majority voting [8], then eliminates the individual errors, generating a strong classifier. A rich collection of algorithms has been developed using multiple classifiers, such as AdaBoost [9], with the general goal of improving the generalization performance of the classification system. Using multiple classifiers for incremental learning, however, has been largely unexplored. Learn++, in part inspired by AdaBoost, was developed in response to recognizing the potential feasibility of an ensemble of classifiers in solving the incremental learning problem. Learn++ was initially introduced in [1] as an incremental learning algorithm for MLP-type networks. A more versatile form of the algorithm was presented in [10] for all supervised classifiers. We have recently recognized that the inherent voting mechanism of the algorithm can also be used to effectively determine the confidence of the classification system in its own decision. In this work, we describe the algorithm Learn++, along with representative results on incremental learning and confidence estimation obtained on one real-world and one benchmark database from the University of California, Irvine (UCI) machine learning repository [11].

2 Learn++

The Learn++ algorithm, given in Fig. 1, exploits the synergistic power of an ensemble of classifiers to incrementally learn new information that may later become available. Learn++ generates multiple weak classifiers, each trained with a different subset of the data. For each database D_k, k = 1, ..., K that becomes available, the inputs to Learn++ are (i) S_k = {(x_i, y_i) | i = 1, ..., m_k}, a sequence of m_k training data instances x_i along with their correct labels y_i, (ii) a weak classification algorithm BaseClassifier to generate weak classifiers, and (iii) an integer T_k specifying the number of classifiers (hypotheses) to be generated for that database. We require that BaseClassifier obtain at least 50% correct classification performance on its own training dataset, to ensure a meaningful classification performance for each classifier.

Learn++ starts by initializing a set of weights for the training data, w, and a distribution D obtained from w, according to which a training subset TR_t and a test subset TE_t are drawn at the t-th iteration of the algorithm, t = 1, ..., T_k, where S_k = TR_t ∪ TE_t. Unless a priori information indicates otherwise, this distribution is initially set to be uniform, giving each instance equal probability of being selected into the first training subset. The variation of instances within the training data subsets is achieved by iteratively updating the distribution of weights D_t: at each iteration t, the weights adjusted at iteration t-1 are normalized to ensure that a legitimate distribution, D_t, is obtained. TR_t and TE_t are then drawn according to D_t, and BaseClassifier is trained with the training subset.
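To make the subset-selection step concrete, the sketch below shows one possible reading of how TR_t and TE_t could be drawn according to the current distribution D_t. It is a minimal illustration, not the paper's implementation: it assumes NumPy, the helper name draw_subsets and the fixed train_size parameter are hypothetical, and TE_t is taken as the remaining instances so that S_k = TR_t ∪ TE_t.

```python
import numpy as np

def draw_subsets(X, y, D_t, train_size, rng=None):
    """Draw a training subset TR_t according to the distribution D_t and use
    the remaining instances as the test subset TE_t (sketch only)."""
    rng = np.random.default_rng() if rng is None else rng
    m = len(y)
    # Sample instance indices for TR_t with probability proportional to D_t.
    train_idx = rng.choice(m, size=train_size, replace=False, p=D_t)
    test_idx = np.setdiff1d(np.arange(m), train_idx)   # so that S_k = TR_t ∪ TE_t
    return (X[train_idx], y[train_idx]), (X[test_idx], y[test_idx])
```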

A hypothesis h_t is obtained as the t-th classifier, whose error ε_t is computed on the entire (current) database S_k, simply by adding the distribution weights of the misclassified instances:

\varepsilon_t = \sum_{i:\, h_t(x_i) \neq y_i} D_t(i)    (1)

If ε_t > 1/2, h_t is discarded and a new TR_t and TE_t are selected. If the error is less than half, the normalized error is computed as

\beta_t = \varepsilon_t / (1 - \varepsilon_t), \quad 0 \leq \beta_t \leq 1    (2)

Hypotheses generated in all previous iterations are then combined using weighted majority voting to form a composite hypothesis H_t:

H_t = \arg\max_{y \in Y} \sum_{t:\, h_t(x) = y} \log(1/\beta_t)    (3)

where the sum of the weights associated with each classifier is computed for every class present in the classification task. A higher weight is given to classifiers that perform better on their specific training sets. The composite hypothesis H_t assigns to an instance x the class label that receives the largest total vote. The composite error made by H_t is then computed as

E_t = \sum_{i:\, H_t(x_i) \neq y_i} D_t(i) = \sum_{i=1}^{m_k} D_t(i)\, [\![H_t(x_i) \neq y_i]\!]    (4)

where [\![\cdot]\!] evaluates to 1 if the predicate holds true. Similar to the calculation of β_t, a normalized composite error B_t is computed as

B_t = E_t / (1 - E_t), \quad 0 \leq B_t \leq 1    (5)

The weights w_t(i) are then updated to obtain D_{t+1}, which is used for the selection of the next training and testing subsets, TR_{t+1} and TE_{t+1}, respectively. The distribution update rule, which constitutes the heart of the algorithm, is

w_{t+1}(i) = w_t(i) \cdot B_t^{[\![H_t(x_i) = y_i]\!]} = w_t(i) \cdot \begin{cases} B_t, & \text{if } H_t(x_i) = y_i \\ 1, & \text{otherwise} \end{cases}    (6)

This rule reduces the weights of those instances that are correctly classified by the composite hypothesis H_t, so that their probability of being selected into the next training subset is reduced. When normalized during iteration t+1, the weights of misclassified instances are increased relative to the rest of the dataset. We emphasize that, unlike AdaBoost and its variations, the weight update rule in Learn++ looks at the classification output of the composite hypothesis, not that of a specific hypothesis. This weight update procedure forces the algorithm to focus more on instances that have not been properly learned by the ensemble. When Learn++ is learning incrementally, the instances introduced by the new database are precisely those not yet learned by the ensemble.
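As a concrete illustration of Eqs. (3) and (6), the following sketch implements the weighted-majority composite vote and the distribution update rule. It assumes NumPy and represents each hypothesis as a callable returning integer class labels; the function names are illustrative rather than taken from the paper.

```python
import numpy as np

def composite_hypothesis(hypotheses, betas, X, n_classes):
    """Weighted majority vote of Eq. (3): each hypothesis h_t votes for its
    predicted class with weight log(1/beta_t). Assumes every beta_t > 0."""
    votes = np.zeros((len(X), n_classes))
    for h_t, beta_t in zip(hypotheses, betas):
        preds = h_t(X)                                   # integer labels for all of X
        votes[np.arange(len(X)), preds] += np.log(1.0 / beta_t)
    return votes.argmax(axis=1)                          # H_t(x) for every instance

def update_weights(w_t, H_t_preds, y, B_t):
    """Distribution update rule of Eq. (6): weights of instances correctly
    classified by the composite hypothesis H_t are multiplied by B_t."""
    correct = (H_t_preds == y)
    return np.where(correct, w_t * B_t, w_t)             # normalized into D_{t+1} later
```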

After T_k hypotheses are generated for each database D_k, the final hypothesis is obtained by weighted majority voting of all composite hypotheses:

H_{final} = \arg\max_{y \in Y} \sum_{k=1}^{K} \sum_{t:\, H_t(x) = y} \log(1/B_t)    (7)

Input: For each dataset D_k, k = 1, 2, ..., K
  - Sequence of m_k examples S_k = {(x_i, y_i) | i = 1, ..., m_k}
  - Weak learning algorithm BaseClassifier
  - Integer T_k, specifying the number of iterations
Initialize w_1(i) = D_1(i) = 1/m_k, for all i = 1, 2, ..., m_k
Do for each k = 1, 2, ..., K:
  Do for t = 1, 2, ..., T_k:
    1. Set D_t = w_t / \sum_{i=1}^{m_k} w_t(i) so that D_t is a distribution.
    2. Draw training (TR_t) and testing (TE_t) subsets from D_t.
    3. Call BaseClassifier to be trained with TR_t.
    4. Obtain a hypothesis h_t : X -> Y, and calculate the error of h_t,
       \varepsilon_t = \sum_{i:\, h_t(x_i) \neq y_i} D_t(i), on TR_t + TE_t.
       If ε_t > 1/2, discard h_t and go to step 2. Otherwise, compute the normalized
       error as \beta_t = \varepsilon_t / (1 - \varepsilon_t).
    5. Call weighted majority voting and obtain the composite hypothesis
       H_t = \arg\max_{y \in Y} \sum_{t:\, h_t(x) = y} \log(1/\beta_t)
    6. Compute the error of the composite hypothesis
       E_t = \sum_{i:\, H_t(x_i) \neq y_i} D_t(i) = \sum_{i=1}^{m_k} D_t(i)\, [\![H_t(x_i) \neq y_i]\!]
    7. Set B_t = E_t / (1 - E_t), and update the weights:
       w_{t+1}(i) = w_t(i) \cdot B_t^{[\![H_t(x_i) = y_i]\!]}
Call weighted majority voting and output the final hypothesis:
  H_{final}(x) = \arg\max_{y \in Y} \sum_{k=1}^{K} \sum_{t:\, h_t(x) = y} \log(1/\beta_t)

Fig. 1. The Learn++ algorithm
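The steps of Fig. 1 can be strung together into a single training-session loop. The sketch below is a minimal interpretation of the pseudocode, not the authors' implementation: it assumes a scikit-learn style base-classifier factory, reuses the hypothetical helpers draw_subsets, composite_hypothesis and update_weights introduced above, and adds small numerical guards that are not part of the original algorithm.

```python
import numpy as np

def learnpp_session(X, y, T_k, make_base_classifier, n_classes,
                    ensemble, betas, train_size, rng=None):
    """One training session of Fig. 1 for a single dataset S_k (sketch).
    `ensemble` and `betas` hold hypotheses (as predict callables) and their
    normalized errors accumulated over previous sessions; they are extended
    in place so later sessions build on previously acquired knowledge."""
    rng = np.random.default_rng() if rng is None else rng
    m = len(y)
    w = np.full(m, 1.0 / m)                              # uniform initial weights
    t = 0
    while t < T_k:
        D = w / w.sum()                                  # step 1: legitimate distribution
        (X_tr, y_tr), _ = draw_subsets(X, y, D, train_size, rng)   # step 2
        h = make_base_classifier().fit(X_tr, y_tr)                  # step 3
        eps = D[h.predict(X) != y].sum()                 # step 4: error on current S_k
        if eps > 0.5:
            continue                                     # discard h_t, go back to step 2
        beta = max(eps, 1e-10) / (1.0 - eps)             # guard against eps == 0 (not in the paper)
        ensemble.append(h.predict)
        betas.append(beta)
        H = composite_hypothesis(ensemble, betas, X, n_classes)     # step 5
        E = D[H != y].sum()                              # step 6: composite error
        B = max(E, 1e-10) / (1.0 - E)                    # step 7: normalized composite error
        w = update_weights(w, H, y, B)
        t += 1
    return ensemble, betas
```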

3 Confidence Estimation

An intimately related issue is the confidence of the classifier in its decision, with particular interest in whether the confidence of the algorithm improves as new data becomes available. The voting mechanism inherent in Learn++ hints at a practical approach for estimating confidence: decisions made with a vast majority of the votes have better confidence than those made by a slight majority. We have implemented McIver and Friedl's weighted exponential voting based confidence metric [12] with Learn++ as

C_j(x) = P(y = j \mid x) = \frac{e^{F_j(x)}}{\sum_{k=1}^{N} e^{F_k(x)}}, \quad 0 \leq C_j(x) \leq 1    (8)

where C_j(x) is the confidence assigned to instance x when classified as class j, F_j(x) is the total vote associated with the j-th class for the instance x, and N is the total number of classes. The total vote F_j(x) that class j receives for any given instance is computed as

F_j(x) = \sum_{t} \begin{cases} \log(1/\beta_t), & \text{if } h_t(x) = j \\ 0, & \text{otherwise} \end{cases}    (9)

The confidence of the winning class is then considered as the confidence of the algorithm in making the decision with respect to that class. Since C_j(x) is between 0 and 1, the confidences can be translated into linguistic indicators, such as those shown in Table 1. These indicators are adopted and used in tabulating the results.

Table 1. Confidence percentages represented by linguistic indicators

Confidence Percentage Range      Confidence Level
90 ≤ C ≤ 100                     Very High (VH)
80 ≤ C < 90                      High (H)
70 ≤ C < 80                      Medium (M)
60 ≤ C < 70                      Low (L)
C < 60                           Very Low (VL)

Equations (8) and (9) allow Learn++ to determine its own confidence in any classification it makes. The desired outcome of the confidence analysis is to observe a high confidence on correctly classified instances and a low confidence on misclassified instances, so that the low confidence can be used to flag those instances that are being misclassified by the algorithm. A second desired outcome is to observe improved confidences on correctly classified instances and reduced confidences on misclassified instances as new data becomes available, so that the incremental learning ability of the algorithm can be further confirmed.
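Equations (8) and (9) and the mapping of Table 1 translate directly into code. The sketch below, again assuming NumPy and hypotheses stored as predict callables with illustrative function names, computes the per-class votes, the weighted exponential confidence, and the corresponding linguistic indicator.

```python
import numpy as np

def class_votes(ensemble, betas, x, n_classes):
    """Total vote F_j(x) of Eq. (9): each hypothesis h_t adds log(1/beta_t)
    to the class it predicts for instance x."""
    F = np.zeros(n_classes)
    for h_t, beta_t in zip(ensemble, betas):
        j = int(h_t(np.asarray([x]))[0])
        F[j] += np.log(1.0 / beta_t)
    return F

def confidence(F):
    """Weighted exponential (softmax) confidence C_j(x) of Eq. (8)."""
    expF = np.exp(F - F.max())          # subtracting the max leaves Eq. (8) unchanged
    return expF / expF.sum()

def linguistic_level(c):
    """Map a confidence value in [0, 1] to the indicators of Table 1."""
    pct = 100.0 * c
    if pct >= 90: return "Very High (VH)"
    if pct >= 80: return "High (H)"
    if pct >= 70: return "Medium (M)"
    if pct >= 60: return "Low (L)"
    return "Very Low (VL)"

# Example: confidence of the winning class for one instance x
# F = class_votes(ensemble, betas, x, n_classes)
# C = confidence(F)
# winner = int(F.argmax())
# print(winner, C[winner], linguistic_level(C[winner]))
```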

4 Simulation Results on Learn++

Learn++ has been tested on a diverse set of benchmark databases acquired from the UCI Machine Learning Repository, as well as on a few real-world applications, both for incremental learning where new datasets included new classes and for estimating the confidence of Learn++ in its own decisions. The incremental learning results with new classes are presented in [1]. In this paper, we present the results on confidence estimation, using two databases as representative simulations: one benchmark database from UCI and one real-world database on gas sensing.

4.1 Volatile Organic Compound (VOC) Database

The VOC database is a real-world database for the odorant identification problem. The instances are responses of six quartz crystal microbalances to five volatile organic compounds, namely ethanol (ET), octane (OC), toluene (TL), trichloroethylene (TCE), and xylene (XL), constituting a five-class, six-feature database. Three datasets, S1, S2 and S3, each including approximately one third of the entire training data, were provided to Learn++ in three training sessions for incremental learning. The data distribution and the percent classification performance are given in Table 2. The performances listed are on the validation data, TEST, following each training session. Table 3 provides an actual breakdown of correctly classified and misclassified instances falling into each confidence range after each training session. The trends of the confidence estimates after subsequent training sessions are given in Table 4. The desired outcome on the actual confidences is high to very high confidences on correctly classified instances, and low to very low confidences on misclassified instances. The desired outcome on the confidence trends is increasing or steady confidences on correctly classified instances, and decreasing confidences on misclassified instances, as new data is introduced.

Table 2. Data distribution and performance on the VOC database
(columns: Ethanol, Octane, Toluene, TCE, Xylene, Test Perf. (%); rows: S1, S2, S3, TEST; numeric entries not recoverable from the source)

Table 3. Confidence results on the VOC database
(correctly classified and misclassified instances per confidence level VH, H, M, L, VL after S1, S2, S3; numeric entries not recoverable from the source)

Table 4. Confidence trends for the VOC database
(entries for correctly classified instances not recoverable from the source)

                        Increasing/Steady    Decreasing
Correctly Classified    -                    -
Misclassified           9                    16

The performance figures in Table 2 indicate that the algorithm improves its generalization performance as new data becomes available. The improvement is modest, however, as the majority of the new information is already learned in the first training session. Other experiments, where new data introduced new classes, showed a remarkable performance increase, as reported in [1].

Table 3 indicates that the vast majority of correctly classified instances tend to have very high confidences, with continually improved confidences at consecutive training sessions. While a considerable portion of the misclassified instances also had high confidence for this database, the generally desired trends of increasing confidence on correctly classified instances and decreasing confidence on misclassified ones were notable and dominant, as shown in Table 4.

4.2 Glass Database

The glass database, retrieved from the UCI repository, is a 10-feature, 6-class database with samples of glass from buildings, vehicles, containers, tableware, and headlamps. The building samples include two types of glass, float processed and non-float processed. This database was also provided incrementally in three training sessions, with each session using one of the datasets S1 ~ S3. The distribution of data, as well as the performance on the validation dataset, is shown in Table 5. The confidence results are shown in Table 6, while the confidence trends are provided in Table 7.

Table 5. Data distribution and generalization performance on the glass database
(columns: Float, Non-Float, Vehicle, Container, Table, Lamp, Test Perf. (%); rows: S1, S2, S3, TEST; numeric entries not recoverable from the source)

Table 6. Confidence results (%) on the glass database
(correctly classified and misclassified instances per confidence level VH, H, M, L, VL after S1, S2, S3; numeric entries not recoverable from the source)

Table 7. Confidence trends for the glass database

                        Increasing/Steady    Decreasing
Correctly classified    63                   1
Misclassified           3                    2

For the glass database, the above-mentioned desirable traits are even more remarkable. The majority of correctly classified instances fell into the very high confidence range, while the misclassified instances fell into the very low confidence range. Positive attributes were also seen in the confidence trends, where the majority of correctly classified instances had an increasing or steady confidence through consecutive training sessions. Furthermore, the incremental learning ability of the algorithm is also demonstrated through the improved generalization performance (from 83% to 93%) on the TEST dataset with the availability of additional data.

5 Discussion and Conclusions

Apart from the incremental learning ability of Learn++, it was found that the algorithm can also assess the confidence of its own decisions. In general, the majority of correctly classified instances had very high confidence estimates, while lower confidence values were associated with misclassified instances. Therefore, classifications with low confidence can be used as a flag to further evaluate those instances. Furthermore, the algorithm also showed increasing confidences in correctly classified instances and decreasing confidences in misclassified instances after subsequent training sessions. This is a very comforting outcome, which further indicates that the algorithm can incrementally acquire new and novel information from additional data. Work is in progress to further test the algorithm's capabilities on a more diverse set of real-world and benchmark databases.

Acknowledgement. This material is based upon work supported by the National Science Foundation under Grant No.

References

[1] R. Polikar, L. Udpa, S. Udpa, and V. Honavar, "Learn++: An incremental learning algorithm for supervised neural networks," IEEE Transactions on Systems, Man, and Cybernetics, Part C, vol. 31, no. 4, November.
[2] P. Winston, "Learning structural descriptions from examples," in The Psychology of Computer Vision, P. Winston (ed.), McGraw-Hill: New York, NY.
[3] N. Littlestone, "Learning quickly when irrelevant attributes abound," Machine Learning, vol. 2.
[4] P. Jantke, "Types of incremental learning," in Training Issues in Incremental Learning, A. Cornuejols (ed.), pp. 26-32, The AAAI Press: Menlo Park, CA.
[5] M. A. Maloof and R. S. Michalski, "Selecting examples for partial memory learning," Machine Learning, vol. 41.
[6] F. S. Osorio and B. Amy, "INSS: A hybrid system for constructive machine learning," Neurocomputing, vol. 28.
[7] J. Ghosh and A. C. Nag, "Knowledge enhancement and reuse with radial basis function networks," Proceedings of International Joint Conference on Neural Networks, vol. 9, no. 2.
[8] N. Littlestone and M. Warmuth, "The weighted majority algorithm," Information and Computation, vol. 108.
[9] Y. Freund and R. Schapire, "A decision-theoretic generalization of on-line learning and an application to boosting," Journal of Computer and System Sciences.
[10] R. Polikar, J. Byorick, S. Krause, A. Marino, and M. Moreton, "Learn++: A classifier independent incremental learning algorithm," Proceedings of International Joint Conference on Neural Networks, May.
[11] C. L. Blake and C. J. Merz (1998). UCI repository of machine learning databases. Dept. of Information and Computer Science, University of California, Irvine. [Online]. Available: http://
[12] D. McIver and M. Friedl, "Estimating pixel-scale land cover classification confidence using nonparametric machine learning methods," IEEE Transactions on Geoscience and Remote Sensing, vol. 39, no. 9, September 2001.
