Constrained-Storage Variable-Branch Neural Tree for Classification


Constrained-Storage Variable-Branch Neural Tree for Classification

Shiueng-Bien Yang
Department of Digital Content Application and Management, Wenzao Ursuline University of Languages, 900 Mintsu 1st Road, Kaohsiung 807, Taiwan. E-mail: 9800@mail.wzu.edu.tw

ABSTRACT
In this study, the constrained-storage variable-branch neural tree (CSVBNT) is proposed for pattern classification. In the CSVBNT, each internal node is designed as a single-layer neural network (SLNN) that is used to classify the input samples. A genetic algorithm (GA) is proposed to search for the proper number of output nodes in the output layer of the SLNN. Furthermore, a growing method is proposed to determine which node has the highest priority to split in the CSVBNT under the storage constraint. The growing method selects a node to split in the CSVBNT according to the classification error rate and computing complexity of the CSVBNT. In the experiments, the CSVBNT has a lower classification error rate than other NTs when they have the same computing time.

Key words: Neural trees, neural network, genetic algorithm.

I. INTRODUCTION
Decision trees (DTs) ([1]-[6]) are commonly used techniques in machine learning, recognition and classification systems. The advantage of DTs is that decision nodes are easily designed and have low computing complexity. However, the disadvantage of DTs is that the classification error rate is high when a node is not a pure class. Therefore, large DTs are preferred to reduce the classification error rate, and the DTs then become deep trees with low performance. Neural trees (NTs) ([7]-[11]) provide a solution for combining decision trees and neural networks (NNs), thus offering the advantages of both DTs and NNs. Recently, NTs have been applied to the recognition of character sets [12], human faces [13], range images [14], large pattern sets [15] and complex scenes [16].

Recently, several state-of-the-art NTs have been proposed. A neural tree with multi-layer perceptrons (MLPs) at the internal nodes was proposed by Guo and Gelfand [17]. In [17], the experimental results show that the NT with MLPs allows the model to use a smaller number of nodes and leaves. However, the disadvantage of NTs with MLPs is their increased computing complexity, since a large number of parameters must be tuned and there is a higher risk of overfitting. In [18], the adaptive high-order neural tree (AHNT) is proposed. Its nodes are high-order perceptrons (HOPs) [19] whose order depends on the variation of the global error rate. First-order nodes divide the input space with hyperplanes, while HOPs divide the input space arbitrarily. The drawback of the AHNT is increased complexity and, thus, higher computational cost. In [20], the MLP is used to design the neural network tree (NNTree). Instead of using the information gain ratio as the splitting criterion, a new criterion is introduced for NNTree design. The new criterion captures well the goal of reducing the classification error rate. However, the main drawback is the necessity of an ad hoc definition of some parameters for each context and training set, such as the number of hidden layers and the number of nodes in each layer of the neural networks. Although the NTs with MLPs described above can divide the input space with arbitrary hypersurfaces, the computing complexity of an MLP is several times that of the single-layer neural network (SLNN). In [21], the single-layer perceptron tree was presented. In [22], Sirat and Nadal presented a binary NT structure using SLNNs to solve two-class problems. Foresti and Micheloni [23] proposed a generalized neural tree (GNT), in which each node of the GNT is composed of two parts: the SLNN and the normalizer. The learning rule is calculated by minimizing a cost function which represents a measure of the overall classification error of the GNT. In 2012, the balanced neural tree (BNT) was proposed to reduce tree size and improve classification with respect to the classical neural tree [24]. The SLNN is also used to design each node in the BNT. However, there are two common drawbacks to these state-of-the-art NTs, including the AHNT, NNTree, GNT and BNT. First, the growth strategies of these NTs only consider how to reduce the classification error rate, but do not consider how to reduce the computing complexity. Hence, each NT is designed as a deep and large tree-structured NT to reduce the classification error rate, which results in

the need for greater computing complexity. Therefore, an optimal NT must take into account how to reduce both the classification error rate and the computing complexity when the depth or the number of nodes of the NT is limited. The second drawback of these existing NTs is that the number of output nodes in the neural network (NN) is set either to two (as in the BNT) or to the number of classes to be distinguished (as in the GNT, AHNT and NNTree). Setting the number of output nodes in the output layer of the NN to the number of classes to be distinguished is not always the best strategy for designing the NT. If the number of output nodes in the NN is too small, samples belonging to different classes may be classified to the same output node in the NN, and a narrower and deeper NT will then be generated. Conversely, the computing complexity of the NN is increased when the number of output nodes in the NN is too large. Therefore, finding the proper number of output nodes in the NN is an important research topic in this study. Weight assignment is one of the most important factors during NN design. Basically, the weights are controlled by both the network architecture and the parameters of the learning algorithm. In addition, using several layers and nodes in hidden layers makes the network much more complex. Other NN parameters, such as the inputs, the number of hidden layers and their nodes, the number of memory taps and the learning rates, also affect NN performance. Genetic algorithms (GAs) ([25]-[34]) have been a popular approach to aiding neural network learning. In [30], a GA is applied to the design of both NN structure evolution and weight adaptation. Mahmoudabadi et al. optimized an NN with the Levenberg-Marquardt (LM) algorithm and a GA in order to improve its performance for grade estimation purposes [31]. Samanta et al. applied simulated annealing for NN training [32]. Chatterjee et al. applied GAs to NNs, and showed that this combination can result in better NN performance [33]. Tahmasebi and Hezarkhani used a GA to optimize NN parameters and topology, and obtained improved results [34]. However, the above GAs assume that the number of output nodes in an NN is known and can be set by the user in advance. This study proposes the CSVBNT which, unlike previously applied GAs, is able to automatically generate the proper number of output nodes in an NN. That is, when using the proposed GA, the user does not need to set the number of output nodes of an NN in advance.

The contributions of this study are summarized as follows: (1) This study proposes the CSVBNT. To address the first drawback of existing NTs described above, the CSVBNT is designed to be optimized according to both the classification error rate and the computing complexity of the CSVBNT. To fairly compare the performance of the CSVBNT and other NTs, the storage (i.e. the number of nodes in the NTs) is constrained. Because of the storage limitation, determining which node has the highest priority to split in the CSVBNT is an important design issue. The experiments conducted in this research demonstrate that the CSVBNT has a lower classification error rate than other existing NTs when they have the same computing time. (2) To address the second drawback of the existing NTs described above, a GA is proposed to design the SLNN for each internal node in the CSVBNT. The proposed GA has the ability to automatically generate the weights and determine the proper number of output nodes in the SLNN according to both the classification error rate and the computing complexity of the CSVBNT. Then, the CSVBNT tends to be optimized. The remainder of this paper is organized as follows. The concept design of our proposed methods is described in Section II, and the design of the CSVBNT with the storage constraint is presented in Section III. Section IV presents the design of the GA. The experiments are described in Section V, and Section VI presents the conclusions.

II. CONCEPT DESIGN OF OUR PROPOSED METHODS
In order to understand the entire design of the CSVBNT, the following describes the concept design of the CSVBNT. The CSVBNT is a tree-structured classifier, in which each internal node is designed as an SLNN node, and each leaf node denotes the output space. Notably, the SLNN is a fully connected single-layer NN, and each output node in the output layer of the SLNN represents a branch in the CSVBNT. The two main contributions of the CSVBNT are described below: (1) The growth strategy of the CSVBNT is designed based on both the classification error rate and the computing complexity, and then the CSVBNT tends to be optimized. Figure 1 shows an example of

the performance of two NTs, NT1 and NT2. In Fig. 1, if the depth of the NT is increased, the average classification error rate is reduced and the average computation time is increased. From Fig. 1, although the same classification error rate e1 can be achieved when NT1 and NT2 have sufficient depth, NT1 still has better performance than NT2. This is because the classification error rate of NT1 is smaller than that of NT2 (e2 < e3) when they have the same computing time, t1. That is, NT1 outperforms NT2 when the depth or the number of nodes is limited. This situation means that if the storage space (i.e. the number of internal nodes in the NT) is limited, both the classification error rate and the computing complexity must be used to identify the best NT. Therefore, the growth strategy of the CSVBNT takes into account how to reduce both the classification error rate and the computing complexity, unlike the growth strategies of other NTs, which only emphasize reducing the classification error rate.

Figure 1. An example to illustrate two NTs.

(2) The next contribution of the CSVBNT is that the proposed GA is able to automatically search for the proper number of output nodes for each SLNN according to both the classification error rate and the computing complexity of the CSVBNT. In the existing NTs, the number of output nodes in the output layer of the NN is usually set to the number of classes to be distinguished, which is not always the best growth strategy for designing the NT. Figure 2 shows a simple example to illustrate this situation. Figure 2(a) shows the original data set consisting of three classes of samples, Fig. 2(b) shows the partitions of the data set by the existing NTs, and Fig. 2(c) shows the corresponding tree classifier. Initially, the data set includes the three classes of samples (a, b and c) in Fig. 2(a). Since the data set has three classes of samples, the NN contained in node 1 is then designed to have three

output nodes in Fig. 2(c). Similarly, node 3 in Fig. 2(c) consists of two classes of samples, a and c, and the NN contained in node 3 is designed to have two output nodes to distinguish the two classes of samples. Therefore, an input sample classified as shown in Fig. 2(c) must be calculated with an average of two NNs. However, Fig. 2(d) shows another partition of the data set, and Fig. 2(e) shows the corresponding NT, in which node 1 is designed to have four output nodes. In Fig. 2(e), the classification of the input sample requires an average of 1.7 NNs (< 2 NNs). Thus, the NT shown in Fig. 2(e) has a shorter computing time than the NT shown in Fig. 2(c) when they have the same classification error rate. That is, the NN contained in node 1 tends to be designed with four output nodes instead of three output nodes. Since the proposed GA can find the proper number of output nodes for the SLNN according to both the classification error rate and the computing complexity of the CSVBNT, the CSVBNT becomes a variable-branch tree classifier that tends to be optimized.

(a) Original data set. (b) Partition of the data set by the existing NTs. (c) The existing NTs. (d) Another partition of the data set. (e) Another, better NT.
Figure 2. An example to illustrate the NTs.
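The trade-off illustrated by Fig. 2 can be made concrete with a small calculation. The sketch below (Python; the leaf probabilities and path lengths are illustrative assumptions of ours, chosen only to echo the averages quoted above, and are not taken from the paper) computes the average number of SLNNs evaluated per sample as the probability-weighted number of internal nodes on each root-to-leaf path.

    # Average number of SLNNs evaluated per input sample: the sum over
    # leaves of P(reaching the leaf) times the number of internal nodes
    # on the root-to-leaf path. All numbers below are hypothetical.
    def average_evaluations(leaves):
        """leaves: list of (probability, path_length) pairs."""
        return sum(p * depth for p, depth in leaves)

    # A tree like Fig. 2(c): every sample passes the root SLNN and one
    # second-level SLNN, so two SLNNs are always evaluated.
    tree_c = [(0.5, 2), (0.5, 2)]
    # A tree like Fig. 2(e): a wider root lets 30% of the samples stop
    # after a single SLNN.
    tree_e = [(0.3, 1), (0.7, 2)]

    print(average_evaluations(tree_c))  # 2.0 SLNNs on average
    print(average_evaluations(tree_e))  # 1.7 SLNNs on average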

The following describes the entire design flow of the CSVBNT. In the training phase, the Design_CSVBNT algorithm described in Section III is proposed to design the CSVBNT. Figure 3(a) shows the flow chart of Design_CSVBNT. In Fig. 3(a), the Design_CSVBNT algorithm designs the CSVBNT by employing the GA described in Section IV to design the SLNNs in the CSVBNT. In Design_CSVBNT, the number of internal nodes (SLNN nodes) denotes the storage space required to store the CSVBNT. Design_CSVBNT continues to design the CSVBNT from top to bottom until the storage space limit is reached. After Design_CSVBNT is finished in the training phase, the CSVBNT is obtained. Furthermore, the Design_CSVBNT(recursive) algorithm is proposed to improve the efficiency of Design_CSVBNT in finding the best solution for the GA. Design_CSVBNT(recursive) is a recursive version of Design_CSVBNT, and both use the same GA to design the SLNNs in the CSVBNT. Figure 3(b) presents the flow chart of Design_CSVBNT(recursive). In Fig. 3(b), GA_SLNN is used to employ the GA in a recursive way. In this study, Design_CSVBNT(recursive), rather than Design_CSVBNT, is used to generate the CSVBNT. In the testing phase, the Test_CSVBNT algorithm described in Section III explains how the input sample travels through the CSVBNT and derives the final classification result. Fig. 3(c) shows an example of the CSVBNT. In Fig. 3(c), the internal nodes A, B and C are designed as SLNNs, and the leaf nodes L1, L2, L3, L4, L5 and L6 are used to position the output space. Each output node in an SLNN has a related branch in the CSVBNT. In Fig. 3(c), node A consists of three branches because SLNN A has three output nodes in its output layer. Also, SLNNs B and C consist of two and three output nodes, respectively. Let the input sample be fed to node A and classified to the output node O3 in SLNN A. The input sample then traces the related branch of O3 and reaches node C. Similarly, the input sample continues to be classified in SLNN C, and reaches node L4. Finally, the input sample is positioned in the same class as the arriving leaf node, L4. The details of the testing algorithm, Test_CSVBNT, are described in Section III.

(a) The flow chart of Design_CSVBNT. (b) The flow chart of Design_CSVBNT(recursive). (c) An example of the CSVBNT (internal nodes are SLNN nodes; terminal nodes are leaf nodes).
Figure 3. The concept design of the CSVBNT.

In order to enable readers to easily understand the proposed methods in this study, Table 1 lists a summary of the important symbols used in Sections III, IV, and V.

Table 1. Summary of symbols used in this study.
T : the CSVBNT
SLNN_t : the SLNN designed for the internal node t
V(t) : computing complexity of the leaf node t (Eq. (1))
CC(SLNN_t) : computing complexity of SLNN_t (Eq. (2))
f_t : number of inputs in SLNN_t
r_t : number of output nodes in SLNN_t
H : set of all leaf nodes in the CSVBNT T
P(t) : probability of falling at node t (Eq. (3))
V(T) : computing complexity of the CSVBNT T (Eq. (3))
Class(X_i) : class of the training sample X_i
C_t : the cluster of training samples collected at leaf node t
S_t : center of cluster C_t
m_t : number of training samples contained in the node t
P(C_t, X_i) : probability that X_i is classified to C_t (Eq. (5))
M_t : output-space representative of node t (Eq. (6))
E(t) : classification error rate of node t (Eq. (7))
E(T) : classification error rate of the CSVBNT T (Eq. (8))
λ(t) : slope of the classification error rate and computing complexity for the leaf node t (Eq. (9))
R : storage space (Design_CSVBNT)
i, j, M : variables used to record the range of the solution space (Design_CSVBNT(recursive))
θ : minimum solution-space size (Design_CSVBNT(recursive))
SLNN_L : SLNN generated in the left subspace (Design_CSVBNT(recursive))
SLNN_R : SLNN generated in the right subspace (Design_CSVBNT(recursive))
θ_j : activation threshold of the j-th output node
w_{i,j} : weight between the i-th input node and the j-th output node in the SLNN
O_j : value of the j-th output node in the SLNN (Eq. (11))
N : population size
SLNN_k : SLNN encoded by the k-th string q_k
λ(q_k) : changes in both the computing time and the classification error rate of the CSVBNT after the SLNN for node t is generated by the string q_k (Eq. (13))
ε : random value used to change the weights in the mutation phase (Eq. (15))
v(i, j) : wavelet coefficient at location (i, j) in the subband (Eq. (17))
n : size of the subband (Eq. (16) and Eq. (17))
P_c : crossover rate
P_m : mutation rate

III. DESIGN OF THE CSVBNT
In the design of the CSVBNT, both the computing complexity and the classification error rate of the CSVBNT are emphasized to be as small as possible. Before the design of the CSVBNT is described, both quantities are defined in the following. The computing complexity of the CSVBNT is defined as follows. Let T denote the CSVBNT, and let H denote the set of all leaf nodes in T, including the leaf node t. Let L(t) denote the set of internal nodes that must be calculated when the input sample travels the CSVBNT from the root node to the leaf node t. Then, the computing complexity of the leaf node t, V(t), is defined as

V(t) = \sum_{k \in L(t)} CC(SLNN_k),    (1)

where CC(SLNN_k) denotes the computing complexity of the SLNN for the internal node k. Let SLNN_k contain f_k inputs and r_k output nodes. Then CC(SLNN_k) is defined as

CC(SLNN_k) = f_k r_k.    (2)

Let P(t) represent the probability of the training samples falling in the leaf node t \in H. The computing complexity of T, V(T), is defined as

V(T) = \sum_{t \in H} P(t) V(t).    (3)

The classification error rate of the CSVBNT is described as follows. Let the leaf node t contain m_t training samples, X_i = (x_{i1}, x_{i2}, ..., x_{if}), for 1 \le i \le m_t, and let Class(X_i) denote the class of the training sample X_i. Let C_t = {(X_i, Class(X_i)) | 1 \le i \le m_t} be the cluster that collects the training samples contained in the leaf node t. Let S_t be the center of cluster C_t, defined as

S_t = \frac{1}{m_t} \sum_{i=1}^{m_t} X_i.    (4)

Thus, the probability that X_i is classified to C_t, P(C_t, X_i), is defined as

P(C_t, X_i) = \frac{1 / \| S_t - X_i \|}{\sum_{h \in H} 1 / \| S_h - X_i \|},    (5)

where \| \cdot \| denotes the Euclidean distance. Notably, P(C_t, X_i) satisfies the constraints 0 \le P(C_t, X_i) \le 1 and \sum_{h \in H} P(C_h, X_i) = 1. Then, the representative of t positioned in the output space is given as

M_t = \frac{\sum_{(X_i, Class(X_i)) \in C_t} P(C_t, X_i) \, Class(X_i)}{\sum_{(X_i, Class(X_i)) \in C_t} P(C_t, X_i)}.    (6)

The classification error rate of the leaf node t, E(t), is thus defined as

E(t) = \sum_{(X_i, Class(X_i)) \in C_t} P(C_t, X_i) (Class(X_i) - M_t)^2.    (7)

Finally, the classification error rate of T, E(T), is defined as

E(T) = \sum_{t \in H} P(t) E(t).    (8)

In the design of the CSVBNT, how to grow the tree structure of the CSVBNT is described as follows. The growing method of the CSVBNT selects the best leaf node to split, one at a time. That is, the GA described in Section IV attempts to split each leaf node, and finds the best leaf node to branch in the CSVBNT. The following explains how to determine which one is the best leaf node. Let the leaf node t contain m_t training samples. If the leaf node t is to be split in the CSVBNT, the GA is applied to these m_t samples to design an SLNN for the node t. Each output node in the output layer of the SLNN denotes a corresponding child node of node t. Let the tree T_t indicate the tree T after the r_t child nodes, t_1, t_2, ..., t_{r_t}, of t are generated by the GA. Then, the slope of the classification error rate and

computing complexity for the leaf node t, between T and T_t, is:

\lambda(t) = \frac{E(T) - E(T_t)}{V(T_t) - V(T)}.    (9)

In the growing method of the CSVBNT, the best leaf node is defined as the node with the largest value in Eq. (9). Therefore, let the set H consist of the leaf nodes in T. After each leaf node in H is designed as an SLNN, we can obtain

u = \arg\max_{t \in H} \lambda(t).    (10)

Then, the node u is the best leaf node, with the highest priority to be split in T. Figure 4 shows an example to illustrate the growing method of the CSVBNT. In Fig. 4, T_1 denotes the CSVBNT after the leaf node t_1 is split in T, and T_2 denotes the CSVBNT after the leaf node t_2 is split in T. The growing method selects one node at a time to split in T. From Fig. 4, T_1 is better than T_2 because the classification error rate of T_1 is smaller than that of T_2 when the computing time is the same. Thus, the aim of the design of the CSVBNT is to maximize the slope of the classification error rate and computing complexity.

Figure 4. The relation between the classification error rate and computing time in the CSVBNT.
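The definitions in Eqs. (1)-(10) translate directly into code. The following Python sketch is ours, not the authors' implementation: it assumes each leaf is described by its reaching probability P(t), the (f_k, r_k) sizes of the SLNNs on its root path, and numeric class labels, and it uses numpy for the distances. The best leaf of Eq. (10) is then simply the one maximizing slope.

    import numpy as np

    def leaf_complexity(path_sizes):
        # Eqs. (1)-(2): V(t) = sum of f_k * r_k over the SLNNs on the
        # path from the root to the leaf t.
        return sum(f * r for f, r in path_sizes)

    def tree_complexity(leaves):
        # Eq. (3): V(T) = sum over leaves of P(t) * V(t).
        return sum(lf["P"] * leaf_complexity(lf["path_sizes"]) for lf in leaves)

    def membership(centers, x):
        # Eq. (5): P(C_t, X_i) is proportional to 1 / ||S_t - X_i||.
        w = np.array([1.0 / np.linalg.norm(c - x) for c in centers])
        return w / w.sum()

    def leaf_error(samples, labels, centers, t):
        # Eq. (6): M_t is the membership-weighted mean class label;
        # Eq. (7): E(t) is the weighted squared deviation from M_t.
        p = np.array([membership(centers, x)[t] for x in samples])
        m_t = (p * labels).sum() / p.sum()
        return (p * (labels - m_t) ** 2).sum()

    def slope(E_T, E_Tt, V_T, V_Tt):
        # Eq. (9): error reduction per unit of added complexity.
        return (E_T - E_Tt) / (V_Tt - V_T)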

The training and testing phases of the CSVBNT are described below.
Training Phase: In the training phase, two algorithms, Design_CSVBNT and Design_CSVBNT(recursive), are proposed to design the CSVBNT. Design_CSVBNT(recursive) is a recursive version of Design_CSVBNT. Before the design of Design_CSVBNT(recursive) is described, Design_CSVBNT is given as follows. Figure 3(a) also shows the flow chart of Design_CSVBNT. Notably, in Design_CSVBNT, the storage space R is defined as the number of internal nodes of the CSVBNT.

Algorithm: Design_CSVBNT
Input: The storage space R and the training data set.
Output: The CSVBNT with the storage constraint.
Step 1. Let the root node, t, of tree T contain all of the training samples in the training data set. Set STORAGE = 0 and H = {t}.
Step 2. While STORAGE < R:
Step 2.1. For each node t \in H with E(t) \ne 0, perform the following.
Step 2.1.1. The GA described in Section IV is applied to the samples contained in the node t, and the SLNN_t is produced for node t.
Step 2.1.2. Let SLNN_t contain r_t output nodes in the output layer. Each output node in SLNN_t denotes a child node of node t. Calculate the value of λ(t) as in Eq. (9).
Step 2.2. Calculate u = arg max_{t \in H} λ(t) as in Eq. (10). The node u is the best leaf node, with the highest priority to be designed as an SLNN node, and the new CSVBNT, T_u, is produced. Let there be w child nodes for node u. Delete the node u from the set H and add these w new nodes to the set H.
Step 2.3. Set T = T_u and STORAGE = STORAGE + 1.
Step 3. Output the CSVBNT, T.

In Step 1, the root node contains all the training samples. The set H represents the collection of all nodes that can be branched in the CSVBNT. Initially, H contains only the root node, t. In the first loop of Step 2, the set H = {t} contains only the root node t, which is then selected to be designed as an SLNN node by the GA in Step 2.1. In Step 2.2, assume that

the root node t is split into three child nodes, t_1, t_2 and t_3. Then, the set H = {t_1, t_2, t_3} is obtained. In Step 2.3, the CSVBNT becomes a tree, T_1, that contains only one root node and three leaf nodes, and the STORAGE value is increased to one. Figure 5(a) shows the CSVBNT, T_1, after the first loop of Step 2 is complete. In the second loop of Step 2, the GA is applied to the nodes t_1, t_2 and t_3 contained in H; then three SLNNs and three values, λ(t_1), λ(t_2) and λ(t_3), are produced in Step 2.1. In Step 2.2, the best node is the one with the maximal value of λ. Assume that the value of λ(t_1) is larger than both λ(t_2) and λ(t_3). Node t_1 is then selected to be designed as the SLNN built into the CSVBNT. Assume that node t_1 obtains two child nodes, t_11 and t_12. Then, the set H is updated to {t_11, t_12, t_2, t_3}. In Step 2.3, the STORAGE value is increased to two, and the new CSVBNT tree, T_2, replaces tree T_1. Figure 5(b) shows the CSVBNT, T_2, after the second loop of Step 2 is complete. In each loop of Step 2, the best node is selected from the set H to be designed as an SLNN node in the CSVBNT, and this step is repeated until the storage space limit is reached.

(a) T_1. (b) T_2.
Figure 5. An example illustrating Design_CSVBNT (internal nodes are SLNN nodes; terminal nodes are leaf nodes).
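The walkthrough above condenses into a short control-flow sketch of Design_CSVBNT. This is an outline under our own naming, not the authors' code: design_slnn_with_ga stands for the GA of Section IV, error_rate for E(t) of Eq. (7), lam for the slope of Eq. (9), and split for replacing a leaf by an SLNN node with its children.

    def design_csvbnt(train_set, R):
        root = make_leaf(train_set)          # Step 1: root holds all samples
        tree, H, storage = Tree(root), {root}, 0
        while storage < R:                   # Step 2
            candidates = []
            for t in [t for t in H if error_rate(t) > 0]:         # Step 2.1
                slnn = design_slnn_with_ga(t)                     # Step 2.1.1
                candidates.append((lam(tree, t, slnn), t, slnn))  # Eq. (9)
            if not candidates:               # every remaining leaf is pure
                break
            _, u, slnn_u = max(candidates, key=lambda c: c[0])    # Eq. (10)
            children = split(tree, u, slnn_u)    # u becomes an SLNN node
            H.remove(u)                      # Step 2.2
            H.update(children)
            storage += 1                     # Step 2.3
        return tree                          # Step 3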

In this study, Design_CSVBNT(recursive) is also proposed to improve the efficiency of Design_CSVBNT. The design of Design_CSVBNT(recursive) is described as follows. Figure 6 shows the difference in calling the GA to find the best SLNN between Design_CSVBNT and Design_CSVBNT(recursive). In Step 2.1.1 of the Design_CSVBNT algorithm, the number of samples m_t contained in node t indicates that the number of output nodes in the SLNN can be up to m_t. If the value of m_t is large, the GA must spend more time searching for the proper number of output nodes in the SLNN among m_t possible solutions, as shown in Fig. 6(a). To avoid the GA having to search a large solution space, the overall solution space can be divided into two or more smaller subspaces in a recursive way, and the GA can then efficiently find the best solution in the smaller subspaces. Therefore, the Design_CSVBNT algorithm can be rewritten as a recursive version, namely Design_CSVBNT(recursive), which uses the divide-and-conquer strategy to design the SLNN by calling the recursive algorithm GA_SLNN, as shown in Fig. 6(b). Figure 3(b) also shows the flow chart of Design_CSVBNT(recursive).

(a) Design_CSVBNT. (b) Design_CSVBNT(recursive).
Figure 6. The difference in calling the GA to find the best SLNN between Design_CSVBNT and Design_CSVBNT(recursive) (the arrows in the figure denote calls and returns).

The Design_CSVBNT(recursive) algorithm is described as follows.
Algorithm: Design_CSVBNT(recursive)
Input: The storage space R and the training data set.
Output: The CSVBNT with the storage constraint.

Step 1. Let the root node, t, of tree T contain all of the training samples in the training data set. Set STORAGE = 0 and H = {t}.
Step 2. While STORAGE < R:
Step 2.1. For each node t in H with E(t) \ne 0, perform the following.
Step 2.1.1. Let m_t be the number of samples contained in the node t. Call GA_SLNN(i = 2, j = m_t) to obtain the SLNN for the leaf node t.
Step 2.1.2. Let the SLNN contain r_t output nodes in the output layer. Each output node in the SLNN denotes a child node of the node t. Calculate the value of λ(t) as in Eq. (9).
Step 2.2. Calculate u = arg max_{t \in H} λ(t) as in Eq. (10). The node u is the best leaf node, with the highest priority to be designed as an SLNN node, and the new CSVBNT, T_u, is produced. Let there be w child nodes for the node u. Delete the node u from the set H and add these w new nodes to the set H.
Step 2.3. Set T = T_u and STORAGE = STORAGE + 1.
Step 3. Output the CSVBNT, T.

Algorithm: GA_SLNN(i, j)
Step 1. If j − i ≤ θ, then the GA is applied to the m_t samples contained in the leaf node t to design the SLNN whose number of output nodes in the output layer lies within the range [i, j]. Set Best_SLNN = the resulting SLNN and Best_F = the value of the best fitness. Return both Best_SLNN and Best_F. End.
Step 2. If j − i > θ, do the following.
Step 2.1. Calculate the midpoint M of the range [i, j].

Step 2.2. Call GA_SLNN(i, M − 1). Then, both SLNN_L and Best_F_L are obtained.
Step 2.3. Call GA_SLNN(M, j). Then, both SLNN_R and Best_F_R are obtained.
Step 3. If Best_F_R > Best_F_L, then set Best_SLNN = SLNN_R and Best_F = Best_F_R. Otherwise, set Best_SLNN = SLNN_L and Best_F = Best_F_L. Return both Best_SLNN and Best_F. End.

In Step 2.1.1 of Design_CSVBNT(recursive), GA_SLNN(i = 2, j = m_t) is a recursive algorithm. After calling GA_SLNN(i = 2, j = m_t), the SLNN with the number of output nodes within the range [i = 2, j = m_t] is generated by the GA. Notably, two classes (i = 2) is the minimum number of classes that need to be distinguished, and the maximal number of output nodes in the SLNN is equal to m_t (j = m_t), such that each sample is regarded as the only member of its own class. Therefore, the SLNN generated by the GA still tends to be optimal, because the GA performs a global search within the range [2, m_t]. In Step 1 of GA_SLNN, if the range of the solution space, [i, j], is small, the GA is directly used to design the SLNN with the number of output nodes in the output layer within the range [i, j]. Then, GA_SLNN returns both the best SLNN and the maximal fitness value generated by the GA. In Step 2 of GA_SLNN, if the range of the solution space, [i, j], is large, the solution space is divided into two subspaces, [i, M − 1] and [M, j], in a recursive way. That is, the GA is used to design the SLNNs in these two subspaces, [i, M − 1] and [M, j], instead of in the whole solution space, [i, j]. In Step 3, GA_SLNN returns the two variables, Best_SLNN and Best_F, recording the best SLNN and the maximal fitness value generated by the GA in these two subspaces, respectively.
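GA_SLNN transcribes into a few lines of Python. In the sketch below, run_ga(samples, lo, hi) stands for a single GA run (Section IV) restricted to output-node counts in [lo, hi] and returning a (best_slnn, best_fitness) pair; it is passed in as a parameter, and all names are ours.

    def ga_slnn(samples, i, j, theta, run_ga):
        # Step 1: if the range is small enough, run the GA directly.
        if j - i <= theta:
            return run_ga(samples, i, j)      # (Best_SLNN, Best_F)
        # Step 2: otherwise split [i, j] at the midpoint and recurse.
        M = (i + j) // 2
        slnn_L, F_L = ga_slnn(samples, i, M - 1, theta, run_ga)  # Step 2.2
        slnn_R, F_R = ga_slnn(samples, M, j, theta, run_ga)      # Step 2.3
        # Step 3: keep whichever subspace produced the higher fitness.
        return (slnn_R, F_R) if F_R > F_L else (slnn_L, F_L)

    # Called as in Step 2.1.1: ga_slnn(samples, 2, len(samples), theta, run_ga)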

Notably, in GA_SLNN, the threshold θ is not a critical value for users. If the threshold θ is small, the overall solution space is divided into smaller (and more numerous) subspaces, and the GA can then efficiently find the best solution in the smaller subspaces. Otherwise, the solution space is cut into larger (and fewer) subspaces. However, if the threshold θ is large enough, only Step 1 is ever performed in GA_SLNN, and Design_CSVBNT(recursive) and Design_CSVBNT become the same algorithm. After the training phase is finished, the CSVBNT is obtained. In the CSVBNT, each leaf node is used to position the output class of the input samples. However, many different classes of training samples may be classified to the same leaf node in the training phase. The output class representing a leaf node is therefore defined as the class to which the largest number of training samples in the leaf node belongs.

Testing Phase: The testing algorithm, Test_CSVBNT, is proposed to classify an input sample in the CSVBNT. Before Test_CSVBNT is given, the following describes how to classify an input sample in the SLNN of an internal node t. Let the SLNN contain f inputs and r output nodes. The values θ_j, for 0 < θ_j < 1 and 1 ≤ j ≤ r, denote the activation thresholds. The values w_{i,j}, for 0 < w_{i,j} < 1, 1 ≤ i ≤ f and 1 ≤ j ≤ r, indicate the weights between the input and output layers. Let the sample X = (x_1, x_2, ..., x_f) be the input to the SLNN. In the SLNN, the value of the j-th output node, O_j, is the sum of its weighted inputs minus its activation threshold:

O_j = \sum_{i=1}^{f} w_{i,j} x_i - \theta_j, for 1 ≤ j ≤ r.    (11)

Let

u = \arg\max \{ O_j \mid 1 \le j \le r \}.    (12)

The input sample, X, is then classified to the u-th output node in the SLNN.
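Eqs. (11)-(12), together with the descent performed by Test_CSVBNT below, amount to a few lines of code. The sketch assumes numpy and a weight layout with one column per output node (W of shape (f, r)); the attribute names on the tree nodes are our own.

    import numpy as np

    def slnn_classify(W, theta, x):
        # Eq. (11): O_j = sum_i w_ij * x_i - theta_j, one value per
        # output node; Eq. (12): choose the output node with largest O_j.
        O = W.T @ x - theta              # W: (f, r), theta: (r,), x: (f,)
        return int(np.argmax(O))

    def test_csvbnt(tree, x):
        t = tree.root                    # Step 1
        while not t.is_leaf:             # Step 2
            u = slnn_classify(t.W, t.theta, x)
            t = t.children[u]            # follow the u-th branch
        return t.representative_class    # Step 3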

The Test_CSVBNT algorithm is given as follows.
Algorithm: Test_CSVBNT
Input: The input sample X = (x_1, x_2, ..., x_f) and the tree classifier, CSVBNT.
Output: The class of X.
Step 1. Let t be the root node of the CSVBNT.
Step 2. While t is not a leaf node:
Step 2.1. The input sample X is fed to SLNN_t. Calculate the values O_j, for 1 ≤ j ≤ r_t, as in Eq. (11).
Step 2.2. Calculate u = arg max{O_j | 1 ≤ j ≤ r_t} as in Eq. (12).
Step 2.3. The input sample X is classified to the u-th output node in SLNN_t and travels toward the corresponding child node, t_u, of the node t in the CSVBNT. Set t = t_u.
Step 3. Output the representative class of node t.

IV. DESIGN OF THE GA
There are two design issues for the GA. First, the GA searches for the weights in the SLNN. Second, the GA can automatically search for the proper number of output nodes in the output layer of the SLNN based on the classification error rate and computing complexity of the CSVBNT. Let T be the CSVBNT, and let t be the node that contains m_t input samples in T. The main goal of the GA is to design the SLNN for node t. Each output node in the output layer of the SLNN becomes a corresponding child node of node t in the CSVBNT. The GA includes the initialization step and three phases: reproduction, crossover, and mutation. They are described as follows.

Initialization:
In the initialization step of the GA, a population of N strings, q_1, q_2, ..., q_N, is randomly generated. The length of each string is set to be smaller than, or equal to, the value of m_t; i.e. the length of each string is variable in the GA. That is, the number of child nodes of node t generated by the GA lies within the range [2, m_t]. Let the string q_k encode the SLNN_k, which has f inputs and r_k output nodes, for 1 ≤ k ≤ N. The values θ_j denote the activation thresholds, for 0 < θ_j < 1 and 1 ≤ j ≤ r_k. The values w_{i,j} indicate the weights between the input and output layers, for 0 < w_{i,j} < 1, 1 ≤ i ≤ f and 1 ≤ j ≤ r_k. The string q_k then encodes the SLNN_k as follows:

q_k = (O_node(1), O_node(2), ..., O_node(r_k)), where O_node(j) = (θ_j, w_{1,j}, ..., w_{f,j}).

The following is an example of the initialization step. Let there be two elements (f = 2) in each input sample, and let r_k = 3 be a random value generated within [2, m_t]. q_k then encodes the solution as follows:

q_k = (θ_1, w_{1,1}, w_{2,1}, θ_2, w_{1,2}, w_{2,2}, θ_3, w_{1,3}, w_{2,3}) = (O_node(1), O_node(2), O_node(3)),

where O_node(1) = (θ_1, w_{1,1}, w_{2,1}), O_node(2) = (θ_2, w_{1,2}, w_{2,2}), and O_node(3) = (θ_3, w_{1,3}, w_{2,3}). Figure 7 shows the corresponding SLNN_k encoded by the string q_k.

Figure 7. The corresponding SLNN_k encoded by the string q_k.
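The variable-length encoding is easy to mirror in code. The sketch below (with names of our choosing) represents a string as a list of r_k blocks, each block holding (θ_j, w_{1,j}, ..., w_{f,j}) with values drawn from (0, 1).

    import random

    def random_string(f, m_t):
        # One chromosome: r output-node blocks, with r drawn from [2, m_t].
        r = random.randint(2, m_t)
        # Each block is (theta_j, w_{1,j}, ..., w_{f,j}) in (0, 1).
        return [[random.random() for _ in range(f + 1)] for _ in range(r)]

    def init_population(N, f, m_t):
        return [random_string(f, m_t) for _ in range(N)]

    # Example: f = 2 features, at most m_t = 10 output nodes.
    population = init_population(300, 2, 10)
    print(len(population[0]))   # r_k, the branch count encoded by string 1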

Reproduction: In the reproduction phase, the definition of fitness depends on both the classification error rate and the computing complexity of the CSVBNT. The following describes the definition of fitness for the string q_k. Before the fitness of string q_k is defined, the following describes how to classify a training sample in SLNN_k. Let X = (x_1, x_2, ..., x_f) be a sample contained in node t. In SLNN_k, the values O_j, for 1 ≤ j ≤ r_k, are calculated according to Eq. (11), and the value u = arg max{O_j | 1 ≤ j ≤ r_k} is calculated via Eq. (12). Sample X is then classified to the u-th output node in SLNN_k. After all of the m_t training samples contained in node t have been classified to these r_k output nodes, the training samples classified to the same output node are collected to form the child nodes of t. That is, r_k child nodes, t_1, ..., t_{r_k}, of node t are generated based on these r_k output nodes in SLNN_k. Let the tree T_k indicate the tree T after the r_k child nodes of node t have been generated. The fitness function of string q_k is then defined as:

Fitness(q_k) = λ(q_k), for 1 ≤ k ≤ N,    (13)

where λ(q_k) is defined as in Eq. (9), representing the changes in both the computing time and the classification error rate of the CSVBNT. After the fitness values of these N strings, q_1, q_2, ..., q_N, in the population have been obtained, the probability of each string being selected for the next generation is calculated as follows:

Prob(q_k) = \frac{Fitness(q_k)}{\sum_{l=1}^{N} Fitness(q_l)}, for 1 ≤ k ≤ N.    (14)

Notably, Prob(q_k) satisfies 0 ≤ Prob(q_k) ≤ 1 for 1 ≤ k ≤ N, and \sum_{k=1}^{N} Prob(q_k) = 1. In the reproduction phase, the reproduction operator selects N strings as the new population in the next generation according

to the probability values Prob(q_k), for 1 ≤ k ≤ N. The following shows an example of the reproduction phase. Let there be five probability values, Prob(q_1) = 0.2, Prob(q_2) = 0.15, Prob(q_3) = 0.1, Prob(q_4) = 0.15 and Prob(q_5) = 0.4, for the five strings q_k, 1 ≤ k ≤ 5, in the population. Figure 8 shows the distribution of these five probability values in the range [0, 1]. In the reproduction phase, the reproduction operator generates five random values within [0, 1], such as 0.15, 0.3, 0.5, 0.7 and 0.9, to determine which strings are selected for the next generation. From Fig. 8, the reproduction operator selects q_1, q_2, q_4, q_5 and q_5 to be the new population in the next generation. Notably, the string q_5 is selected twice and q_3 is omitted. The meaning of the reproduction phase is that a string with higher fitness has a greater probability of being selected repeatedly, while strings with lower fitness may be removed from the next generation.

Prob(q_1) = 0.2, Prob(q_2) = 0.15, Prob(q_3) = 0.1, Prob(q_4) = 0.15, Prob(q_5) = 0.4.
Figure 8. The distribution of the five probability values in the range [0, 1].

Crossover: In the crossover phase, the crossover operator is applied to the population of N strings. A pair of strings, q_x and q_y, is selected for the crossover operation. Then, two random integers, a and b, are generated to determine which pieces of the strings are to be interchanged. Notably, if the number of output nodes contained in a string falls outside the range [2, m_t] after the crossover operation is completed, the two values a and b are randomly generated again. After the crossover operation is finished, two new strings, q̂_x and q̂_y, replace the strings q_x and q_y in the population. The significance of the crossover phase is that it exchanges the output nodes, including the connected weights, between different strings, to yield various neural networks. The following shows an example of the crossover phase. Let there be two elements (f = 2) in each sample. Let q_1 contain two output nodes, and let q_2 contain three output nodes. Then, after the

crossover operation with the two random integers a = 1 and b = 2 is applied to the pair of strings q_1 and q_2, two new strings, q̂_1 and q̂_2, are generated in the next generation.

q_1 = (O_node_1(1), O_node_1(2)), with cut point a = 1, where O_node_1(1) = (θ_1, w_{1,1}, w_{2,1}) and O_node_1(2) = (θ_2, w_{1,2}, w_{2,2});
q_2 = (O_node_2(1), O_node_2(2), O_node_2(3)), with cut point b = 2, where O_node_2(1) = (θ_1, w_{1,1}, w_{2,1}), O_node_2(2) = (θ_2, w_{1,2}, w_{2,2}) and O_node_2(3) = (θ_3, w_{1,3}, w_{2,3}).

After the crossover operation is applied to q_1 and q_2, the two new strings, q̂_1 and q̂_2, are generated as follows:

q̂_1 = (O_node_2(1), O_node_2(2), O_node_1(2)),
q̂_2 = (O_node_1(1), O_node_2(3)).

Mutation: In the mutation phase, weights of the strings in the population are randomly chosen with a probability. Each chosen weight is changed by adding or subtracting the product of the chosen weight and a random value ε (0 < ε < 1). The following shows an example of the mutation phase. Let q_k be presented as follows:

q_k = (O_node(1), O_node(2)), where O_node(1) = (θ_1, w_{1,1}, w_{2,1}) and O_node(2) = (θ_2, w_{1,2}, w_{2,2}).

If the weight w_{1,2} is chosen for mutation, the new weight w'_{1,2} replaces the weight w_{1,2} as follows:

w'_{1,2} = w_{1,2} \pm w_{1,2} \, \varepsilon,    (15)

where ε is a random value within the range [0, 1]. After the mutation phase, the new string is obtained and replaces the original string. The user may specify the number of generations over which the GA is run. Suppose that the string q̂_x with the best fitness generates the SLNN with r̂_x output nodes. Then, r̂_x child nodes of node t are generated in T, according to these r̂_x output nodes contained in the SLNN.
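Taken together, the three phases make up one GA generation. The sketch below is our hedged reading of the text above, with helper names of our own: the fitness values are those of Eq. (13), selection follows Eq. (14), the crossover swaps whole output-node blocks around the cut points a and b as in the example, and mutation applies Eq. (15) to the weights only.

    import random

    def select(population, fitness):
        # Reproduction, Eq. (14): roulette-wheel selection in proportion
        # to fitness (random.choices normalizes the weights itself).
        return random.choices(population, weights=fitness, k=len(population))

    def crossover(q1, q2, m_t):
        # Swap the block tails after random cut points a and b; re-draw
        # the cuts until both children stay within [2, m_t] output nodes.
        while True:
            a = random.randint(1, len(q1) - 1)
            b = random.randint(1, len(q2) - 1)
            c1, c2 = q2[:b] + q1[a:], q1[:a] + q2[b:]
            if all(2 <= len(c) <= m_t for c in (c1, c2)):
                return c1, c2

    def mutate(q, P_m):
        # Eq. (15): w' = w +/- w * eps, applied to weights, not thresholds.
        for block in q:
            for i in range(1, len(block)):      # index 0 holds theta_j
                if random.random() < P_m:
                    eps = random.random()
                    block[i] += random.choice((-1, 1)) * block[i] * eps
        return q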

V. EXPERIMENTS
A. Experimental setting
In the experiments, the Design_CSVBNT(recursive) algorithm is used to design the CSVBNT with the storage constraint. The proposed CSVBNT is compared with other NTs under the same number of internal nodes. Twenty runs are carried out to report the results of the CSVBNT and the other NTs accurately. In our proposed methods, the threshold θ = m_t/10 is applied to GA_SLNN, since one-tenth of the size of the overall solution space is sufficiently small for the GA to find the best solution efficiently. The parameters used in the GA are as follows: a population of 300, a crossover rate Pc = 80%, and a mutation rate Pm = 5%. Five hundred generations are run in the GA, and the best solution is retained. All the experiments are carried out on personal computers. Four data sets, speech, traffic sign images, natural images, and chessboard, are used to test the CSVBNT and the other NTs in the experiments. These four data sets are described as follows. (1) In the speech data set, the ISOLET database covering the 26 letters of the English alphabet is used in the isolated word recognition test. The speech data set consists of 6240 utterances, including 4000 training utterances and 2240 testing utterances, from 120 speakers. Each utterance is sampled at 16 kHz with 16-bit resolution. A Hamming window of 20 ms with 50% overlap is used to process each utterance further by the Fast Fourier Transform (FFT). Each utterance is divided into 15 Hamming windows, each represented by 32 FFT coefficients; that is, each utterance consists of 480 features. (2) The traffic sign images data set is obtained from the GTSRB database [35]. The training set consists of 15000 images, and another 15000 images are used to test the methods in our experiments. All images belong to forty classes. The actual traffic sign is not always centered within the image; its bounding box is part of the annotations. We crop all the images, process them only within the bounding box, and resize them to achieve square bounding boxes. All traffic sign images are resized to 48x48 pixels. (3) The natural images of 32x32 pixels are obtained from the CIFAR10 database [36]. The CIFAR10 database consists of ten classes of images, each with 5000 training images and 1000 testing images. Images vary greatly within each class. They are not necessarily centered, may contain only parts of the object, and show different backgrounds. (4) The symmetrically distributed four-class chessboard data set is used to test the CSVBNT and the other current methods. Figure 10(a) shows the chessboard data set, which consists of 400 patterns equally distributed among four classes. A five-fold cross-validation is performed, and average results are presented. In both the traffic sign and natural images data sets, all color images are first transformed into gray images in the experiments. Then, each image is divided into blocks of 16x16 pixels. Each block is then transformed by a Haar wavelet transform [37] to obtain four subbands. The mean value (mv) and standard deviation (sd) of each of the four subbands are calculated as follows:

mv = \frac{1}{n^2} \sum_{i=1}^{n} \sum_{j=1}^{n} v(i, j),    (16)

sd = \sqrt{ \frac{1}{n^2} \sum_{i=1}^{n} \sum_{j=1}^{n} (v(i, j) - mv)^2 },    (17)

where n denotes the size of the subband, which is set to 8 in this experiment, and v(i, j) denotes the wavelet coefficient at location (i, j) in the subband. Therefore, each block, which contains four subbands, can be represented by a feature vector with eight values, since each subband is associated with two values, sd and mv.
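Eqs. (16)-(17) yield two statistics per subband, so a 16x16 block maps to an eight-value feature vector. The sketch below uses a plain one-level Haar averaging/differencing step on a grayscale block (numpy assumed); it is our illustration of the feature extraction, not the authors' code.

    import numpy as np

    def haar_subbands(block):
        # One-level 2-D Haar step: a 16x16 block -> four 8x8 subbands.
        a = (block[0::2, :] + block[1::2, :]) / 2.0   # row averages
        d = (block[0::2, :] - block[1::2, :]) / 2.0   # row differences
        LL = (a[:, 0::2] + a[:, 1::2]) / 2.0
        LH = (a[:, 0::2] - a[:, 1::2]) / 2.0
        HL = (d[:, 0::2] + d[:, 1::2]) / 2.0
        HH = (d[:, 0::2] - d[:, 1::2]) / 2.0
        return LL, LH, HL, HH

    def block_features(block):
        feats = []
        for sub in haar_subbands(block):
            n2 = sub.size                     # n * n, with n = 8 here
            mv = sub.sum() / n2               # Eq. (16)
            sd = np.sqrt(((sub - mv) ** 2).sum() / n2)   # Eq. (17)
            feats.extend((mv, sd))
        return np.array(feats)                # eight values per block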

B. Performance of the CSVBNT
Before the performance of the CSVBNT is tested, the sensitivity of the two parameters Pc and Pm in the GA is described as follows. If Pc is set to 50%, the GA requires twice the number of generations to reach a similar or poorer solution on the four data sets, compared with Pc set to 80% or 90%. This is because too small a Pc harms the efficiency of the GA. Also, when the probability Pm is set to less than 10%, the GA obtains similar results under the same number of generations. However, if Pm is set to 25%, the GA usually does not converge: the strings are changed too much, so the GA cannot converge to a solution. Therefore, the parameters Pc = 80% and Pm = 5% are applied to the GA in the experiments. In Table 2, different storages R are used to design the CSVBNT on the four data sets. In this study, the storage R is defined as the number of internal nodes in the CSVBNT. Notably, because each internal node contains an SLNN, the storage R also denotes the total number of SLNNs designed in the CSVBNT. In Table 2, Num_O denotes the average number of child nodes of an internal node (i.e. the average number of output nodes in the output layer of an SLNN), DEP denotes the average depth of the CSVBNT, and CE denotes the average classification error rate on the training data set, defined as in Eq. (8). When the value of R is set to infinity, the storage space is unlimited, and the CSVBNT is grown until the classification error rate is less than a small threshold (1%), or until the training error rate exhibits no obvious decrease. In the experiments, the proposed CSVBNT is compared with four NTs: the GNT [23], AHNT [18], NNTree [20] and BNT [24]. To compare the proposed CSVBNT and the other NTs fairly, the CSVBNT and the other NTs have the same storage space (i.e., the same number of internal nodes) in the experiments. In Table 3, TIME denotes the average computing time for a testing sample, and CE denotes the average classification error rate on the testing data set. In Table 3, we observe that both the AHNT and the NNTree have a lower classification error rate than the proposed CSVBNT when the storage space is the same. The reason is that the MLP can divide the input space with arbitrary hypersurfaces in the AHNT and NNTree. That is, if the distribution of samples contained in a node is

complex (a non-linear distribution), the node is preferably designed as an MLP, and the classification error rate of both the AHNT and the NNTree can then be decreased. However, we also observe in Table 3 that both the AHNT and the NNTree, which use MLPs, take more computing time than the CSVBNT, which uses SLNNs, when they have the same storage. The reason is that the SLNN has lower computing complexity than the MLP; the computing complexity of an MLP is usually more than twice that of the SLNN. Figure 9 plots the experimental results reported in Table 3, and Figure 10 shows the classification results obtained by the CSVBNT on the chessboard data set. From Fig. 9, we observe that the CSVBNT has a lower classification error rate than the other NTs when they have the same computing time. Two reasons are offered as follows. (1) The proposed growing strategy designs the CSVBNT according to the classification error rate and computing complexity of the CSVBNT, while the other NTs, including the GNT, AHNT, NNTree and BNT, consider only the reduction of the classification error rate. (2) The GA is capable of searching for the proper number of output nodes in the SLNN according to the classification error rate and computing complexity of the CSVBNT. Figures 1 and 2 have shown that the characteristics of the CSVBNT differ from those of the other NTs.

Table 2. The efficiency of the CSVBNT on the training data sets. (The table reports Num_O, DEP and CE (%) for several storages R on the speech, traffic sign images, natural images and chessboard data sets; the numerical entries are not recoverable from this copy.)

Table 3. The performance of the CSVBNT and the other NTs on the testing data sets. (The table reports CE (%) and TIME (sec) for the CSVBNT, GNT [23], BNT [24], NNTree [20] and AHNT [18] on the speech, traffic sign images, natural images and chessboard data sets; the numerical entries are not recoverable from this copy.)

(a) Speech data set. (b) Traffic sign images data set. (c) Natural images data set. (d) Chessboard data set.
Figure 9. The performance of the CSVBNT and the other NTs.

(a) Chessboard data set. (b) The result obtained by the CSVBNT.
Figure 10. The classification result of the CSVBNT on the chessboard data set.

VI. CONCLUSIONS
This study proposes the CSVBNT based on the GA. The CSVBNT tends to be optimal because its design takes into account how to reduce both the classification error rate and the computing complexity. The CSVBNT is also a variable-branch neural tree, because the number of output nodes in the output layer of each SLNN node is automatically determined by the GA according to both the classification error rate and the computing complexity of the CSVBNT. Furthermore, this study proposes Design_CSVBNT(recursive), which operates similarly to the divide-and-conquer strategy for efficient design. Design_CSVBNT(recursive) is able to determine which node has the highest priority to be selected to split in the CSVBNT under the storage constraint. The experimental results in this study demonstrate that the CSVBNT has a lower classification error rate than existing NTs when they have the same computing time.

Disclosure of Potential Conflicts of Interest
Shiueng-Bien Yang is the corresponding author of this manuscript, titled "Constrained-Storage Variable-Branch Neural Tree for Classification". He declares that there is no conflict of interest in this manuscript.

REFERENCES
[1] Gelfand, S. B., Ravishankar, C. S., and Delp, E. J. (1991). An iterative growing and pruning algorithm for classification tree design. IEEE Trans. Pattern Analysis and Machine Intelligence, 13(2).
[2] Yildiz, O. T. and Alpaydin, E. (2001). Omnivariate decision trees. IEEE Trans. Neural Networks, 12(6).
[3] Zhao, H. and Ram, S. (2004). Constrained cascade generalization of decision trees. IEEE Trans. Knowledge and Data Engineering, 16(6).
[4] Gonzalo, M. M. and Alberto, S. (2004). Using all data to generate decision tree ensembles. IEEE Trans. Systems, Man, and Cybernetics, Part C: Applications and Reviews, 34(4).
[5] Witold, P. and Zenon, A. S. (2005). C-fuzzy decision trees. IEEE Trans. Systems, Man, and Cybernetics, Part C: Applications and Reviews, 35(4).
[6] Wang, X., Chen, B., Qian, G., and Ye, F. (2000). On the optimization of fuzzy decision trees. Fuzzy Sets and Systems, 112(3), 117-125.
[7] Deffuant, G. (1990). Neural units recruitment algorithm for generation of decision trees. Proceedings of the International Joint Conference on Neural Networks.
[8] Lippmann, R. (1987). An introduction to computing with neural nets. IEEE Acoustics, Speech, and Signal Processing Magazine, 4(2), 4-22.
[9] Sankar, A. and Mammone, R. (1992). Neural tree networks. In Neural Networks: Theory and Applications. San Diego, CA, USA: Academic Press Professional.
[10] Sethi, I. K. and Yoo, J. (1997). Structure-driven induction of decision tree classifiers through neural learning. Pattern Recognition, 30(1).
[11] Sirat, J. and Nadal, J. (1990). Neural trees: a new tool for classification. Network, 1.
[12] Li, T., Tang, Y. Y., and Fang, F. Y. (1995). A structure-parameter-adaptive (SPA) neural tree for the recognition of large character set. Pattern Recognition, 28(3).
[13] Zhang, M. and Fulcher, J. (1996). Face recognition using artificial neural networks group-based adaptive tolerance (GAT) trees. IEEE Trans. Neural Networks, 7.
[14] Foresti, G. L. and Pieroni, G. G. (1998). Exploiting neural trees in range image understanding. Pattern Recognition Letters, 19(9).

[15] Song, H. H. and Lee, S. W. (1998). A self-organizing neural tree for large-set pattern classification. IEEE Trans. Neural Networks, 9.
[16] Foresti, G. L. (1999). Outdoor scene classification by a neural tree-based approach. Pattern Analysis and Applications, 2, 129-142.
[17] Guo, H. and Gelfand, S. B. (1992). Classification trees with neural network feature extraction. IEEE Trans. Neural Networks, 3.
[18] Foresti, G. L. (2004). An adaptive high-order neural tree for pattern recognition. IEEE Trans. Systems, Man, and Cybernetics, Part B: Cybernetics, 34.
[19] Giles, G. L. and Maxwell, T. (1987). Learning, invariance, and generalization in high-order neural networks. Applied Optics, 26.
[20] Maji, P. (2008). Efficient design of neural network tree using a single splitting criterion. Neurocomputing, 71.
[21] Utgoff, P. E. (1988). Perceptron trees: a case study in hybrid concept representation. Proc. Seventh National Conference on Artificial Intelligence.
[22] Sirat, J. A. and Nadal, J. P. (1990). Neural trees: a new tool for classification. Network, 1.
[23] Foresti, G. L. and Micheloni, C. (2002). Generalized neural trees for pattern classification. IEEE Trans. Neural Networks, 13.
[24] Micheloni, C., Rani, A., Kumar, S., and Foresti, G. L. (2012). A balanced neural tree for pattern classification. Neural Networks, 27.
[25] Goldberg, D. (1989). Genetic Algorithms in Search, Optimization, and Machine Learning. Reading, MA: Addison-Wesley.
[26] Koza, J. (1992). Genetic Programming. Cambridge, MA: MIT Press.
[27] Grossberg, S. (Ed.) (1988). Neural Networks and Natural Intelligence. Cambridge, MA: MIT Press.
[28] Rumelhart, D. and McClelland, J. (Eds.) (1986). Parallel Distributed Processing: Explorations in the Microstructure of Cognition. Cambridge, MA: MIT Press.
[29] Zurada, J. M. (1992). Introduction to Neural Systems. St. Paul, MN: West.

[30] Angeline, P. J., Saunders, G. M., and Pollack, J. B. (1994). An evolutionary algorithm that constructs recurrent neural networks. IEEE Trans. Neural Networks, 5(1), 54-65.
[31] Mahmoudabadi, H., Izadi, M., and Menhaj, M. B. (2009). A hybrid method for grade estimation using genetic algorithm and neural networks. Computational Geosciences, 13, 91-101.
[32] Samanta, B., Bandopadhyay, S., and Ganguli, R. (2004). Data segmentation and genetic algorithms for sparse data division in Nome placer gold grade estimation using neural network and geostatistics. Mining Exploration Geology, (1-4).
[33] Chatterjee, S., Bandopadhyay, S., and Machuca, D. (2010). Ore grade prediction using a genetic algorithm and clustering based ensemble neural network model. Mathematical Geosciences, 42(3).
[34] Tahmasebi, P. and Hezarkhani, A. (2009). Application of optimized neural network by genetic algorithm. IAMG09, Stanford University, California.
[35] Stallkamp, J., Schlipsing, M., Salmen, J., and Igel, C. (2011). The German Traffic Sign Recognition Benchmark: a multi-class classification competition. In International Joint Conference on Neural Networks.
[36] Krizhevsky, A. (2009). Learning multiple layers of features from tiny images. Master's thesis, Computer Science Department, University of Toronto.
[37] Gonzalez, R. C. and Woods, R. E. Digital Image Processing. Addison Wesley, Boston, MA.


More information

Single-loop System Reliability-Based Design & Topology Optimization (SRBDO/SRBTO): A Matrix-based System Reliability (MSR) Method

Single-loop System Reliability-Based Design & Topology Optimization (SRBDO/SRBTO): A Matrix-based System Reliability (MSR) Method 10 h US Naonal Congress on Compuaonal Mechancs Columbus, Oho 16-19, 2009 Sngle-loop Sysem Relably-Based Desgn & Topology Opmzaon (SRBDO/SRBTO): A Marx-based Sysem Relably (MSR) Mehod Tam Nguyen, Junho

More information

In the complete model, these slopes are ANALYSIS OF VARIANCE FOR THE COMPLETE TWO-WAY MODEL. (! i+1 -! i ) + [(!") i+1,q - [(!

In the complete model, these slopes are ANALYSIS OF VARIANCE FOR THE COMPLETE TWO-WAY MODEL. (! i+1 -! i ) + [(!) i+1,q - [(! ANALYSIS OF VARIANCE FOR THE COMPLETE TWO-WAY MODEL The frs hng o es n wo-way ANOVA: Is here neracon? "No neracon" means: The man effecs model would f. Ths n urn means: In he neracon plo (wh A on he horzonal

More information

. The geometric multiplicity is dim[ker( λi. number of linearly independent eigenvectors associated with this eigenvalue.

. The geometric multiplicity is dim[ker( λi. number of linearly independent eigenvectors associated with this eigenvalue. Lnear Algebra Lecure # Noes We connue wh he dscusson of egenvalues, egenvecors, and dagonalzably of marces We wan o know, n parcular wha condons wll assure ha a marx can be dagonalzed and wha he obsrucons

More information

Ordinary Differential Equations in Neuroscience with Matlab examples. Aim 1- Gain understanding of how to set up and solve ODE s

Ordinary Differential Equations in Neuroscience with Matlab examples. Aim 1- Gain understanding of how to set up and solve ODE s Ordnary Dfferenal Equaons n Neuroscence wh Malab eamples. Am - Gan undersandng of how o se up and solve ODE s Am Undersand how o se up an solve a smple eample of he Hebb rule n D Our goal a end of class

More information

WiH Wei He

WiH Wei He Sysem Idenfcaon of onlnear Sae-Space Space Baery odels WH We He wehe@calce.umd.edu Advsor: Dr. Chaochao Chen Deparmen of echancal Engneerng Unversy of aryland, College Par 1 Unversy of aryland Bacground

More information

Neural Networks-Based Time Series Prediction Using Long and Short Term Dependence in the Learning Process

Neural Networks-Based Time Series Prediction Using Long and Short Term Dependence in the Learning Process Neural Neworks-Based Tme Seres Predcon Usng Long and Shor Term Dependence n he Learnng Process J. Puchea, D. Paño and B. Kuchen, Absrac In hs work a feedforward neural neworksbased nonlnear auoregresson

More information

Machine Learning Linear Regression

Machine Learning Linear Regression Machne Learnng Lnear Regresson Lesson 3 Lnear Regresson Bascs of Regresson Leas Squares esmaon Polynomal Regresson Bass funcons Regresson model Regularzed Regresson Sascal Regresson Mamum Lkelhood (ML)

More information

CS 536: Machine Learning. Nonparametric Density Estimation Unsupervised Learning - Clustering

CS 536: Machine Learning. Nonparametric Density Estimation Unsupervised Learning - Clustering CS 536: Machne Learnng Nonparamerc Densy Esmaon Unsupervsed Learnng - Cluserng Fall 2005 Ahmed Elgammal Dep of Compuer Scence Rugers Unversy CS 536 Densy Esmaon - Cluserng - 1 Oulnes Densy esmaon Nonparamerc

More information

Boosted LMS-based Piecewise Linear Adaptive Filters

Boosted LMS-based Piecewise Linear Adaptive Filters 016 4h European Sgnal Processng Conference EUSIPCO) Boosed LMS-based Pecewse Lnear Adapve Flers Darush Kar and Iman Marvan Deparmen of Elecrcal and Elecroncs Engneerng Blken Unversy, Ankara, Turkey {kar,

More information

Efficient Asynchronous Channel Hopping Design for Cognitive Radio Networks

Efficient Asynchronous Channel Hopping Design for Cognitive Radio Networks Effcen Asynchronous Channel Hoppng Desgn for Cognve Rado Neworks Chh-Mn Chao, Chen-Yu Hsu, and Yun-ng Lng Absrac In a cognve rado nework (CRN), a necessary condon for nodes o communcae wh each oher s ha

More information

Online Supplement for Dynamic Multi-Technology. Production-Inventory Problem with Emissions Trading

Online Supplement for Dynamic Multi-Technology. Production-Inventory Problem with Emissions Trading Onlne Supplemen for Dynamc Mul-Technology Producon-Invenory Problem wh Emssons Tradng by We Zhang Zhongsheng Hua Yu Xa and Baofeng Huo Proof of Lemma For any ( qr ) Θ s easy o verfy ha he lnear programmng

More information

Detection of Waving Hands from Images Using Time Series of Intensity Values

Detection of Waving Hands from Images Using Time Series of Intensity Values Deecon of Wavng Hands from Images Usng Tme eres of Inensy Values Koa IRIE, Kazunor UMEDA Chuo Unversy, Tokyo, Japan re@sensor.mech.chuo-u.ac.jp, umeda@mech.chuo-u.ac.jp Absrac Ths paper proposes a mehod

More information

( ) [ ] MAP Decision Rule

( ) [ ] MAP Decision Rule Announcemens Bayes Decson Theory wh Normal Dsrbuons HW0 due oday HW o be assgned soon Proec descrpon posed Bomercs CSE 90 Lecure 4 CSE90, Sprng 04 CSE90, Sprng 04 Key Probables 4 ω class label X feaure

More information

. The geometric multiplicity is dim[ker( λi. A )], i.e. the number of linearly independent eigenvectors associated with this eigenvalue.

. The geometric multiplicity is dim[ker( λi. A )], i.e. the number of linearly independent eigenvectors associated with this eigenvalue. Mah E-b Lecure #0 Noes We connue wh he dscusson of egenvalues, egenvecors, and dagonalzably of marces We wan o know, n parcular wha condons wll assure ha a marx can be dagonalzed and wha he obsrucons are

More information

FTCS Solution to the Heat Equation

FTCS Solution to the Heat Equation FTCS Soluon o he Hea Equaon ME 448/548 Noes Gerald Reckenwald Porland Sae Unversy Deparmen of Mechancal Engneerng gerry@pdxedu ME 448/548: FTCS Soluon o he Hea Equaon Overvew Use he forward fne d erence

More information

THE PREDICTION OF COMPETITIVE ENVIRONMENT IN BUSINESS

THE PREDICTION OF COMPETITIVE ENVIRONMENT IN BUSINESS THE PREICTION OF COMPETITIVE ENVIRONMENT IN BUSINESS INTROUCTION The wo dmensonal paral dfferenal equaons of second order can be used for he smulaon of compeve envronmen n busness The arcle presens he

More information

Anisotropic Behaviors and Its Application on Sheet Metal Stamping Processes

Anisotropic Behaviors and Its Application on Sheet Metal Stamping Processes Ansoropc Behavors and Is Applcaon on Shee Meal Sampng Processes Welong Hu ETA-Engneerng Technology Assocaes, Inc. 33 E. Maple oad, Sue 00 Troy, MI 48083 USA 48-79-300 whu@ea.com Jeanne He ETA-Engneerng

More information

Approximate Analytic Solution of (2+1) - Dimensional Zakharov-Kuznetsov(Zk) Equations Using Homotopy

Approximate Analytic Solution of (2+1) - Dimensional Zakharov-Kuznetsov(Zk) Equations Using Homotopy Arcle Inernaonal Journal of Modern Mahemacal Scences, 4, (): - Inernaonal Journal of Modern Mahemacal Scences Journal homepage: www.modernscenfcpress.com/journals/jmms.aspx ISSN: 66-86X Florda, USA Approxmae

More information

[ ] 2. [ ]3 + (Δx i + Δx i 1 ) / 2. Δx i-1 Δx i Δx i+1. TPG4160 Reservoir Simulation 2018 Lecture note 3. page 1 of 5

[ ] 2. [ ]3 + (Δx i + Δx i 1 ) / 2. Δx i-1 Δx i Δx i+1. TPG4160 Reservoir Simulation 2018 Lecture note 3. page 1 of 5 TPG460 Reservor Smulaon 08 page of 5 DISCRETIZATIO OF THE FOW EQUATIOS As we already have seen, fne dfference appromaons of he paral dervaves appearng n he flow equaons may be obaned from Taylor seres

More information

Anomaly Detection. Lecture Notes for Chapter 9. Introduction to Data Mining, 2 nd Edition by Tan, Steinbach, Karpatne, Kumar

Anomaly Detection. Lecture Notes for Chapter 9. Introduction to Data Mining, 2 nd Edition by Tan, Steinbach, Karpatne, Kumar Anomaly eecon Lecure Noes for Chaper 9 Inroducon o aa Mnng, 2 nd Edon by Tan, Senbach, Karpane, Kumar 2/14/18 Inroducon o aa Mnng, 2nd Edon 1 Anomaly/Ouler eecon Wha are anomales/oulers? The se of daa

More information

HEAT CONDUCTION PROBLEM IN A TWO-LAYERED HOLLOW CYLINDER BY USING THE GREEN S FUNCTION METHOD

HEAT CONDUCTION PROBLEM IN A TWO-LAYERED HOLLOW CYLINDER BY USING THE GREEN S FUNCTION METHOD Journal of Appled Mahemacs and Compuaonal Mechancs 3, (), 45-5 HEAT CONDUCTION PROBLEM IN A TWO-LAYERED HOLLOW CYLINDER BY USING THE GREEN S FUNCTION METHOD Sansław Kukla, Urszula Sedlecka Insue of Mahemacs,

More information

Theoretical Analysis of Biogeography Based Optimization Aijun ZHU1,2,3 a, Cong HU1,3, Chuanpei XU1,3, Zhi Li1,3

Theoretical Analysis of Biogeography Based Optimization Aijun ZHU1,2,3 a, Cong HU1,3, Chuanpei XU1,3, Zhi Li1,3 6h Inernaonal Conference on Machnery, Maerals, Envronmen, Boechnology and Compuer (MMEBC 6) Theorecal Analyss of Bogeography Based Opmzaon Aun ZU,,3 a, Cong U,3, Chuanpe XU,3, Zh L,3 School of Elecronc

More information

Relative controllability of nonlinear systems with delays in control

Relative controllability of nonlinear systems with delays in control Relave conrollably o nonlnear sysems wh delays n conrol Jerzy Klamka Insue o Conrol Engneerng, Slesan Techncal Unversy, 44- Glwce, Poland. phone/ax : 48 32 37227, {jklamka}@a.polsl.glwce.pl Keywor: Conrollably.

More information

Math 128b Project. Jude Yuen

Math 128b Project. Jude Yuen Mah 8b Proec Jude Yuen . Inroducon Le { Z } be a sequence of observed ndependen vecor varables. If he elemens of Z have a on normal dsrbuon hen { Z } has a mean vecor Z and a varancecovarance marx z. Geomercally

More information

J i-1 i. J i i+1. Numerical integration of the diffusion equation (I) Finite difference method. Spatial Discretization. Internal nodes.

J i-1 i. J i i+1. Numerical integration of the diffusion equation (I) Finite difference method. Spatial Discretization. Internal nodes. umercal negraon of he dffuson equaon (I) Fne dfference mehod. Spaal screaon. Inernal nodes. R L V For hermal conducon le s dscree he spaal doman no small fne spans, =,,: Balance of parcles for an nernal

More information

Department of Economics University of Toronto

Department of Economics University of Toronto Deparmen of Economcs Unversy of Torono ECO408F M.A. Economercs Lecure Noes on Heeroskedascy Heeroskedascy o Ths lecure nvolves lookng a modfcaons we need o make o deal wh he regresson model when some of

More information

EEL 6266 Power System Operation and Control. Chapter 5 Unit Commitment

EEL 6266 Power System Operation and Control. Chapter 5 Unit Commitment EEL 6266 Power Sysem Operaon and Conrol Chaper 5 Un Commmen Dynamc programmng chef advanage over enumeraon schemes s he reducon n he dmensonaly of he problem n a src prory order scheme, here are only N

More information

Advanced time-series analysis (University of Lund, Economic History Department)

Advanced time-series analysis (University of Lund, Economic History Department) Advanced me-seres analss (Unvers of Lund, Economc Hsor Dearmen) 3 Jan-3 Februar and 6-3 March Lecure 4 Economerc echnues for saonar seres : Unvarae sochasc models wh Box- Jenns mehodolog, smle forecasng

More information

M. Y. Adamu Mathematical Sciences Programme, AbubakarTafawaBalewa University, Bauchi, Nigeria

M. Y. Adamu Mathematical Sciences Programme, AbubakarTafawaBalewa University, Bauchi, Nigeria IOSR Journal of Mahemacs (IOSR-JM e-issn: 78-578, p-issn: 9-765X. Volume 0, Issue 4 Ver. IV (Jul-Aug. 04, PP 40-44 Mulple SolonSoluons for a (+-dmensonalhroa-sasuma shallow waer wave equaon UsngPanlevé-Bӓclund

More information

Performance Analysis for a Network having Standby Redundant Unit with Waiting in Repair

Performance Analysis for a Network having Standby Redundant Unit with Waiting in Repair TECHNI Inernaonal Journal of Compung Scence Communcaon Technologes VOL.5 NO. July 22 (ISSN 974-3375 erformance nalyss for a Nework havng Sby edundan Un wh ang n epar Jendra Sngh 2 abns orwal 2 Deparmen

More information

Li An-Ping. Beijing , P.R.China

Li An-Ping. Beijing , P.R.China A New Type of Cpher: DICING_csb L An-Png Bejng 100085, P.R.Chna apl0001@sna.com Absrac: In hs paper, we wll propose a new ype of cpher named DICING_csb, whch s derved from our prevous sream cpher DICING.

More information

Appendix to Online Clustering with Experts

Appendix to Online Clustering with Experts A Appendx o Onlne Cluserng wh Expers Furher dscusson of expermens. Here we furher dscuss expermenal resuls repored n he paper. Ineresngly, we observe ha OCE (and n parcular Learn- ) racks he bes exper

More information

Graduate Macroeconomics 2 Problem set 5. - Solutions

Graduate Macroeconomics 2 Problem set 5. - Solutions Graduae Macroeconomcs 2 Problem se. - Soluons Queson 1 To answer hs queson we need he frms frs order condons and he equaon ha deermnes he number of frms n equlbrum. The frms frs order condons are: F K

More information

FI 3103 Quantum Physics

FI 3103 Quantum Physics /9/4 FI 33 Quanum Physcs Aleander A. Iskandar Physcs of Magnesm and Phooncs Research Grou Insu Teknolog Bandung Basc Conces n Quanum Physcs Probably and Eecaon Value Hesenberg Uncerany Prncle Wave Funcon

More information

F-Tests and Analysis of Variance (ANOVA) in the Simple Linear Regression Model. 1. Introduction

F-Tests and Analysis of Variance (ANOVA) in the Simple Linear Regression Model. 1. Introduction ECOOMICS 35* -- OTE 9 ECO 35* -- OTE 9 F-Tess and Analyss of Varance (AOVA n he Smple Lnear Regresson Model Inroducon The smple lnear regresson model s gven by he followng populaon regresson equaon, or

More information

Using Fuzzy Pattern Recognition to Detect Unknown Malicious Executables Code

Using Fuzzy Pattern Recognition to Detect Unknown Malicious Executables Code Usng Fuzzy Paern Recognon o Deec Unknown Malcous Execuables Code Boyun Zhang,, Janpng Yn, and Jngbo Hao School of Compuer Scence, Naonal Unversy of Defense Technology, Changsha 40073, Chna hnxzby@yahoo.com.cn

More information

12d Model. Civil and Surveying Software. Drainage Analysis Module Detention/Retention Basins. Owen Thornton BE (Mech), 12d Model Programmer

12d Model. Civil and Surveying Software. Drainage Analysis Module Detention/Retention Basins. Owen Thornton BE (Mech), 12d Model Programmer d Model Cvl and Surveyng Soware Dranage Analyss Module Deenon/Reenon Basns Owen Thornon BE (Mech), d Model Programmer owen.hornon@d.com 4 January 007 Revsed: 04 Aprl 007 9 February 008 (8Cp) Ths documen

More information

Lecture 18: The Laplace Transform (See Sections and 14.7 in Boas)

Lecture 18: The Laplace Transform (See Sections and 14.7 in Boas) Lecure 8: The Lalace Transform (See Secons 88- and 47 n Boas) Recall ha our bg-cure goal s he analyss of he dfferenal equaon, ax bx cx F, where we emloy varous exansons for he drvng funcon F deendng on

More information

Video-Based Face Recognition Using Adaptive Hidden Markov Models

Video-Based Face Recognition Using Adaptive Hidden Markov Models Vdeo-Based Face Recognon Usng Adapve Hdden Markov Models Xaomng Lu and suhan Chen Elecrcal and Compuer Engneerng, Carnege Mellon Unversy, Psburgh, PA, 523, U.S.A. xaomng@andrew.cmu.edu suhan@cmu.edu Absrac

More information

Discrete Markov Process. Introduction. Example: Balls and Urns. Stochastic Automaton. INTRODUCTION TO Machine Learning 3rd Edition

Discrete Markov Process. Introduction. Example: Balls and Urns. Stochastic Automaton. INTRODUCTION TO Machine Learning 3rd Edition EHEM ALPAYDI he MI Press, 04 Lecure Sldes for IRODUCIO O Machne Learnng 3rd Edon alpaydn@boun.edu.r hp://www.cmpe.boun.edu.r/~ehem/ml3e Sldes from exboo resource page. Slghly eded and wh addonal examples

More information

Let s treat the problem of the response of a system to an applied external force. Again,

Let s treat the problem of the response of a system to an applied external force. Again, Page 33 QUANTUM LNEAR RESPONSE FUNCTON Le s rea he problem of he response of a sysem o an appled exernal force. Agan, H() H f () A H + V () Exernal agen acng on nernal varable Hamlonan for equlbrum sysem

More information

Hidden Markov Models Following a lecture by Andrew W. Moore Carnegie Mellon University

Hidden Markov Models Following a lecture by Andrew W. Moore Carnegie Mellon University Hdden Markov Models Followng a lecure by Andrew W. Moore Carnege Mellon Unversy www.cs.cmu.edu/~awm/uorals A Markov Sysem Has N saes, called s, s 2.. s N s 2 There are dscree meseps, 0,, s s 3 N 3 0 Hdden

More information

Including the ordinary differential of distance with time as velocity makes a system of ordinary differential equations.

Including the ordinary differential of distance with time as velocity makes a system of ordinary differential equations. Soluons o Ordnary Derenal Equaons An ordnary derenal equaon has only one ndependen varable. A sysem o ordnary derenal equaons consss o several derenal equaons each wh he same ndependen varable. An eample

More information

THEORETICAL AUTOCORRELATIONS. ) if often denoted by γ. Note that

THEORETICAL AUTOCORRELATIONS. ) if often denoted by γ. Note that THEORETICAL AUTOCORRELATIONS Cov( y, y ) E( y E( y))( y E( y)) ρ = = Var( y) E( y E( y)) =,, L ρ = and Cov( y, y ) s ofen denoed by whle Var( y ) f ofen denoed by γ. Noe ha γ = γ and ρ = ρ and because

More information

Attributed Graph Matching Based Engineering Drawings Retrieval

Attributed Graph Matching Based Engineering Drawings Retrieval Arbued Graph Machng Based Engneerng Drawngs Rereval Rue Lu, Takayuk Baba, and Dak Masumoo Fusu Research and Developmen Cener Co LTD, Beng, PRChna Informaon Technology Meda Labs Fusu Laboraores LTD, Kawasak,

More information

DEEP UNFOLDING FOR MULTICHANNEL SOURCE SEPARATION SUPPLEMENTARY MATERIAL

DEEP UNFOLDING FOR MULTICHANNEL SOURCE SEPARATION SUPPLEMENTARY MATERIAL DEEP UNFOLDING FOR MULTICHANNEL SOURCE SEPARATION SUPPLEMENTARY MATERIAL Sco Wsdom, John Hershey 2, Jonahan Le Roux 2, and Shnj Waanabe 2 Deparmen o Elecrcal Engneerng, Unversy o Washngon, Seale, WA, USA

More information

Particle Swarm Optimization Algorithm with Reverse-Learning and Local-Learning Behavior

Particle Swarm Optimization Algorithm with Reverse-Learning and Local-Learning Behavior 35 JOURNAL OF SOFTWARE, VOL. 9, NO. 2, FEBRUARY 214 Parcle Swarm Opmzaon Algorhm wh Reverse-Learnng and Local-Learnng Behavor Xuewen Xa Naonal Engneerng Research Cener for Saelle Posonng Sysem, Wuhan Unversy,

More information

Modeling and Solving of Multi-Product Inventory Lot-Sizing with Supplier Selection under Quantity Discounts

Modeling and Solving of Multi-Product Inventory Lot-Sizing with Supplier Selection under Quantity Discounts nernaonal ournal of Appled Engneerng Research SSN 0973-4562 Volume 13, Number 10 (2018) pp. 8708-8713 Modelng and Solvng of Mul-Produc nvenory Lo-Szng wh Suppler Selecon under Quany Dscouns Naapa anchanaruangrong

More information

ECE 366 Honors Section Fall 2009 Project Description

ECE 366 Honors Section Fall 2009 Project Description ECE 366 Honors Secon Fall 2009 Projec Descrpon Inroducon: Muscal genres are caegorcal labels creaed by humans o characerze dfferen ypes of musc. A muscal genre s characerzed by he common characerscs shared

More information

Machine Learning 2nd Edition

Machine Learning 2nd Edition INTRODUCTION TO Lecure Sldes for Machne Learnng nd Edon ETHEM ALPAYDIN, modfed by Leonardo Bobadlla and some pars from hp://www.cs.au.ac.l/~aparzn/machnelearnng/ The MIT Press, 00 alpaydn@boun.edu.r hp://www.cmpe.boun.edu.r/~ehem/mle

More information

Lecture 2 L n i e n a e r a M od o e d l e s

Lecture 2 L n i e n a e r a M od o e d l e s Lecure Lnear Models Las lecure You have learned abou ha s machne learnng Supervsed learnng Unsupervsed learnng Renforcemen learnng You have seen an eample learnng problem and he general process ha one

More information

Time-interval analysis of β decay. V. Horvat and J. C. Hardy

Time-interval analysis of β decay. V. Horvat and J. C. Hardy Tme-nerval analyss of β decay V. Horva and J. C. Hardy Work on he even analyss of β decay [1] connued and resuled n he developmen of a novel mehod of bea-decay me-nerval analyss ha produces hghly accurae

More information

e-journal Reliability: Theory& Applications No 2 (Vol.2) Vyacheslav Abramov

e-journal Reliability: Theory& Applications No 2 (Vol.2) Vyacheslav Abramov June 7 e-ournal Relably: Theory& Applcaons No (Vol. CONFIDENCE INTERVALS ASSOCIATED WITH PERFORMANCE ANALYSIS OF SYMMETRIC LARGE CLOSED CLIENT/SERVER COMPUTER NETWORKS Absrac Vyacheslav Abramov School

More information

January Examinations 2012

January Examinations 2012 Page of 5 EC79 January Examnaons No. of Pages: 5 No. of Quesons: 8 Subjec ECONOMICS (POSTGRADUATE) Tle of Paper EC79 QUANTITATIVE METHODS FOR BUSINESS AND FINANCE Tme Allowed Two Hours ( hours) Insrucons

More information

Algorithm Research on Moving Object Detection of Surveillance Video Sequence *

Algorithm Research on Moving Object Detection of Surveillance Video Sequence * Opcs and Phooncs Journal 03 3 308-3 do:0.436/opj.03.3b07 Publshed Onlne June 03 (hp://www.scrp.org/journal/opj) Algorhm Research on Movng Objec Deecon of Survellance Vdeo Sequence * Kuhe Yang Zhmng Ca

More information

RADIAL BASIS FUNCTION PROCESS NEURAL NETWORK TRAINING BASED ON GENERALIZED FRÉCHET DISTANCE AND GA-SA HYBRID STRATEGY

RADIAL BASIS FUNCTION PROCESS NEURAL NETWORK TRAINING BASED ON GENERALIZED FRÉCHET DISTANCE AND GA-SA HYBRID STRATEGY Compuer Scence & Engneerng: An Inernaonal Journal (CSEIJ), Vol. 3, No. 6, December 03 RADIAL BASIS FUNCTION PROCESS NEURAL NETWORK TRAINING BASED ON GENERALIZED FRÉCHET DISTANCE AND GA-SA HYBRID STRATEGY

More information

Fuzzy Set Theory in Modeling Uncertainty Data. via Interpolation Rational Bezier Surface Function

Fuzzy Set Theory in Modeling Uncertainty Data. via Interpolation Rational Bezier Surface Function Appled Mahemacal Scences, Vol. 7, 013, no. 45, 9 38 HIKARI Ld, www.m-hkar.com Fuzzy Se Theory n Modelng Uncerany Daa va Inerpolaon Raonal Bezer Surface Funcon Rozam Zakara Deparmen of Mahemacs, Faculy

More information