
Author's Accepted Manuscript

Discriminative structure selection method of Gaussian mixture models with its application to handwritten digit recognition

Xuefeng Chen, Xiabi Liu, Yunde Jia

PII: S0925-2312(10); DOI: doi:10.1016/j.neucom; Reference: NEUCOM; To appear in: Neurocomputing. Received date: January 2010; Revised date: 4 November 2010; Accepted date: 8 November 2010.

Cite this article as: Xuefeng Chen, Xiabi Liu and Yunde Jia, Discriminative structure selection method of Gaussian mixture models with its application to handwritten digit recognition, Neurocomputing, doi:10.1016/j.neucom.

This is a PDF file of an unedited manuscript that has been accepted for publication. As a service to our customers we are providing this early version of the manuscript. The manuscript will undergo copyediting, typesetting, and review of the resulting galley proof before it is published in its final citable form. Please note that during the production process errors may be discovered which could affect the content, and all legal disclaimers that apply to the journal pertain.

Discriminative Structure Selection Method of Gaussian Mixture Models with Its Application to Handwritten Digit Recognition

Xuefeng Chen, Xiabi Liu*, Yunde Jia

Beijing Laboratory of Intelligent Information Technology, School of Computer Science and Technology, Beijing Institute of Technology, Beijing 100081, China

Abstract

Model structure selection is currently an open problem in modeling data via Gaussian Mixture Models (GMMs). This paper proposes a discriminative method to select GMM structures for pattern classification. We introduce a GMM structure selection criterion based on a discriminative objective function called Soft target based Max-Min posterior Pseudo-probabilities (Soft-MMP). The structure and the parameters of the optimal GMM are estimated simultaneously by seeking the maximum value of Laplace's approximation of the integrated Soft-MMP function. The line search algorithm is employed to solve this optimization problem. We evaluate the proposed GMM structure selection method through experiments on handwritten digit recognition on the well-known CENPARMI and MNIST digit databases. Our method behaves better than the manual method and the generative counterparts, including the Bayesian Information Criterion (BIC), Minimum Description Length (MDL) and AutoClass. Furthermore, to our best knowledge, the digit classifier trained by using our method achieves the best error rate so far on the CENPARMI database and an error rate comparable to the currently best ones on the MNIST database.

Keywords: Gaussian Mixture Models (GMM); structure selection; parameter estimation; discriminative learning; Finite Mixture Models (FMM); Max-Min posterior Pseudo-probabilities (MMP).

* Corresponding author. E-mail addresses: crocodel@bit.edu.cn (X. Chen), luxiabi@bit.edu.cn (X. Liu), jiayunde@bit.edu.cn (Y. Jia).

1. Introduction

The Gaussian Mixture Model (GMM) is a widely used statistical tool in pattern classification. GMMs are flexible enough to approximate any given density with high accuracy [1]. In fitting GMMs to data, we need to select the number of GMM components and to estimate the parameters of the GMM with a certain number of components. The two tasks are usually known as structure selection and parameter estimation, respectively. Satisfactory results of GMM parameter estimation have been reported in many works, but how to select appropriate GMM structures is still a challenge.

Existing methods of GMM structure selection can be divided into five categories: cross-validation [2-3], stochastic methods [4], information theory approaches [5-7], infinite Gaussian mixture models [8-9], and Bayesian methods [10-13]. The main idea of the last category is to evaluate model structures using the integral over parameters. Various criteria under the Bayesian framework have been developed for GMM structure selection, including the Bayesian Information Criterion (BIC) [10], the Laplace criterion [11], the variational Bayesian criterion [12], the Laplace-Empirical criterion [13], etc. These works show that Bayesian methods are promising for solving the problem of model structure selection. However, most existing Bayesian methods are based on generative learning, usually on classical Maximum Likelihood Estimation (MLE), where only positive examples are involved in determining model structures. Therefore, the discriminative ability of the model is somewhat ignored.

In recent years, discriminative learning algorithms such as Minimum Classification Error (MCE) [14], Maximum Mutual Information (MMI) [15], Minimum Phone Error (MPE) [16] and Max-Min posterior Pseudo-Probabilities (MMP) [17] have demonstrated their advantages over generative learning counterparts for parameter estimation of GMMs. Compared to the advances in discriminative parameter estimation, discriminative structure selection has not received enough attention. Recently, Klautau et al. [18] presented an MMI based method to determine the GMM structure. Liu and Gales [19] introduced a discriminative method of GMM complexity control under the Bayesian model structure selection framework. In the method of Liu and Gales, a marginalized discriminative growth function of the MMI/MPE criterion was presented to select the GMM structure. They evaluated their method on large-vocabulary continuous-speech recognition.

In this paper, we propose a discriminative GMM structure selection method for pattern classification by embedding a discriminative learning criterion into the Bayesian model structure selection framework. The discriminative learning criterion used is SOFT target based Max-Min posterior Pseudo-probabilities (Soft-MMP) [20]. An integrated Soft-MMP function is introduced and approximated with Laplace's method, the value of which is used to evaluate the GMM structure. By employing the line search algorithm to find the maximum value of Laplace's approximation of the integrated Soft-MMP function, the structure and the parameters of the optimal GMM are determined simultaneously in a discriminative manner.

Our work is closely related to that of Liu and Gales [19], but the model evaluation criteria are different: we propose a Soft-MMP based criterion in this paper, while Liu and Gales designed an MMI or MPE based one. Furthermore, we employ the line search algorithm for model structure selection, while the merge-split strategy is used by Liu and Gales.

Our method was applied to handwritten digit recognition and evaluated on two well-known handwritten digit databases, CENPARMI [21] and MNIST [22]. We compare our method with the manual method and three main generative counterparts, including BIC [10], Minimum Description Length (MDL) [7] and AutoClass [23]. The comparison results show that the proposed method improves both the recognition rates and the generalization ability of GMM based handwritten digit classifiers. Compared with these GMM structure selection methods, our method brings (1) a 7.78% to 5.85% reduction in the error rate on the CENPARMI test set and a 5.87% to 33.75% reduction on the MNIST test set; (2) a 0.8% to 0.45% increase in the generalization ability, which is measured as the ratio of the recognition rate on the test set to that on the training set, for the CENPARMI database and a 0.06% to 0.7% increase for the MNIST database. Furthermore, to our best knowledge, our method brings the best error rate so far on the CENPARMI database and an error rate comparable to the currently best ones on the MNIST database.

In the work of Liu et al. [24-25], state-of-the-art techniques of handwritten digit recognition, including features and classifiers, are thoroughly investigated on both the CENPARMI and MNIST databases. They use 8-direction gradient features (abbreviated to e-grg here) and the classifier of either SVM with RBF kernel or the Discriminative Learning Quadratic Discriminant Function (DLQDF) to report an error rate of 0.95% on the test set of the CENPARMI database. Using the same e-grg features, by courtesy of Liu, we achieve the better error rate of 0.65% on the same test set. This result also outperforms the other up-to-date results reported on the CENPARMI database using various features and classifiers [24-29], including the previous best result of 0.85% from SVM with RBF kernel and the deslant chaincode feature [25]. For the MNIST database, our method achieves an error rate of 0.53% on the test set by using the e-grg feature. This result is comparable to the best error rate of 0.42% for the e-grg feature [24] and the overall best error rate of 0.39% [30] on the same database.

The rest of this paper is organized as follows. Section 2 describes the Bayesian model structure selection framework and the Soft-MMP discriminative learning criterion. Section 3 presents our discriminative method of GMM structure selection. Section 4 reports the experimental evaluation of our method for handwritten digit recognition. We discuss our conclusions and future work in Section 5.

2. Preliminaries

In this section we briefly introduce Bayesian structure selection and Soft-MMP. The reader is referred to Liu and Gales [19] for more details of Bayesian model structure selection and to Chen et al. [20] for more details of Soft-MMP.

2.1. Bayesian model structure selection

Let M and Θ_M be the number of components and the set of unknown parameters of a GMM, X = {x_n}, n = 1, ..., N, be a training data set of N examples, and p(Θ_M | M) be the parameter prior distribution. Then the integrated likelihood for the model M is

p(X | M) = ∫ p(X | Θ_M, M) p(Θ_M | M) dΘ_M.   (1)

In Bayesian model structure selection methods, the optimal model structure is determined by maximizing the integrated likelihood:

M* = arg max_M ∫ p(X | Θ_M, M) p(Θ_M | M) dΘ_M.   (2)

Following [19], p(Θ_M | M) is treated as uninformative. Therefore, the optimal model structure is computed by

M* = arg max_M ∫ p(X | Θ_M, M) dΘ_M.   (3)

The integrated likelihood in Eq. (3) is usually a high-dimensional and intractable integral, and various analytic and numerical approximations have been proposed. We use Laplace's approximation [10] in this paper, which can be expressed as

p(X | M) ≈ p(X | Θ̂_M, M) (2π)^(S/2) |−∇∇ log p(X | Θ̂_M, M)|^(−1/2),   (4)

where Θ̂_M is the MLE of the parameters, S is the number of parameters, ∇∇ denotes the matrix of second-order partial derivatives with respect to the parameters, and |·| denotes the determinant of a matrix.
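
To make the evidence computation of Eqs. (3)-(4) concrete, the following minimal sketch (an illustrative toy, not the paper's implementation) evaluates Laplace's approximation of log ∫ exp(g(Θ)) dΘ for a generic log-objective g, using a diagonal Hessian estimated by central finite differences at a given maximizer; the same kind of approximation is reused with the Soft-MMP objective in Section 3.

    # Minimal sketch: Laplace's approximation of log ∫ exp(g(theta)) d(theta)
    # with a diagonal, finite-difference Hessian at an assumed maximizer.
    import numpy as np

    def laplace_log_evidence(g, theta_hat, h=1e-4):
        """g: callable mapping a parameter vector to a scalar log-objective.
        theta_hat: assumed maximizer of g (e.g. found by gradient ascent)."""
        theta_hat = np.asarray(theta_hat, dtype=float)
        S = theta_hat.size
        g0 = g(theta_hat)
        diag_hess = np.empty(S)
        for s in range(S):
            e = np.zeros(S); e[s] = h
            diag_hess[s] = (g(theta_hat + e) - 2.0 * g0 + g(theta_hat - e)) / h**2
        # log ∫ exp(g) ≈ g(θ̂) + (S/2) log 2π − (1/2) Σ_s log(−g''_ss(θ̂))
        return g0 + 0.5 * S * np.log(2.0 * np.pi) - 0.5 * np.sum(np.log(-diag_hess))

    # Toy check: for g(θ) = −0.5·||θ||², the exact value is (S/2) log 2π.
    print(laplace_log_evidence(lambda t: -0.5 * np.dot(t, t), np.zeros(3)))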

2.2. Soft-MMP

The Soft-MMP is developed to estimate the parameters of the posterior pseudo-probability based classifier [17], a recently proposed variant of the Bayesian classifier. Let x be a feature vector, C_i be the i-th class, and p(x | C_i) be the class-conditional probability density function. Then the posterior pseudo-probability of x belonging to C_i is computed by

f(p(x | C_i)) = 1 − exp(−λ p(x | C_i)^τ),   (5)

where λ and τ are positive numbers. For any input pattern, we compute the corresponding posterior pseudo-probabilities of all the classes under consideration. The input pattern is then classified as the class C* with the maximum posterior pseudo-probability:

C* = arg max_{C_i} f(p(x | C_i)).   (6)

According to Eq. (5), the posterior pseudo-probability is in direct proportion to the class-conditional probability density, so the classification decision made by posterior pseudo-probabilities is consistent with that of the traditional Bayesian counterpart which assumes that the prior probabilities of all the classes are equal. However, posterior pseudo-probabilities take values in [0, 1]; by introducing them, discriminative learning approaches such as MMP [17] and Soft-MMP [20] can be developed for Bayesian classifiers. Furthermore, the posterior pseudo-probability is a natural similarity measure and is useful for (1) making rejection decisions, (2) combining classifiers, and (3) assessing the performance of a classifier in a much more accurate way than counting the number of patterns classified correctly [31].

The class-conditional probability density function in Eq. (5) should be provided for constructing a posterior pseudo-probability based classifier; it is assumed to be a GMM in this paper. Let M be the number of GMM components, and w_k, μ_k and Σ_k be the weight, the mean and the covariance matrix of the k-th Gaussian component, respectively, with ∑_{k=1}^{M} w_k = 1. Then we have

p(x | C_i) = ∑_{k=1}^{M} w_k N_k(x; μ_k, Σ_k),   (7)

where

N_k(x; μ_k, Σ_k) = (2π)^(−d/2) |Σ_k|^(−1/2) exp( −(1/2)(x − μ_k)^T Σ_k^(−1) (x − μ_k) ).   (8)

By substituting Eq. (7) into Eq. (5), we get a posterior pseudo-probability based classifier. The original Soft-MMP learning method is able to estimate the parameters of this classifier, but the GMM structure needs to be set manually.
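
The following minimal sketch (with made-up parameter values) evaluates the posterior pseudo-probability of Eq. (5) on top of the diagonal-covariance special case of the GMM density of Eqs. (7)-(8), which is the form actually used in Section 4; it is only meant to make the measure function concrete.

    # Minimal sketch: posterior pseudo-probability f(p(x|C_i)) of Eq. (5)
    # over a diagonal-covariance GMM density, Eqs. (7)-(8).
    import numpy as np

    def gmm_density(x, weights, means, variances):
        # weights: (M,); means, variances: (M, d); diagonal covariance matrices.
        d = means.shape[1]
        diff = x - means
        log_norm = -0.5 * (d * np.log(2 * np.pi) + np.sum(np.log(variances), axis=1))
        log_comp = log_norm - 0.5 * np.sum(diff**2 / variances, axis=1)
        return float(np.sum(weights * np.exp(log_comp)))

    def posterior_pseudo_probability(x, weights, means, variances, lam, tau):
        p = gmm_density(x, weights, means, variances)
        return 1.0 - np.exp(-lam * p**tau)

    # Example with arbitrary values (two components in 2-D):
    w = np.array([0.6, 0.4])
    mu = np.array([[0.0, 0.0], [2.0, 2.0]])
    var = np.array([[1.0, 1.0], [0.5, 0.5]])
    print(posterior_pseudo_probability(np.array([0.5, 0.2]), w, mu, var, lam=1.0, tau=0.5))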

Let Ĥ and H̄ be two adaptive soft targets which take values in [0, 1]; x̂ and x̄ be the feature vectors of an arbitrary positive and negative example of a class, respectively; and m and n be the numbers of positive and negative examples of the class in the training set, respectively. Then the total empirical loss of the posterior pseudo-probability based classifier is measured as

L(Θ) = (1/m) ∑_{i=1}^{m} l̂(x̂_i; Θ) + (1/n) ∑_{j=1}^{n} l̄(x̄_j; Θ),   (9)

where Θ is the parameter set and l̂(x̂; Θ) is the empirical loss of the classifier on a positive example:

l̂(x̂; Θ) = (Ĥ − f(p(x̂ | C_i))) / Ĥ,   if f(p(x̂ | C_i)) < Ĥ;
l̂(x̂; Θ) = 0,                          if f(p(x̂ | C_i)) ≥ Ĥ,   (10)

and l̄(x̄; Θ) is the empirical loss of the classifier on a negative example:

l̄(x̄; Θ) = (f(p(x̄ | C_i)) − H̄) / (1 − H̄),   if f(p(x̄ | C_i)) > H̄;
l̄(x̄; Θ) = 0,                                 if f(p(x̄ | C_i)) ≤ H̄.   (11)

The objective of Soft-MMP is to minimize the empirical loss and to maximize the difference between Ĥ and H̄, which can be formally described as

F(Θ) = −d(Θ) + ξ L(Θ).   (12)

In Eq. (12), d(Θ) = Ĥ − H̄ and ξ is a non-negative constant controlling the tradeoff between the empirical loss and the difference between the two soft targets. Consequently, the task of Soft-MMP learning is to find the optimal parameter set Θ* by minimizing F(Θ):

Θ* = arg min_Θ F(Θ).   (13)

In the next section, this minimization problem is transformed into a maximization problem for defining our model structure selection criterion.

3. The proposed method

In this section we present our discriminative method of GMM structure selection based on Laplace's approximation of the integrated Soft-MMP function. We first describe our model structure selection criterion and its Laplace approximation. We then give a line search algorithm for finding the optimal GMM structure and parameters.

3.1. Inverse Soft-MMP function

Our evaluation criterion of GMMs is defined by replacing the likelihood function in the Bayesian evidence integral (1) with the discriminative Soft-MMP function. However, the original Soft-MMP learning is a minimization problem, so the integrated Soft-MMP function cannot be approximated with Laplace's method. In order to remove this obstacle, the original Soft-MMP function (12) is rewritten as

Ψ(Θ) = (1.0 + Ĥ − H̄) − ξ [ (1/m) ∑_{i=1}^{m} l̂(x̂_i; Θ) + (1/n) ∑_{j=1}^{n} l̄(x̄_j; Θ) ].   (14)

The first term on the right side of Eq. (14) stands for the distance between the two soft targets, and the second term for the empirical loss; the two terms are balanced by the tradeoff parameter ξ. We can obtain the optimal parameter set Θ* by maximizing Ψ(Θ):

Θ* = arg max_Θ Ψ(Θ).   (15)
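
As a small illustration, the following sketch combines the loss forms of Eqs. (10)-(11) and the rewritten objective of Eq. (14) as written above; the pseudo-probabilities, soft targets and ξ below are arbitrary numbers, not values from the paper.

    # Minimal sketch: soft-target losses and the rewritten Soft-MMP objective Ψ.
    import numpy as np

    def positive_loss(f_pos, H_hat):
        return np.where(f_pos < H_hat, (H_hat - f_pos) / H_hat, 0.0)

    def negative_loss(f_neg, H_bar):
        return np.where(f_neg > H_bar, (f_neg - H_bar) / (1.0 - H_bar), 0.0)

    def soft_mmp_objective(f_pos, f_neg, H_hat, H_bar, xi):
        empirical_loss = positive_loss(f_pos, H_hat).mean() + negative_loss(f_neg, H_bar).mean()
        return (1.0 + H_hat - H_bar) - xi * empirical_loss

    # Example with arbitrary pseudo-probabilities and soft targets:
    f_pos = np.array([0.9, 0.7, 0.95])
    f_neg = np.array([0.1, 0.3, 0.05])
    print(soft_mmp_objective(f_pos, f_neg, H_hat=0.8, H_bar=0.2, xi=1.0))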

The objective expressed in Eq. (15) is the same as the original one in Eq. (13): both try to minimize the empirical loss and to maximize the difference between the two soft targets. However, the Soft-MMP learning based on Eq. (15) is a maximization problem, so it is feasible to compute Laplace's approximation of the corresponding integrated function.

3.2. Discriminative criterion for GMM structure selection

According to Eq. (15), we can determine the optimal parameter set Θ̂_M for a specific GMM structure M. The optimal GMM structure is then selected as

M* = arg max_M { Ψ(Θ̂_M) + (S_M / 2) log 2π − (1/2) log |−∇∇ Ψ(Θ̂_M)| },   (16)

where ∇∇Ψ(Θ̂_M) is the matrix of second-order partial derivatives of Ψ(Θ_M) with respect to each parameter in Θ_M, evaluated at Θ̂_M, and S_M is the number of parameters. This selection criterion of the GMM structure is obtained by substituting Ψ(Θ_M) for p(X | Θ_M, M) in Eq. (3) and then performing computational simplifications. The detailed derivation is given as follows.

Firstly, replacing p(X | Θ_M, M) in Eq. (3) with Ψ(Θ_M), we get

M* = arg max_M ∫ Ψ(Θ_M) dΘ_M.   (17)

Laplace's approximation of ∫ Ψ(Θ_M) dΘ_M is

∫ Ψ(Θ_M) dΘ_M ≈ Ψ(Θ̂_M) (2π)^(S_M/2) |−∇∇ log Ψ(Θ̂_M)|^(−1/2).   (18)

Since arithmetic overflow may occur when computing ∫ Ψ(Θ_M) dΘ_M, the logarithm of the integral is considered here. We further introduce Φ(Θ_M) = exp(Ψ(Θ_M)), which increases monotonically with Ψ(Θ_M). Laplace's approximation of log ∫ Φ(Θ_M) dΘ_M is

log ∫ Φ(Θ_M) dΘ_M ≈ log Φ(Θ̂_M) + (S_M / 2) log 2π − (1/2) log |−∇∇ log Φ(Θ̂_M)|   (19)
                  = Ψ(Θ̂_M) + (S_M / 2) log 2π − (1/2) log |−∇∇ Ψ(Θ̂_M)|.   (20)

As shown in Eqs. (19) and (20), we get the same structure selection result based on ∫ Ψ(Θ_M) dΘ_M or on log ∫ Φ(Θ_M) dΘ_M, but the computation is simplified by introducing Φ.

A complex model often contains many parameters, more than 1000 in our experiments, so the cost of calculating the Hessian matrix ∇∇Ψ(Θ_M) is very expensive. In this paper, the Hessian matrix is assumed to have a diagonal structure to make the problem tractable; the assumption works well in our experiments. Thus we have

log ∫ Φ(Θ_M) dΘ_M ≈ Ψ(Θ̂_M) + (S_M / 2) log 2π − (1/2) ∑_{s=1}^{S_M} log( −∂²Ψ(Θ̂_M)/∂θ_s² ),   (21)

where θ_s denotes the s-th parameter in Θ_M. Finally, we get our GMM structure selection criterion:

M* = arg max_M log ∫ Φ(Θ_M) dΘ_M
   = arg max_M { Ψ(Θ̂_M) + (S_M / 2) log 2π − (1/2) ∑_{s=1}^{S_M} log( −∂²Ψ(Θ̂_M)/∂θ_s² ) }.   (22)

3.3. Optimization algorithm

Structure selection

In most existing model structure selection methods, the exhaustive search strategy is used and the search interval is decided manually. Since Laplace's method approximates the integral of a function by computing the volume under a Gaussian, Laplace's approximation of the integrated Soft-MMP function, i.e. Eq. (22), is a unimodal function with respect to M. So we employ the line search algorithm to seek the maximum value of Eq. (22), which includes two stages.

In the first stage, an initial search interval for the number of GMM components is determined by the advance-retreat method [33]. The algorithm starts from a triggering number and searches for three numbers in a monotonic direction whose values of Eq. (22) show a low-high-low trend. If the algorithm fails to find numbers satisfying this condition, it retreats to the triggering number and performs a similar search in the opposite direction.
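
A minimal sketch of the advance-retreat bracketing described above, assuming only a generic unimodal score over the integer number of components; the fixed unit step and the bounds m_min and m_max are illustrative choices, not the paper's settings.

    # Minimal sketch: bracket a maximum of a unimodal integer score with a
    # low-high-low triple, searching forward from a trigger value and, on
    # failure, retreating and searching in the opposite direction.
    def advance_retreat(score, m0, step=1, m_min=1, m_max=64):
        """score: callable on an integer component count; m0: trigger number."""
        def probe(direction):
            a, b = m0, m0 + direction * step
            if not (m_min <= b <= m_max) or score(b) < score(a):
                return None
            c = b + direction * step
            while m_min <= c <= m_max:
                if score(c) < score(b):          # low-high-low trend found
                    return tuple(sorted((a, c)))
                a, b, c = b, c, c + direction * step
            return None
        return probe(+1) or probe(-1)            # retreat, then the other way

    # Toy check with a score peaked at 7 components:
    print(advance_retreat(lambda m: -(m - 7) ** 2, m0=3))   # -> (6, 8)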

In the second stage, the search interval is reduced continuously by the golden section method [33] until a maximum value of Eq. (22) is reached. The golden section method compares the function values at two tentative numbers inside the search interval and at the two end-numbers of the search interval. Suppose the four numbers are arranged in ascending order from left to right. If the maximum function value comes from one of the two left-hand points, the search interval is reduced by discarding the right end-number; otherwise, it is reduced by discarding the left end-number. This procedure is performed iteratively until no tentative points can be selected in the search interval. The number corresponding to the maximum value of Eq. (22) in the final search interval is output as the optimal result. It should be noted that discrete numbers are searched in the process above: the two tentative values calculated in each iteration of the golden section method are rounded down (the lower value) or up (the higher value). The details of the two stages of the line search algorithm are given in Table 2.

The computational complexities of the line search algorithm and the exhaustive search algorithm are analyzed and compared as follows. Suppose the length of the initial search interval is L. By using the golden section method, the search interval is reduced to 0.618^n L after n iterations. Since a discrete space is explored, the search must terminate once 0.618^n L ≤ 1. So the upper bound on the number of iterations for the golden section method is

n ≤ log L / log(1/0.618).   (23)

As for the exhaustive search algorithm, the computational complexity is usually measured by the mean acquisition time E_L, which is the expected number of iterations for finding a point within the search interval. Since n iterations are required by the exhaustive search strategy to find the n-th point, we have

E_L = (1/L) ∑_{n=1}^{L} n = (L + 1)/2.   (24)

Comparing Eq. (23) to Eq. (24), we can conclude that the efficiency of the golden section method is better than that of the exhaustive search strategy when the search interval is large enough. Fig. 1 shows that the advantage of the golden section method becomes more and more obvious as the search interval increases.
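
A minimal sketch of the golden section reduction over integer component numbers, using the tentative points p = a + 0.382(b − a) and q = a + 0.618(b − a) and simple rounding; the termination rule here (an interval of length two) is an illustrative simplification of the rule used in the full algorithm.

    # Minimal sketch: golden section search for the maximum of a unimodal
    # integer score, with the lower tentative point rounded down and the
    # higher one rounded up.
    import math

    def golden_section_int(score, a, b):
        while b - a > 2:
            p = int(math.floor(a + 0.382 * (b - a)))
            q = int(math.ceil(a + 0.618 * (b - a)))
            if score(p) >= score(q):
                b = q            # maximum lies to the left of q
            else:
                a = p            # maximum lies to the right of p
        return max(range(a, b + 1), key=score)

    print(golden_section_int(lambda m: -(m - 7) ** 2, 1, 30))   # -> 7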

Fig. 1. The relationship between the length of the search interval (abscissa) and the number of iterations of the golden section algorithm or the exhaustive search algorithm (ordinate).

Parameter estimation

For each GMM structure explored in the line search process, the optimal parameters of the corresponding GMM are estimated according to Eq. (15). We apply the gradient ascent method to solve this maximization problem. Let Θ_t and α_t be the parameter set and the step size in the t-th iteration, respectively, and ∂Ψ(Θ_t)/∂Θ be the vector of partial derivatives of Ψ(Θ) with respect to each parameter in Θ. Then we have

Θ_{t+1} = Θ_t + α_t ∂Ψ(Θ_t)/∂Θ.   (25)

Θ includes the classifier parameters and the two soft targets, i.e. Ĥ and H̄, so the two soft targets are adaptively adjusted according to Eq. (25) in each training iteration.

In order to reduce overfitting and to accelerate parameter estimation, we involve only the training examples easily confused with each other in the parameter estimation procedure. This is realized by temporarily removing training examples whose posterior pseudo-probabilities have distinctly exceeded the corresponding soft target. Let Ŝ_t and S̄_t be the sets of positive and negative examples of the class C_i which are involved in the t-th training iteration, respectively. Then the data removal scheme can be expressed as

Ŝ_t = { x̂ ∈ Ŝ_{t−1} | f(p(x̂ | C_i)) < Ĥ + ε_t },
S̄_t = { x̄ ∈ S̄_{t−1} | f(p(x̄ | C_i)) > H̄ − ε_t },   (26)

where ε_t is a threshold value for determining whether the posterior pseudo-probability has distinctly exceeded the soft target.
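
A generic sketch of the gradient ascent update of Eq. (25); the forward-difference gradient is used only to make the example self-contained and runnable on any scalar objective, whereas the training described here computes the partial derivatives of Ψ directly.

    # Minimal sketch: one gradient ascent step, Θ_{t+1} = Θ_t + α ∂Ψ/∂Θ.
    import numpy as np

    def gradient_ascent_step(psi, theta, alpha, h=1e-5):
        grad = np.empty_like(theta)
        base = psi(theta)
        for s in range(theta.size):
            e = np.zeros_like(theta); e[s] = h
            grad[s] = (psi(theta + e) - base) / h
        return theta + alpha * grad

    theta = np.array([3.0, -2.0])
    for _ in range(200):                                 # maximize -||theta||^2
        theta = gradient_ascent_step(lambda t: -np.dot(t, t), theta, alpha=0.1)
    print(np.round(theta, 3))                            # -> close to [0, 0]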

Let t_max be the maximum number of training iterations, and ε_max and ε_min be the maximum and the minimum value of ε_t, respectively. Then ε_t in the t-th training iteration is

ε_t = ε_max − (ε_max − ε_min) t / t_max.   (27)

The examples removed in the t-th training iteration are reinserted into the training set in the (t + R)-th training iteration, where R is the required span of training iterations in which the example is excluded from training. Let R_0 be the minimum span and ν be the number of times an example has been removed from the training set. Then R for this example is

R = ν R_0.   (28)
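
A minimal sketch of the data selection schedule, assuming the forms of Eqs. (26)-(27) given above; all numbers in the example call are arbitrary.

    # Minimal sketch: keep only examples still confusable with the current
    # soft targets, with a threshold eps that shrinks over the iterations.
    import numpy as np

    def epsilon(t, t_max, eps_max, eps_min):
        # Linear decay from eps_max to eps_min over the training iterations.
        return eps_max - (eps_max - eps_min) * t / t_max

    def select_examples(f_pos, f_neg, H_hat, H_bar, eps):
        """Boolean keep-masks: an example is dropped once its pseudo-probability
        distinctly exceeds (positives) or falls distinctly below (negatives)
        its soft target."""
        keep_pos = f_pos < H_hat + eps
        keep_neg = f_neg > H_bar - eps
        return keep_pos, keep_neg

    eps_t = epsilon(t=10, t_max=100, eps_max=0.2, eps_min=0.01)
    print(select_examples(np.array([0.99, 0.7]), np.array([0.01, 0.4]), 0.8, 0.2, eps_t))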

13 Table. Sof-P parameer esmaon algorhm. Inpu: ranng daa se nal parameers of poseror pseudo-probably measure funcon nal values of wo sof arges and he eraon number 0. Opmzaon: Repea Sep Compue he emprcal loss of he curren classfer on he ranng daa se. Sep Remove he examples from he ranng se accordng o Eq. (6). Sep 3 Compue he paral dervave of ( ) wh respec o each parameer usng remanng examples. Sep 4 Updae he unknown parameers usng Eq. (5). Sep 5 Updae usng Eq. (7). Sep 6 Renser he examples removed n he prevous eraons based on Eq. (8). Sep 7. Unl convergence or. max. Le be an nfnesmal hen he convergence condon s Oupu: he esmaed sof arges and parameers of poseror pseudo-probably measure funcon.

14 Table. The G srucure selecon algorhm. : rggerng number of G componens; : search sep sze; : Inpu: 0 acceleraon facor; : ermnal lengh of search nerval. odel Srucure Selecon: Sep Inalze he search nerval a b by he advance-rerea mehod Sep. Le he eraon number 0. Esmae parameers n he G by he Sof-P parameer esmaon algorhm shown n Table and compue log d usng Eq. (). Sep. Le. Compue ( ). If ( ) ( ) hen go o Sep.3 or else go o Sep.4. Sep.3 Le and go o Sep.. Sep.4 If 0 hen le and go o Sep.. Oherwse le mn b max a and go o Sep. Sep Reducng he search nerval by he golden secon mehod Sep. Le a a b b. Sep. Calculae wo numbers n he nerval a b: p a. 38b a q a 0. 68b a. Sep.3 Esmae parameers n he G wh p and 0 and q componens by he Sof-P parameer esmaon algorhm shown n Table respecvely. Compue ( p ) and ( q ). Sep.4 If p ) ( q ) go o Sep.5 or else go o Sep.6. ( Sep.5 If b p ermnae he search process and oupu he opmal G wh q componens. Oherwse le a p b b p q a b a q compue ) and go o Sep.7. ( q Sep.6 If q a ermnae he search process and oupu he opmal G wh p componens. Oherwse le a a b q q p a b a p compue ) hen go o Sep.7. Sep.7 Le and go o Sep.. ( p Oupu: he G wh opmal srucure and parameers 3

4. Experiments

We evaluate our method of GMM structure selection by applying it to handwritten digit recognition. The resulting digit classifier is tested on the well-known CENPARMI database [21] and MNIST database [22]. The CENPARMI database contains 4000 training examples and 2000 test examples, and the MNIST database contains 60000 training examples and 10000 test examples.

4.1. Digit Modeling and Learning

The 8-direction gradient features (e-grg) [24] are used to represent the digits in the experiments for both the CENPARMI and MNIST databases. The original 200-D e-grg is compressed to 100-D by the Principal Component Analysis (PCA) technique to improve the computation efficiency. The orthogonal GMM technique [34] is further used to reduce the correlation among the elements of the feature vectors. The feature vectors of each digit class are then modeled by a GMM with diagonal covariance matrices. As a result, the set of unknown parameters in the Soft-MMP learning of our digit classifier is

Θ_i = { λ_i, τ_i, w_ik, μ_ik, Σ_ik, Ĥ_i, H̄_i }, k = 1, ..., M_i.   (29)

Some parameters in Eq. (29) must satisfy certain constraints, which are transformed to the unconstrained domain for easier implementation. The constraints and transformations of the parameters are listed in Table 3. A tiny variance value in the covariance matrices of the GMM leads to computational instability of the class-conditional probability density function, so we impose a positive minimum limit on the variance values, denoted as σ_min in Table 3. Consequently, the transformed parameter set is

Θ̃_i = { λ̃_i, τ̃_i, w̃_ik, μ_ik, σ̃_ikj, ĥ_i, h̄_i }, k = 1, ..., M_i.   (30)

We use the discriminative model structure selection algorithm shown in Table 2 to determine the optimal M_i as well as the other parameters in Eq. (30), and then transform them back into the original ones.
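
A generic sketch of the PCA compression step mentioned above; the dimensions and the random data are stand-ins, and this is not the paper's preprocessing code.

    # Minimal sketch: compress feature vectors with PCA before GMM modeling.
    import numpy as np

    def pca_compress(X, out_dim):
        """X: (N, d) feature matrix; returns (N, out_dim) projected features."""
        mean = X.mean(axis=0)
        Xc = X - mean
        # Eigen-decomposition of the sample covariance matrix.
        cov = np.cov(Xc, rowvar=False)
        eigvals, eigvecs = np.linalg.eigh(cov)
        order = np.argsort(eigvals)[::-1][:out_dim]
        return Xc @ eigvecs[:, order]

    X = np.random.randn(500, 200)       # stand-in for 200-D gradient features
    print(pca_compress(X, 100).shape)   # -> (500, 100)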

Table 3. The constraints and transformations of parameters in the learning of digit classifiers.

Original parameters and constraints        Transformation of parameters
0 < Ĥ < 1; 0 < H̄ < 1                       Ĥ = 1 / (1 + e^(−ĥ)); H̄ = 1 / (1 + e^(−h̄))
λ > 0; τ > 0                                λ = exp(λ̃); τ = exp(τ̃)
σ_kj ≥ σ_min                                σ_kj = σ_min + exp(σ̃_kj)
w_k ≥ 0, ∑_k w_k = 1                        w_k = e^(w̃_k) / ∑_l e^(w̃_l)

4.2. Experimental results

The parameters in our algorithm were set by experiments and are listed in Table 4. These parameter values were used for both the CENPARMI and MNIST tests.

Table 4. Algorithm parameter setting: α, t_max, ξ, σ_min, ε_max, ε_min and R_0.

The role and the experimental setting method of each parameter in Table 4 are explained as follows.

(1) α and t_max are the step size and the maximum number of iterations for the gradient ascent optimization, respectively. They have an important influence on training efficiency and effectiveness. Although some heuristic methods for setting α and t_max have been presented in the literature [35], their choice is mainly data dependent at present.

(2) The parameter ξ controls the tradeoff between the two sub-objectives of the Soft-MMP learning criterion, i.e. the minimum empirical loss and the maximum difference between the two soft targets. So ξ should be adjusted to make the weighted values of the two parts of Eq. (14) close to each other.

(3) The parameter σ_min is used to prevent computational instability of the class-conditional probability density function. It must be positive and should be set as small as possible.

(4) The other algorithm parameters, including ε_max, ε_min and R_0, are used for data selection in the training process. The values of ε_max and ε_min should be large enough and small enough, respectively, so that the training set can be exploited sufficiently at the initial stages of training while more and more examples which have been learned well can be ignored at subsequent training stages. As for R_0, increasing its value leads to a more efficient algorithm but a greater risk of worsening the training results; we determined a suitable R_0 by experiments.
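
A small sketch of the parameter transformations of Table 3, mapping unconstrained training variables back to the constrained classifier parameters; all variable names and the call below are placeholders.

    # Minimal sketch: sigmoids for the soft targets, exponentials for the
    # positive scalars and variances (with the floor sigma_min), and a
    # softmax for the component weights.
    import numpy as np

    def to_constrained(h_hat, h_bar, lam_u, tau_u, var_u, w_u, sigma_min=1e-3):
        H_hat = 1.0 / (1.0 + np.exp(-h_hat))        # 0 < H_hat < 1
        H_bar = 1.0 / (1.0 + np.exp(-h_bar))        # 0 < H_bar < 1
        lam, tau = np.exp(lam_u), np.exp(tau_u)     # lambda, tau > 0
        variances = sigma_min + np.exp(var_u)       # variances bounded away from 0
        weights = np.exp(w_u) / np.exp(w_u).sum()   # non-negative, sum to one
        return H_hat, H_bar, lam, tau, variances, weights

    out = to_constrained(0.0, -2.0, 0.0, -1.0, np.zeros((3, 2)), np.zeros(3))
    print(out[0], out[5])   # H_hat = 0.5, uniform component weights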

Although the proposed algorithm can work without the data selection procedure, its inclusion improves training efficiency and effectiveness. In a previous work we conducted handwritten digit recognition experiments on the MNIST database using the original Soft-MMP with and without the data selection procedure. The experimental results show that the use of data selection leads to better training efficiency and generalization ability, where the latter is measured as the ratio of the recognition rate on the test set to that on the training set. On the same computation platform, the training time was decreased from 7805 seconds to 3549 seconds, while the generalization ability was also increased.

4.2.1. Comparisons of model structure selection methods

Our discriminative model structure selection method is compared with the manual setting method and three generative counterparts, including BIC [10], MDL [7] and AutoClass [23]. The GMM structures selected by the automatic methods vary with the digit classes. We list the GMM structure selection results on the CENPARMI and MNIST databases in Tables 5 and 6, respectively. As shown in Table 5, the number of GMM components computed by our method averages around 4 for the CENPARMI database, so we perform three tests of manual setting on the CENPARMI database in which 3 to 5 GMM components are assigned to all the digit classes, respectively. For the MNIST database, 6 to 8 GMM components are considered in the manual setting tests for the same reason.

Table 5. The GMM structure selected by BIC, MDL, AutoClass and our method for each digit class on the CENPARMI database (rows: BIC, MDL, AutoClass, Our; columns: the ten digit classes and the average number of components).

Table 6. The GMM structure selected by BIC, MDL, AutoClass and our method for each digit class on the MNIST database (rows: BIC, MDL, AutoClass, Our; columns: the ten digit classes and the average number of components).

Besides structure selection, parameter estimation is another important problem in GMM modeling. In our method, structure selection and parameter estimation are completed simultaneously in a discriminative manner, whereas the generative EM algorithm is used to estimate the parameters in the original BIC, MDL and AutoClass. In order to compare these three model structure selection methods fairly with ours, the Soft-MMP discriminative learning algorithm is used to revise the parameters obtained from the original BIC, MDL and AutoClass, respectively.

Based on the three model structures set manually and the four model structures determined automatically, we get seven digit classifiers for the CENPARMI and MNIST databases, respectively. Handwritten digit recognition is then performed with each of these seven classifiers on the training set and on the test set. Table 7 shows the corresponding error rates on the training set (Train) and the test set (Test) for the CENPARMI database, and Table 8 for the MNIST database. The generalization ability of each classifier is further measured as the ratio of the recognition rate on the test set to that on the training set; it is denoted as Test/Train in Tables 7-8, and a larger ratio means better generalization ability. In Tables 7-8 we also list the reduction in the error rate on the test set (Reduction_test) brought by our method compared with the other methods.

As shown in Tables 7-8, our discriminative method of GMM structure selection achieves better results than the manual method as well as the generative counterparts. Compared with BIC, MDL and AutoClass, our method brings 7.78%, 40.9% and 38.0% reductions in the error rate on the CENPARMI test set and 7.40%, 3.9% and 5.87% reductions in the error rate on the MNIST test set, respectively. Furthermore, our method improves the generalization ability over BIC (0.99), MDL (0.99) and AutoClass on the CENPARMI database, and over BIC, MDL and AutoClass on the MNIST database.
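
For clarity, the two comparison measures used in Tables 7-8 can be computed as in the following sketch; the numbers in the example calls are illustrative, not values taken from the tables.

    # Minimal sketch: relative reduction in the test error rate, and the
    # generalization ability as the test/train recognition-rate ratio.
    def error_reduction(err_other, err_ours):
        return 100.0 * (err_other - err_ours) / err_other   # in percent

    def generalization_ability(err_train, err_test):
        return (100.0 - err_test) / (100.0 - err_train)      # recognition-rate ratio

    print(error_reduction(0.90, 0.65))          # e.g. ~27.8% reduction
    print(generalization_ability(0.10, 0.65))   # < 1 when the model overfits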

Table 7. Error rates from the four automatic methods and the manual method of structure selection on the CENPARMI database, where 3-Components to 5-Components mean that 3 to 5 GMM components are manually assigned to all the digit classes, respectively (columns: Train (%), Test (%), Reduction_test (%), Test/Train; rows: 3-Components, 4-Components, 5-Components, BIC, MDL, AutoClass, Our).

Table 8. Error rates from the four automatic methods and the manual method of structure selection on the MNIST database, where 6-Components to 8-Components mean that 6 to 8 GMM components are manually assigned to all the digit classes, respectively (columns: Train (%), Test (%), Reduction_test (%), Test/Train; rows: 6-Components, 7-Components, 8-Components, BIC, MDL, AutoClass, Our).

4.2.2. Comparisons to the state-of-the-art digit classifiers

The handwritten digit recognition rates achieved by our method are further compared with the state of the art on the CENPARMI and MNIST databases, respectively.

(1) CENPARMI database

In the papers of Liu et al. [24-25], state-of-the-art techniques of handwritten digit recognition, including features and classifiers, are thoroughly investigated on the CENPARMI database. They reported their best error rate of 0.95% on the CENPARMI test set for e-grg features, obtained by either SVM with RBF kernel or DLQDF. They also reported the overall best error rate of 0.85% on the CENPARMI test set, which comes from the 8-direction deslant chaincode feature (des) instead of e-grg [25]. Our digit classifier achieves a better result based on e-grg features, i.e. an error rate of 0.65% on the test set. Furthermore, we collect other up-to-date results on the CENPARMI database and compare them with ours in Table 9. It shows that the digit classifier trained by the proposed GMM structure selection method experimentally outperforms the other counterparts.

Table 9. Error rates of various up-to-date digit classifiers on the CENPARMI database.

Classification Method | Feature | Test (%)
Modular Neural Network [26] | class-dependent features | .5
Local Learning Framework [27] | 3-direction gradient features | .90
Neural Network [28] | random features | .70
Virtual SVM [29] | 3-direction gradient features | .30
SVC-rbf [24] | e-grg | 0.95
SVC-rbf [25] | des | 0.85
Our | e-grg | 0.65

(2) MNIST database

Liu et al. [24-25] also compared state-of-the-art techniques of handwritten digit recognition on the MNIST database. They reported the best error rate of 0.42% on the test set for e-grg features by using SVM with RBF kernel. Our method achieves the comparable error rate of 0.53% using the same features. Furthermore, we also collect other up-to-date results on the MNIST database and compare them with ours in Table 10. It shows that our method outperforms most of the state-of-the-art techniques and is comparable to the currently best ones.

Table 10. Error rates of various up-to-date digit classifiers on the MNIST database.

Classification Method | Feature | Test (%)
Convolutional Net LeNet [22] | subsampling | .7
Polynomial SVM [36] | 3-direction gradient features | .4
Boosted LeNet-4 [37] | subsampling | 0.70
Large Convolutional Net [38] | unsupervised features | 0.6
SVM [39] | vision-based features | 0.59
SVM [40] | trainable feature | 0.54
K-NN [41] | shiftable edges | 0.5
VSV [29] | 3-direction gradient features | 0.44
SVC-rbf [24] | e-grg | 0.42
Large Convolutional Net [30] | trainable feature | 0.39
Our | e-grg | 0.53

5. Conclusions

In this paper, a discriminative structure selection method for Gaussian Mixture Models (GMMs) has been proposed, based on the Bayesian structure selection framework and a discriminative learning criterion of Bayesian classifiers called Soft target based Max-Min posterior Pseudo-probabilities (Soft-MMP). Our main contribution is to tailor and integrate the Soft-MMP objective function into the Bayesian model structure selection framework with Laplace's approximation. The resulting model structure selection criterion is the maximum value of Laplace's approximation of the integrated Soft-MMP function. By developing a line search algorithm to find this maximum value, we simultaneously determine the structure of, and the parameters in, the optimal GMM.

The proposed GMM structure selection method was tested on handwritten digit recognition tasks. The experiments were conducted on the well-known CENPARMI and MNIST handwritten digit databases. Our method experimentally outperforms the manual setting method and the generative counterparts, including the Bayesian Information Criterion (BIC), Minimum Description Length (MDL) and AutoClass, both in recognition accuracy and in generalization ability. Furthermore, to our best knowledge, the handwritten digit classifier trained by our method achieves the best recognition rate so far on the CENPARMI database and a result comparable to the currently best ones on the MNIST database.

The advantages of the proposed method are three-fold: (1) the discriminative criterion of structure selection is directly related to the classification loss, so the method can work well on small data sets;

(2) by using the line search strategy instead of the commonly used exhaustive search strategy, the method is suitable for large-scale structure selection problems; and (3) with the help of the data selection scheme, the computation is tractable even for training on large data sets. However, the proposed method puts more emphasis on the training data which are confused with each other, so its robustness to noisy data seems inferior to that of the generative counterparts. In the future, we will evaluate the effectiveness of the proposed method in more applications, on more databases, and for other finite mixture models besides GMMs.

Acknowledgements

This research was supported by the National Natural Science Foundation of China, the Program for New Century Excellent Talents in University of China, and the Excellent Young Scholars Research Fund of Beijing Institute of Technology (Grant No. 2008YS03). The authors would like to thank Dr. Cheng-Lin Liu for providing the e-grg features of the handwritten digits, and the referees for their valuable comments and suggestions.

References

[1] M.A.T. Figueiredo, A.K. Jain, Unsupervised learning of finite mixture models, IEEE Trans. Pattern Analysis and Machine Intelligence 24(3) (2002).
[2] G.J. McLachlan, On bootstrapping the likelihood ratio test statistic for the number of components in a normal mixture, J. Royal Statistical Society C 36 (1987).
[3] P. Smyth, Model selection for probabilistic clustering using cross-validated likelihood, Statistics and Computing 10(1) (2000).
[4] H. Bensmail, G. Celeux, A. Raftery, C. Robert, Inference in model-based cluster analysis, Statistics and Computing 7 (1997) 1-10.
[5] H. Bozdogan, S.L. Sclove, Multi-sample cluster analysis using Akaike's information criterion, Annals of the Institute of Statistical Mathematics 36 (1984).
[6] C. Wallace, D. Dowe, Minimum message length and Kolmogorov complexity, Computer Journal 42(4) (1999).
[7] J.J. Rissanen, Information and Complexity in Statistical Modeling, Springer-Verlag, New York, 2007.
[8] C.E. Rasmussen, The infinite Gaussian mixture model, Advances in Neural Information Processing Systems (2000).
[9] H. Attias, Inferring parameters and structure of latent variable models by variational Bayes, in: Proceedings of the Fifteenth Conference on Uncertainty in Artificial Intelligence (1999).
[10] G. Schwarz, Estimating the dimension of a model, Annals of Statistics 6 (1978).
[11] D.J.C. MacKay, Choice of basis for Laplace approximation, Machine Learning 33(1) (1998).
[12] A. Corduneanu, C.M. Bishop, Variational Bayesian model selection for mixture distributions, in: Artificial Intelligence and Statistics (T. Jaakkola and T. Richardson, eds.), Morgan Kaufmann (2001).
[13] S.J. Roberts, D. Husmeier, I. Rezek, W. Penny, Bayesian approaches to Gaussian mixture modeling, IEEE Trans. Pattern Analysis and Machine Intelligence 20(11) (1998).
[14] B.H. Juang, W. Chou, C.H. Lee, Minimum classification error rate methods for speech recognition, IEEE Trans. Speech and Audio Processing 5(3) (1997).
[15] R. Nopsuwanchai, A. Biem, W.F. Clocksin, Maximization of mutual information for offline Thai handwriting recognition, IEEE Trans. Pattern Analysis and Machine Intelligence 28(8) (2006).
[16] D. Povey, P.C. Woodland, Minimum phone error and I-smoothing for improved discriminative training, in: Proc. ICASSP (2002).

[17] X.B. Liu, Y.D. Jia, X.F. Chen, Y. Deng, H. Fu, Image classification using the Max-Min posterior Pseudo-probabilities method, Technical Report BIT-CS, Beijing Institute of Technology, http://mcslab.cs.bit.edu.cn/member/xiabi/papers/2008_.pdf, 2008.
[18] A. Klautau, N. Jevtic, A. Orlitsky, Discriminative Gaussian mixture models: a comparison with kernel classifiers, in: Proceedings of the Twentieth International Conference on Machine Learning (ICML 2003), Washington DC, 2003.
[19] X.Y. Liu, M.J.F. Gales, Automatic model complexity control using marginalized discriminative growth functions, IEEE Trans. Audio, Speech, and Language Processing 15(4) (2007).
[20] X.F. Chen, X.B. Liu, Y.D. Jia, A soft target method of learning posterior pseudo-probabilities based classifiers with its application to handwritten digit recognition, in: Proceedings of the 11th International Conference on Frontiers in Handwriting Recognition, 2008.
[21] C.Y. Suen, et al., Computer recognition of unconstrained handwritten numerals, Proc. IEEE 80(7) (1992).
[22] Y. LeCun, et al., Comparison of learning algorithms for handwritten digit recognition, in: Proceedings of the International Conference on Artificial Neural Networks, Nanterre, France (1995).
[23] P. Cheeseman, J. Stutz, Bayesian classification (AutoClass): theory and results, in: Advances in Knowledge Discovery and Data Mining, AAAI Press, Menlo Park, CA, USA, 1996.
[24] C.L. Liu, K. Nakashima, H. Sako, H. Fujisawa, Handwritten digit recognition: benchmarking of state-of-the-art techniques, Pattern Recognition 36 (2003).
[25] C.L. Liu, K. Nakashima, H. Sako, H. Fujisawa, Handwritten digit recognition: investigation of normalization and feature extraction techniques, Pattern Recognition 37 (2004).
[26] I.S. Oh, J.S. Lee, C.Y. Suen, Analysis of class separation and combination of class-dependent features for handwriting recognition, IEEE Trans. Pattern Analysis and Machine Intelligence 21(10) (1999).
[27] J.X. Dong, A. Krzyzak, C.Y. Suen, Local learning framework for handwritten character recognition, Engineering Applications of Artificial Intelligence 15 (2002).
[28] P.D. Gader, M.A. Khabou, Automatic feature generation for handwritten digit recognition, IEEE Trans. Pattern Analysis and Machine Intelligence 18(12) (1996).
[29] J.X. Dong, A. Krzyzak, C.Y. Suen, Fast SVM training algorithm with decomposition on very large data sets, IEEE Trans. Pattern Analysis and Machine Intelligence 27(4) (2005).
[30] M.A. Ranzato, C. Poultney, S. Chopra, Y. LeCun, Efficient learning of sparse representations with an energy-based model, in: Advances in Neural Information Processing Systems, MIT Press, 2006.
[31] G. Lugosi, M. Pawlak, On the posterior-probability estimate of the error rate of nonparametric classification rules, IEEE Trans. Information Theory 40(2) (1994).

[32] G.J. McLachlan, S.K. Ng, A comparison of some information criteria for the number of components in a mixture model, Technical Report, Department of Mathematics, University of Queensland, 2000.
[33] E. Polak, Optimization: Algorithms and Consistent Approximations, Springer-Verlag, New York, USA, 1997.
[34] R. Zhang, X.Q. Ding, Offline handwritten numeral recognition using orthogonal Gaussian mixture model, in: Proceedings of the 6th International Conference on Document Analysis and Recognition, Seattle, USA (2001).
[35] P. Baldi, Gradient descent learning algorithm overview: a general dynamical systems perspective, IEEE Transactions on Neural Networks 6(1) (1995).
[36] C.J.C. Burges, B. Scholkopf, Improving the accuracy and speed of support vector learning machines, in: Advances in Neural Information Processing Systems, MIT Press (1997).
[37] Y. LeCun, L. Bottou, Y. Bengio, P. Haffner, Gradient-based learning applied to document recognition, Proceedings of the IEEE 86(11) (1998).
[38] M.A. Ranzato, F.J. Huang, Y.L. Boureau, Y. LeCun, Unsupervised learning of invariant feature hierarchies with applications to object recognition, in: Proc. IEEE Conference on Computer Vision and Pattern Recognition (2007).
[39] L.N. Teow, K.F. Loe, Robust vision-based features and classification schemes for off-line handwritten digit recognition, Pattern Recognition 35(11) (2002).
[40] F. Lauer, C.Y. Suen, G. Bloch, A trainable feature extractor for handwritten digit recognition, Pattern Recognition 40 (2007).
[41] D. Keysers, T. Deselaers, C. Gollan, H. Ney, Deformation models for image recognition, IEEE Trans. Pattern Analysis and Machine Intelligence 29(8) (2007).



More information

Dual Approximate Dynamic Programming for Large Scale Hydro Valleys

Dual Approximate Dynamic Programming for Large Scale Hydro Valleys Dual Approxmae Dynamc Programmng for Large Scale Hydro Valleys Perre Carpener and Jean-Phlppe Chanceler 1 ENSTA ParsTech and ENPC ParsTech CMM Workshop, January 2016 1 Jon work wh J.-C. Alas, suppored

More information

MANY real-world applications (e.g. production

MANY real-world applications (e.g. production Barebones Parcle Swarm for Ineger Programmng Problems Mahamed G. H. Omran, Andres Engelbrech and Ayed Salman Absrac The performance of wo recen varans of Parcle Swarm Opmzaon (PSO) when appled o Ineger

More information

Li An-Ping. Beijing , P.R.China

Li An-Ping. Beijing , P.R.China A New Type of Cpher: DICING_csb L An-Png Bejng 100085, P.R.Chna apl0001@sna.com Absrac: In hs paper, we wll propose a new ype of cpher named DICING_csb, whch s derved from our prevous sream cpher DICING.

More information

Time-interval analysis of β decay. V. Horvat and J. C. Hardy

Time-interval analysis of β decay. V. Horvat and J. C. Hardy Tme-nerval analyss of β decay V. Horva and J. C. Hardy Work on he even analyss of β decay [1] connued and resuled n he developmen of a novel mehod of bea-decay me-nerval analyss ha produces hghly accurae

More information

Forecasting customer behaviour in a multi-service financial organisation: a profitability perspective

Forecasting customer behaviour in a multi-service financial organisation: a profitability perspective Forecasng cusomer behavour n a mul-servce fnancal organsaon: a profably perspecve A. Audzeyeva, Unversy of Leeds & Naonal Ausrala Group Europe, UK B. Summers, Unversy of Leeds, UK K.R. Schenk-Hoppé, Unversy

More information

The Finite Element Method for the Analysis of Non-Linear and Dynamic Systems

The Finite Element Method for the Analysis of Non-Linear and Dynamic Systems Swss Federal Insue of Page 1 The Fne Elemen Mehod for he Analyss of Non-Lnear and Dynamc Sysems Prof. Dr. Mchael Havbro Faber Dr. Nebojsa Mojslovc Swss Federal Insue of ETH Zurch, Swzerland Mehod of Fne

More information

Volatility Interpolation

Volatility Interpolation Volaly Inerpolaon Prelmnary Verson March 00 Jesper Andreasen and Bran Huge Danse Mares, Copenhagen wan.daddy@danseban.com brno@danseban.com Elecronc copy avalable a: hp://ssrn.com/absrac=69497 Inro Local

More information

Bayesian Inference of the GARCH model with Rational Errors

Bayesian Inference of the GARCH model with Rational Errors 0 Inernaonal Conference on Economcs, Busness and Markeng Managemen IPEDR vol.9 (0) (0) IACSIT Press, Sngapore Bayesan Inference of he GARCH model wh Raonal Errors Tesuya Takash + and Tng Tng Chen Hroshma

More information

Notes on the stability of dynamic systems and the use of Eigen Values.

Notes on the stability of dynamic systems and the use of Eigen Values. Noes on he sabl of dnamc ssems and he use of Egen Values. Source: Macro II course noes, Dr. Davd Bessler s Tme Seres course noes, zarads (999) Ineremporal Macroeconomcs chaper 4 & Techncal ppend, and Hamlon

More information

Kernel-Based Bayesian Filtering for Object Tracking

Kernel-Based Bayesian Filtering for Object Tracking Kernel-Based Bayesan Flerng for Objec Trackng Bohyung Han Yng Zhu Dorn Comancu Larry Davs Dep. of Compuer Scence Real-Tme Vson and Modelng Inegraed Daa and Sysems Unversy of Maryland Semens Corporae Research

More information

SOME NOISELESS CODING THEOREMS OF INACCURACY MEASURE OF ORDER α AND TYPE β

SOME NOISELESS CODING THEOREMS OF INACCURACY MEASURE OF ORDER α AND TYPE β SARAJEVO JOURNAL OF MATHEMATICS Vol.3 (15) (2007), 137 143 SOME NOISELESS CODING THEOREMS OF INACCURACY MEASURE OF ORDER α AND TYPE β M. A. K. BAIG AND RAYEES AHMAD DAR Absrac. In hs paper, we propose

More information

Lecture Slides for INTRODUCTION TO. Machine Learning. ETHEM ALPAYDIN The MIT Press,

Lecture Slides for INTRODUCTION TO. Machine Learning. ETHEM ALPAYDIN The MIT Press, Lecure Sldes for INTRDUCTIN T Machne Learnng ETHEM ALAYDIN The MIT ress, 2004 alpaydn@boun.edu.r hp://www.cmpe.boun.edu.r/~ehem/2ml CHATER 3: Hdden Marov Models Inroducon Modelng dependences n npu; no

More information

Anisotropic Behaviors and Its Application on Sheet Metal Stamping Processes

Anisotropic Behaviors and Its Application on Sheet Metal Stamping Processes Ansoropc Behavors and Is Applcaon on Shee Meal Sampng Processes Welong Hu ETA-Engneerng Technology Assocaes, Inc. 33 E. Maple oad, Sue 00 Troy, MI 48083 USA 48-79-300 whu@ea.com Jeanne He ETA-Engneerng

More information

Online Supplement for Dynamic Multi-Technology. Production-Inventory Problem with Emissions Trading

Online Supplement for Dynamic Multi-Technology. Production-Inventory Problem with Emissions Trading Onlne Supplemen for Dynamc Mul-Technology Producon-Invenory Problem wh Emssons Tradng by We Zhang Zhongsheng Hua Yu Xa and Baofeng Huo Proof of Lemma For any ( qr ) Θ s easy o verfy ha he lnear programmng

More information

Ordinary Differential Equations in Neuroscience with Matlab examples. Aim 1- Gain understanding of how to set up and solve ODE s

Ordinary Differential Equations in Neuroscience with Matlab examples. Aim 1- Gain understanding of how to set up and solve ODE s Ordnary Dfferenal Equaons n Neuroscence wh Malab eamples. Am - Gan undersandng of how o se up and solve ODE s Am Undersand how o se up an solve a smple eample of he Hebb rule n D Our goal a end of class

More information

The Analysis of the Thickness-predictive Model Based on the SVM Xiu-ming Zhao1,a,Yan Wang2,band Zhimin Bi3,c

The Analysis of the Thickness-predictive Model Based on the SVM Xiu-ming Zhao1,a,Yan Wang2,band Zhimin Bi3,c h Naonal Conference on Elecrcal, Elecroncs and Compuer Engneerng (NCEECE The Analyss of he Thcknesspredcve Model Based on he SVM Xumng Zhao,a,Yan Wang,band Zhmn B,c School of Conrol Scence and Engneerng,

More information

Hidden Markov Models

Hidden Markov Models 11-755 Machne Learnng for Sgnal Processng Hdden Markov Models Class 15. 12 Oc 2010 1 Admnsrva HW2 due Tuesday Is everyone on he projecs page? Where are your projec proposals? 2 Recap: Wha s an HMM Probablsc

More information

Hidden Markov Models Following a lecture by Andrew W. Moore Carnegie Mellon University

Hidden Markov Models Following a lecture by Andrew W. Moore Carnegie Mellon University Hdden Markov Models Followng a lecure by Andrew W. Moore Carnege Mellon Unversy www.cs.cmu.edu/~awm/uorals A Markov Sysem Has N saes, called s, s 2.. s N s 2 There are dscree meseps, 0,, s s 3 N 3 0 Hdden

More information

Forecasting Using First-Order Difference of Time Series and Bagging of Competitive Associative Nets

Forecasting Using First-Order Difference of Time Series and Bagging of Competitive Associative Nets Forecasng Usng Frs-Order Dfference of Tme Seres and Baggng of Compeve Assocave Nes Shuch Kurog, Ryohe Koyama, Shnya Tanaka, and Toshhsa Sanuk Absrac Ths arcle descrbes our mehod used for he 2007 Forecasng

More information

Comparison of Differences between Power Means 1

Comparison of Differences between Power Means 1 In. Journal of Mah. Analyss, Vol. 7, 203, no., 5-55 Comparson of Dfferences beween Power Means Chang-An Tan, Guanghua Sh and Fe Zuo College of Mahemacs and Informaon Scence Henan Normal Unversy, 453007,

More information

Bernoulli process with 282 ky periodicity is detected in the R-N reversals of the earth s magnetic field

Bernoulli process with 282 ky periodicity is detected in the R-N reversals of the earth s magnetic field Submed o: Suden Essay Awards n Magnecs Bernoull process wh 8 ky perodcy s deeced n he R-N reversals of he earh s magnec feld Jozsef Gara Deparmen of Earh Scences Florda Inernaonal Unversy Unversy Park,

More information

Tight results for Next Fit and Worst Fit with resource augmentation

Tight results for Next Fit and Worst Fit with resource augmentation Tgh resuls for Nex F and Wors F wh resource augmenaon Joan Boyar Leah Epsen Asaf Levn Asrac I s well known ha he wo smple algorhms for he classc n packng prolem, NF and WF oh have an approxmaon rao of

More information

FACIAL IMAGE FEATURE EXTRACTION USING SUPPORT VECTOR MACHINES

FACIAL IMAGE FEATURE EXTRACTION USING SUPPORT VECTOR MACHINES FACIAL IMAGE FEATURE EXTRACTION USING SUPPORT VECTOR MACHINES H. Abrsham Moghaddam K. N. Toos Unversy of Technology, P.O. Box 635-355, Tehran, Iran moghadam@saba.knu.ac.r M. Ghayoum Islamc Azad Unversy,

More information

Linear Response Theory: The connection between QFT and experiments

Linear Response Theory: The connection between QFT and experiments Phys540.nb 39 3 Lnear Response Theory: The connecon beween QFT and expermens 3.1. Basc conceps and deas Q: ow do we measure he conducvy of a meal? A: we frs nroduce a weak elecrc feld E, and hen measure

More information

Reactive Methods to Solve the Berth AllocationProblem with Stochastic Arrival and Handling Times

Reactive Methods to Solve the Berth AllocationProblem with Stochastic Arrival and Handling Times Reacve Mehods o Solve he Berh AllocaonProblem wh Sochasc Arrval and Handlng Tmes Nsh Umang* Mchel Berlare* * TRANSP-OR, Ecole Polyechnque Fédérale de Lausanne Frs Workshop on Large Scale Opmzaon November

More information

Neural Networks-Based Time Series Prediction Using Long and Short Term Dependence in the Learning Process

Neural Networks-Based Time Series Prediction Using Long and Short Term Dependence in the Learning Process Neural Neworks-Based Tme Seres Predcon Usng Long and Shor Term Dependence n he Learnng Process J. Puchea, D. Paño and B. Kuchen, Absrac In hs work a feedforward neural neworksbased nonlnear auoregresson

More information

Proceedings of the Twenty-Second International Joint Conference on Artificial Intelligence

Proceedings of the Twenty-Second International Joint Conference on Artificial Intelligence Proceedngs of he weny-second Inernaonal Jon Conference on Arfcal Inellgence l, -Norm Regularzed Dscrmnave Feaure Selecon for Unsupervsed Learnng Y Yang, Heng ao Shen, Zhgang Ma, Z Huang, Xaofang Zhou School

More information

Lecture 18: The Laplace Transform (See Sections and 14.7 in Boas)

Lecture 18: The Laplace Transform (See Sections and 14.7 in Boas) Lecure 8: The Lalace Transform (See Secons 88- and 47 n Boas) Recall ha our bg-cure goal s he analyss of he dfferenal equaon, ax bx cx F, where we emloy varous exansons for he drvng funcon F deendng on

More information

Graduate Macroeconomics 2 Problem set 5. - Solutions

Graduate Macroeconomics 2 Problem set 5. - Solutions Graduae Macroeconomcs 2 Problem se. - Soluons Queson 1 To answer hs queson we need he frms frs order condons and he equaon ha deermnes he number of frms n equlbrum. The frms frs order condons are: F K

More information

A Novel Object Detection Method Using Gaussian Mixture Codebook Model of RGB-D Information

A Novel Object Detection Method Using Gaussian Mixture Codebook Model of RGB-D Information A Novel Objec Deecon Mehod Usng Gaussan Mxure Codebook Model of RGB-D Informaon Lujang LIU 1, Gaopeng ZHAO *,1, Yumng BO 1 1 School of Auomaon, Nanjng Unversy of Scence and Technology, Nanjng, Jangsu 10094,

More information

Improved Classification Based on Predictive Association Rules

Improved Classification Based on Predictive Association Rules Proceedngs of he 009 IEEE Inernaonal Conference on Sysems, Man, and Cybernecs San Anono, TX, USA - Ocober 009 Improved Classfcaon Based on Predcve Assocaon Rules Zhxn Hao, Xuan Wang, Ln Yao, Yaoyun Zhang

More information

Video-Based Face Recognition Using Adaptive Hidden Markov Models

Video-Based Face Recognition Using Adaptive Hidden Markov Models Vdeo-Based Face Recognon Usng Adapve Hdden Markov Models Xaomng Lu and suhan Chen Elecrcal and Compuer Engneerng, Carnege Mellon Unversy, Psburgh, PA, 523, U.S.A. xaomng@andrew.cmu.edu suhan@cmu.edu Absrac

More information

UNIVERSITAT AUTÒNOMA DE BARCELONA MARCH 2017 EXAMINATION

UNIVERSITAT AUTÒNOMA DE BARCELONA MARCH 2017 EXAMINATION INTERNATIONAL TRADE T. J. KEHOE UNIVERSITAT AUTÒNOMA DE BARCELONA MARCH 27 EXAMINATION Please answer wo of he hree quesons. You can consul class noes, workng papers, and arcles whle you are workng on he

More information

An Effective TCM-KNN Scheme for High-Speed Network Anomaly Detection

An Effective TCM-KNN Scheme for High-Speed Network Anomaly Detection Vol. 24, November,, 200 An Effecve TCM-KNN Scheme for Hgh-Speed Nework Anomaly eecon Yang L Chnese Academy of Scences, Bejng Chna, 00080 lyang@sofware.c.ac.cn Absrac. Nework anomaly deecon has been a ho

More information

Using Fuzzy Pattern Recognition to Detect Unknown Malicious Executables Code

Using Fuzzy Pattern Recognition to Detect Unknown Malicious Executables Code Usng Fuzzy Paern Recognon o Deec Unknown Malcous Execuables Code Boyun Zhang,, Janpng Yn, and Jngbo Hao School of Compuer Scence, Naonal Unversy of Defense Technology, Changsha 40073, Chna hnxzby@yahoo.com.cn

More information

Performance Analysis for a Network having Standby Redundant Unit with Waiting in Repair

Performance Analysis for a Network having Standby Redundant Unit with Waiting in Repair TECHNI Inernaonal Journal of Compung Scence Communcaon Technologes VOL.5 NO. July 22 (ISSN 974-3375 erformance nalyss for a Nework havng Sby edundan Un wh ang n epar Jendra Sngh 2 abns orwal 2 Deparmen

More information