A Hybrid Learning Algorithm for Locally Recurrent Neural Networks


Contemporary Engineering Sciences, Vol. 11, 2018, no. 1, 1-13
HIKARI Ltd

A Hybrid Learning Algorithm for Locally Recurrent Neural Networks

Dimitris Varsamis and Evangelos Outsos
Department of Informatics Engineering
Technological Educational Institute of Central Macedonia - Serres
62124, Serres, Greece

Paris Mastorocostas
Department of Computer Systems Engineering
Piraeus University of Applied Sciences, 12244, Egaleo, Greece

Copyright (c) 2018 Dimitris Varsamis, Evangelos Outsos and Paris Mastorocostas. This article is distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

In this work a fast and efficient training method for block-diagonal recurrent neural networks is proposed. The method modifies and extends the Simulated Annealing RPROP algorithm, originally developed for static models, by taking into consideration the architectural characteristics and the temporal nature of this category of recurrent neural models. The performance of the proposed algorithm is evaluated through a comparative analysis with a series of algorithms and recurrent models.

Keywords: block-diagonal recurrent neural network, internal feedback, resilient back-propagation, simulated annealing, ordered derivatives

1 Introduction

Recurrent neural networks have become a popular research field of Computational Intelligence during the last decades. Due to their temporal capabilities, they

have been extensively employed in real-world applications like system identification and pattern recognition. The locally recurrent neural networks with internal feedback connections constitute a special subclass, where the feedback interlinks are limited exclusively to neighbouring neurons. Thus, these neural models have significantly reduced complexity with respect to fully recurrent networks [13]. A special subclass of locally recurrent networks is the Diagonal Recurrent Neural Network (DRNN) [2], where there are no interlinks among neurons in the hidden layer. A modified DRNN is the Block-Diagonal Recurrent Neural Network (BDRNN) [10], where dynamics is introduced between pairs of neurons in the hidden layer. The BDRNNs have proved to be an efficient modelling tool [4], [5], [11].

Due to the temporal relations of the BDRNN, a parameter optimization method should unfold in time. Most of the time these temporal relations are neglected and parameter learning is attempted by considering only a few previous time steps. The Back Propagation Through Time algorithm (BPTT) [7] is the most common algorithm for training BDRNNs. However, BPTT exhibits two major disadvantages: (a) it shows a low speed of convergence, and (b) most often it becomes trapped in local minima of the error surface. In order to overcome the above failings of gradient-based methods like BPTT, the Resilient Propagation algorithm (RPROP, [8]) has proved to be one of the best performing learning methods for static neural networks [1]. However, in RPROP the problem of poor convergence to local minima is not fully eliminated. Hence, in an attempt to alleviate this drawback, a hybrid scheme combining the global search technique of Simulated Annealing (SA) and RPROP was introduced in [12]. The resulting algorithm, named SARPROP, proved to be an efficient learning method for static neural networks. Stemming from the above developments in training static neural models, this work proposes an extension of the standard SARPROP method that takes into consideration the temporal relations existing in locally recurrent neural networks.
Since the algorithm is adapted to the special features of the BDRNN, it is entitled Hybrid Learning Algorithm for BDRNNs (HLA-BDRNN). The rest of this paper is organized as follows: In Section 2 the structure and characteristics of the BDRNN are illustrated. The learning algorithm is developed in Section 3. In Section 4 a comparative analysis of the proposed method with other learning schemes and recurrent models is conducted. The paper concludes with a brief discussion of the proposed method.

2 The Block-Diagonal Recurrent Neural Network

The BDRNN is a special case of recurrent neural networks. It is not fully connected and it belongs to the class of locally-recurrent-globally-feedforward neural networks [13]. It consists of two layers, where the output layer is static and the hidden layer is dynamic. The hidden layer consists of pairs of neurons (blocks); there are feedback connections between the neurons of each pair, introducing dynamics to the network. For the sake of simplicity, a single-input-single-output BDRNN with four blocks of neurons is shown in Figure 1.

Figure 1: Configuration of a BDRNN with four blocks of neurons in the hidden layer

A BDRNN with m inputs, r outputs, and N neurons at the hidden layer operates according to the following state equations:

x(k) = f_a( W x(k-1) + B u(k) )   (1a)

y(k) = f_b( C x(k) )   (1b)

where f_a, f_b are the neuron activation functions of the hidden and the output layers, respectively. In the following, the activation functions are both chosen to be the sigmoid function

f(z) = (1 - e^{-a_n z}) / (1 + e^{-a_n z}).
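One time step of (1a)-(1b) can be sketched in code as follows (a minimal illustration; the function names are mine, and the slope a_n = 2 anticipates the experimental setting of Section 4):

```python
import numpy as np

def f_sigmoid(z, a_n=2.0):
    """Bipolar sigmoid f(z) = (1 - e^{-a_n z}) / (1 + e^{-a_n z});
    algebraically identical to tanh(a_n * z / 2)."""
    return (1.0 - np.exp(-a_n * z)) / (1.0 + np.exp(-a_n * z))

def bdrnn_step(x_prev, u, W, B, C):
    """One time step of the BDRNN state equations:
    x(k) = f_a(W x(k-1) + B u(k))   (1a)
    y(k) = f_b(C x(k))              (1b)"""
    x = f_sigmoid(W @ x_prev + B @ u)
    y = f_sigmoid(C @ x)
    return x, y

# SISO example: m = r = 1, N = 8 hidden neurons (four blocks).
rng = np.random.default_rng(1)
W = np.zeros((8, 8))             # the block-diagonal structure of W is specified below
B = rng.uniform(-0.5, 0.5, (8, 1))
C = rng.uniform(-0.5, 0.5, (1, 8))
x, y = bdrnn_step(np.zeros(8), np.array([0.7]), W, B, C)
assert x.shape == (8,) and y.shape == (1,) and np.all(np.abs(y) < 1)
assert np.allclose(f_sigmoid(np.array([0.0])), 0.0)   # f(0) = 0
```

With a_n = 2 the activation reduces to tanh(z), so all hidden and output values stay strictly inside (-1, 1).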

u(k) = [u_i(k)] is an m-element input vector, with k being the time variable. x(k) = [x_i(k)] is an N-element vector, comprising the outputs of the hidden layer. In particular, x_i(k) is the output of the i-th hidden neuron at time k. y(k) = [y_i(k)] is an r-element output vector. B = [b_{i,j}] and C = [c_{i,j}] are the N x m and r x N input and output weight matrices, respectively. W = [w_{i,j}] is the N x N block-diagonal feedback matrix. In particular,

w_{i,j}  { != 0,  if i = j
         { != 0,  if j = i + 1 and i is odd
         { != 0,  if j = i - 1 and i is even
         { = 0,   otherwise                      (2)

The feedback matrix, W, is block-diagonal:

W = diag( W^{(1)}, ..., W^{(N/2)} )   (3)

Each diagonal element, corresponding to a block of recurrent neurons, is a block sub-matrix of the form:

W^{(i)} = [ w_{2i-1,2i-1}  w_{2i-1,2i} ]
          [ w_{2i,2i-1}    w_{2i,2i}   ],   i = 1, 2, ..., N/2   (4)

Equation (4) describes the general case of the BDRNN, which is called BDRNN with free-form sub-matrices. A special case of the BDRNN consists of scaled orthogonal sub-matrices of the form

W^{(i)} = [ w_{2i-1,2i-1}  w_{2i-1,2i} ]  =  [  w_i^{(1)}  w_i^{(2)} ]
          [ w_{2i,2i-1}    w_{2i,2i}   ]     [ -w_i^{(2)}  w_i^{(1)} ],   i = 1, 2, ..., N/2   (5)

From (4) and (5) it is concluded that the Free-Form BDRNN consists of feedback sub-matrices with four distinct elements and provides a greater degree of freedom compared to the Scaled Orthogonal BDRNN, which has two weights at each feedback sub-matrix. Nevertheless, as discussed in [10], the latter network exhibits superior modelling capabilities compared to the Free-Form BDRNN, and the forthcoming learning method will be developed for this network.
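The scaled orthogonal structure (3)-(5) can be assembled concretely as follows (a minimal sketch; the helper name is mine):

```python
import numpy as np

def feedback_matrix(w1, w2):
    """Assemble the N x N block-diagonal feedback matrix W of a Scaled
    Orthogonal BDRNN from the per-block weights w1[i], w2[i], eq. (5)."""
    n_blocks = len(w1)
    W = np.zeros((2 * n_blocks, 2 * n_blocks))
    for i in range(n_blocks):
        W[2 * i:2 * i + 2, 2 * i:2 * i + 2] = [[w1[i],  w2[i]],
                                               [-w2[i], w1[i]]]
    return W

W = feedback_matrix([0.3, -0.1], [0.2, 0.4])   # N = 4, two blocks
# Each scaled-orthogonal block satisfies W_i^T W_i = (w1^2 + w2^2) * I:
B0 = W[:2, :2]
assert np.allclose(B0.T @ B0, (0.3 ** 2 + 0.2 ** 2) * np.eye(2))
# Off-diagonal blocks are zero, per eq. (2)-(3):
assert np.all(W[:2, 2:] == 0) and np.all(W[2:, :2] == 0)
```

The scaled-orthogonality check makes the "two weights per block" restriction of (5) explicit: each block is a rotation scaled by sqrt(w1^2 + w2^2).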

Combining (1)-(5), the state equations for the Scaled Orthogonal BDRNN take the following form:

x_{2i-1}(k) = f_a( sum_{j=1}^{m} b_{2i-1,j} u_j(k) + w_i^{(1)} x_{2i-1}(k-1) + w_i^{(2)} x_{2i}(k-1) ),   i = 1, ..., N/2   (6a)

x_{2i}(k) = f_a( sum_{j=1}^{m} b_{2i,j} u_j(k) - w_i^{(2)} x_{2i-1}(k-1) + w_i^{(1)} x_{2i}(k-1) ),   i = 1, ..., N/2   (6b)

y_i(k) = f_b( sum_{j=1}^{N} c_{i,j} x_j(k) ),   i = 1, ..., r   (6c)

where w_i^{(1)}, w_i^{(2)} are the feedback weights at the hidden layer.

3 The HLA-BDRNN Algorithm

In gradient-based optimization methods, the weight changes are proportional to the size of the gradient of an error function E:

Delta w(t) = -mu * dE(t)/dw   (7)

where dE(t)/dw is the partial derivative of E with respect to a weight w and t represents the epoch index. When it comes to recurrent models, the common partial derivative should be substituted by the ordered partial derivative, due to the existence of temporal relations through the feedback connections of the BDRNN, as will be discussed in the sequel. The term mu in (7) is the learning rate, which in BPTT is kept fixed throughout the learning process and is common to all weight updates. Therefore, an appropriate selection of the learning rate is crucial to the evolution of the learning process and constitutes a significant constraint. The RPROP learning scheme attempts to alleviate this disadvantage of BPTT by allowing each fitting parameter to have its individual step size, which is adjusted during the learning process based on the sign of the respective partial derivative at the current and the previous epoch. Therefore, the effect of the adaptation process is not blurred by the influence of the size of the parameter gradient but is only dependent on the temporal behavior of the gradient ([8]). In particular, let dE(t)/dw_i and dE(t-1)/dw_i denote the derivatives of E with respect to w_i at the present and the preceding epochs, respectively. RPROP is described in pseudo-code as follows:

(a) For all weights w_i, initialize the step sizes Delta_i(1) = Delta_0
Repeat
  (b) For all weights w_i, compute the error gradient dE(t)/dw_i
  (c) For all weights w_i, update the step sizes:
    (c.1) If dE(t)/dw_i * dE(t-1)/dw_i > 0 Then Delta_i(t) = min( eta+ * Delta_i(t-1), Delta_max )
    (c.2) Else if dE(t)/dw_i * dE(t-1)/dw_i < 0 Then Delta_i(t) = max( eta- * Delta_i(t-1), Delta_min )
    (c.3) Else Delta_i(t) = Delta_i(t-1)
  (d) Update the weights: Delta w_i(t) = -sign( dE(t)/dw_i ) * Delta_i(t)
Until convergence

where the step sizes are bounded by Delta_min, Delta_max. The initial values of the step sizes Delta_i(1) = Delta_0 are chosen rather moderately (e.g. 0.1), since these values directly determine the sizes of the first parameter changes. The increase and attenuation factors are set to eta+ in [1.01, 1.3] and eta- in [0.5, 0.9], respectively.

The HLA-BDRNN algorithm performs the following modifications:

(i) It substitutes the error gradients with the ordered derivatives d+E(t)/dw_i ([14]), in order to take into consideration the temporal dependences existing in a dynamic model.

(ii) It modifies steps (b) and (c.2) of RPROP as shown:

(b') Compute the HLA-BDRNN error gradient: d+E(t)/dw_i + 0.01 * SA * w_i / (1 + w_i^2)^2

(c.2') Else if d+E(t)/dw_i * d+E(t-1)/dw_i < 0 Then
    If Delta_i(t-1) < 0.4 * SA^2 Then Delta_i(t) = max( eta- * Delta_i(t-1) + 0.8 * r * SA^2, Delta_min )
    Else Delta_i(t) = max( eta- * Delta_i(t-1), Delta_min )

where SA = 2^{-t*Temp} is the simulated annealing term, parameter r takes random values within the interval [0, 1] and Temp is the temperature. The modified step (c.2') aims at adding noise to the weights, according to the concept of simulated annealing, in order to increase the convergence speed of the learning process. In HLA-BDRNN noise is added to the weight update values when the error gradient changes sign in two successive epochs and the magnitude of the update value is less than a value that is proportional to the SA term. In this way, the weight update is modified by noise only when it has a relatively small value, thus allowing the weight to move out of local minima, while minimizing the disturbance to the adaptation process. In the modified step (b'), a weight decay term is added to the error gradient, as proposed in [1].
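One adaptation epoch of the scheme above can be sketched as follows (a minimal, vectorized illustration; the function name, eta+ = 1.05 and the step-size bounds are my choices, the decay and noise constants 0.01, 0.4 and 0.8 follow the SARPROP scheme of [12], and the sign of the decay term is chosen so that small weights are initially favoured):

```python
import numpy as np

def hla_step(w, grad, grad_prev, delta, t, Temp,
             eta_plus=1.05, eta_minus=0.85,
             d_min=1e-6, d_max=50.0, rng=None):
    """One adaptation epoch over a flat weight vector, combining RPROP
    steps (c.1)-(d) with the modified steps (b') and (c.2').
    grad holds the ordered derivatives d+E/dw at epoch t."""
    rng = np.random.default_rng() if rng is None else rng
    SA = 2.0 ** (-t * Temp)                               # annealing term
    g = grad + 0.01 * SA * w / (1.0 + w ** 2) ** 2        # step (b'): weight decay
    prod = g * grad_prev
    delta = delta.copy()
    inc = prod > 0                                        # (c.1): same sign
    delta[inc] = np.minimum(eta_plus * delta[inc], d_max)
    dec = prod < 0                                        # (c.2'): sign change
    noisy = dec & (delta < 0.4 * SA ** 2)                 # small step: add noise
    plain = dec & ~noisy
    r = rng.random(np.count_nonzero(noisy))
    delta[noisy] = np.maximum(eta_minus * delta[noisy] + 0.8 * r * SA ** 2, d_min)
    delta[plain] = np.maximum(eta_minus * delta[plain], d_min)
    w_new = w - np.sign(g) * delta                        # step (d)
    return w_new, delta, g

w = np.array([0.5, -0.5])
w2, delta2, g = hla_step(w, np.array([1.0, -1.0]), np.array([1.0, -1.0]),
                         np.full(2, 0.1), t=1, Temp=1.15)
assert np.allclose(delta2, 0.105)            # both gradients kept their sign
assert np.allclose(w2, [0.395, -0.395])
```

Since both example gradients keep their sign, only branch (c.1) fires here: the step sizes grow from 0.1 to 0.105 and each weight moves opposite to its gradient by that amount.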
The effect of this form of weight decay is to modify the error surface such that initially weights with lower values are favoured. As training proceeds, the magnitude of weight decay is reduced, facilitating the increase of bigger weights and allowing the model to explore regions of the error surface that were previously unavailable. Additionally, as mentioned in [1], the use of weight decay has been proved to improve the generalization capability of the

model. The adaptation mechanism described above has the advantage of correlating the step sizes not with the size of the derivatives but with their signs. Hence, whenever a parameter moves along a direction reducing E (the derivatives at successive epochs have the same sign), its step size is increased independently of the size of the derivative. In this way, the step sizes can sufficiently increase when needed, even at the final stage of the learning process when the sizes of the derivatives are rather small. Additionally, when changes in the sign of the derivative occur, the step size is diminished to prevent the error measure from oscillating.

A key issue in the case of locally recurrent networks like the BDRNN is the accurate extraction of the error gradients. In the proposed algorithm the time dependences are fully taken into consideration, and no approximation policy to a few steps back is applied. The error measure used is the Mean Squared Error (MSE), defined by

E = (1/k_f) sum_{k=1}^{k_f} sum_{i=1}^{r} [ y_i(k) - y^_i(k) ]^2   (8)

where y_i(k) is the i-th model output and y^_i(k) is the i-th desired (actual) output of the system at time step k. Contrary to static networks, the extraction of the ordered partial derivatives in the BDRNN is not straightforward and is accomplished via a set of recursive equations. In order to determine the error gradients of the dynamic part of the BDRNN, let us introduce

(a) the state vector st(k), defined as

st(k) = [ x_1(k), ..., x_N(k), y_1(k), ..., y_r(k) ]^T

comprising the outputs of the hidden and the output layers;

(b) the control vector theta comprising the synaptic and feedback weights (N(m + r + 1) weights):

theta = [ b_{1,1}, ..., b_{N,m}, w_1^{(1)}, ..., w_{N/2}^{(1)}, w_1^{(2)}, ..., w_{N/2}^{(2)}, c_{1,1}, ..., c_{r,N} ]^T

For a data set including k_f pairs, the state equations are written f_i(st(k), theta(k)) = 0, k = 1, ..., k_f, with

f_{2i-1}^{(1)}(k), i = 1, ..., N/2 (k_f * N/2 equations):

f_a( sum_{j=1}^{m} b_{2i-1,j} u_j(k) + w_i^{(1)} x_{2i-1}(k-1) + w_i^{(2)} x_{2i}(k-1) ) - x_{2i-1}(k) = 0   (9a)

f_{2i}^{(1)}(k), i = 1, ..., N/2 (k_f * N/2 equations):

f_a( sum_{j=1}^{m} b_{2i,j} u_j(k) - w_i^{(2)} x_{2i-1}(k-1) + w_i^{(1)} x_{2i}(k-1) ) - x_{2i}(k) = 0   (9b)

f_i^{(2)}(k), i = 1, ..., r (k_f * r equations):

f_b( sum_{j=1}^{N} c_{i,j} x_j(k) ) - y_i(k) = 0   (9c)

The error gradients are given by d+E/dtheta = lambda^T * df/dtheta, where the extraction of the Lagrange multipliers lambda is based on the formula dE/dst + lambda^T * df/dst = 0. After calculations are conducted on (9), the multipliers are determined through the following recursive equations:

lambda_i^{(2)}(k) = (1/k_f) [ y_i(k) - y^_i(k) ]   (10a)

lambda_{2i-1}^{(1)}(k) = sum_{l=1}^{r} { (1/k_f) c_{l,2i-1} f_b'^{(l)}(k) [ y_l(k) - y^_l(k) ] + lambda_l^{(2)}(k) c_{l,2i-1} f_b'^{(l)}(k) }
    + lambda_{2i-1}^{(1)}(k+1) w_i^{(1)} f_a'^{(2i-1)}(k+1) - lambda_{2i}^{(1)}(k+1) w_i^{(2)} f_a'^{(2i)}(k+1)   (10b)

lambda_{2i}^{(1)}(k) = sum_{l=1}^{r} { (1/k_f) c_{l,2i} f_b'^{(l)}(k) [ y_l(k) - y^_l(k) ] + lambda_l^{(2)}(k) c_{l,2i} f_b'^{(l)}(k) }
    + lambda_{2i-1}^{(1)}(k+1) w_i^{(2)} f_a'^{(2i-1)}(k+1) + lambda_{2i}^{(1)}(k+1) w_i^{(1)} f_a'^{(2i)}(k+1)   (10c)

where i = 1, ..., N/2, l = 1, ..., r, and f_b'^{(l)}(k), f_a'^{(2i-1)}(k+j), f_a'^{(2i)}(k+j) are the derivatives of y_l(k) and x_{2i-1}(k+j), x_{2i}(k+j), respectively, with respect to their arguments. Equations (10) are backward difference equations that can be solved for k = k_f, k_f - 1, ..., 1 using the following boundary conditions:

lambda_i^{(2)}(k_f) = (1/k_f) [ y_i(k_f) - y^_i(k_f) ]   (11a)

lambda_{2i-1}^{(1)}(k_f) = sum_{l=1}^{r} { (1/k_f) c_{l,2i-1} f_b'^{(l)}(k_f) [ y_l(k_f) - y^_l(k_f) ] + lambda_l^{(2)}(k_f) c_{l,2i-1} f_b'^{(l)}(k_f) }   (11b)

lambda_{2i}^{(1)}(k_f) = sum_{l=1}^{r} { (1/k_f) c_{l,2i} f_b'^{(l)}(k_f) [ y_l(k_f) - y^_l(k_f) ] + lambda_l^{(2)}(k_f) c_{l,2i} f_b'^{(l)}(k_f) }   (11c)

Substituting (9) and (10), and taking into consideration the boundary conditions (11), the error gradients are given by

d+E/dc_{i,j} = sum_{k=1}^{k_f} lambda_i^{(2)}(k) x_j(k) f_b'^{(i)}(k),   i = 1, ..., r, j = 1, ..., N   (12a)

d+E/db_{i,j} = sum_{k=1}^{k_f} lambda_i^{(1)}(k) u_j(k) f_a'^{(i)}(k),   i = 1, ..., N, j = 1, ..., m   (12b)

d+E/dw_i^{(1)} = sum_{k=1}^{k_f} [ lambda_{2i-1}^{(1)}(k) x_{2i-1}(k-1) f_a'^{(2i-1)}(k) + lambda_{2i}^{(1)}(k) x_{2i}(k-1) f_a'^{(2i)}(k) ],   i = 1, ..., N/2   (12c)

d+E/dw_i^{(2)} = sum_{k=1}^{k_f} [ lambda_{2i-1}^{(1)}(k) x_{2i}(k-1) f_a'^{(2i-1)}(k) - lambda_{2i}^{(1)}(k) x_{2i-1}(k-1) f_a'^{(2i)}(k) ],   i = 1, ..., N/2   (12d)

4 Performance Tests and Results

The stabilizing properties and the performance of the HLA-BDRNN approach are highlighted by use of a benchmark identification problem of a dynamical system [6]. The actual system is described by the difference equation:

y_p(k+1) = [ y_p(k) y_p(k-1) y_p(k-2) u(k-1) ( y_p(k-2) - 1 ) + u(k) ] / [ 1 + y_p^2(k-1) + y_p^2(k-2) ]

A parallel identification scheme is considered, with the input u(k) being the sole input to the network. The BDRNN comprises four blocks of neurons in the hidden layer and has a total number of 32 weights. The first objective of the experimentation is to compare the performance of the HLA-BDRNN method to BPTT [5] in training the BDRNN. A second objective is to compare the BDRNN and the learning algorithm with other recurrent models, in terms of approximation accuracy and generalization capabilities. In compliance with previous results reported in the literature, the training data set contains ten batches of 900 patterns. For each data batch, the input u(k) is an independent and identically distributed uniform sequence for the first half of the 900 time steps and a sinusoid given by 1.05 sin(pi*k/45)

for the rest of the time instants. The checking data set is composed of 1000 samples with a signal described by

u(k) = sin(pi*k/25),                                              0 <= k < 250
       1.0,                                                       250 <= k < 500
       -1.0,                                                      500 <= k < 750
       0.3 sin(pi*k/25) + 0.1 sin(pi*k/32) + 0.6 sin(pi*k/10),    750 <= k < 1000

In order to select the training parameters of HLA-BDRNN and BPTT, several runs are performed with the same initial weights but different parameter combinations. Then, the parameter combination that has exhibited the fastest convergence and low values of the error function is selected. Thus, we are led to eta- = 0.85 and Temp = 1.15 for HLA-BDRNN, and to a fixed learning rate mu for BPTT, respectively. Next, a series of 100 independent trials with different weight initializations is attempted. In particular, the feedback weights W and the weight matrices B and C are randomly selected within the range [-0.5, 0.5]. The slope pertaining to the activation functions of the network neurons is set to 2. For each particular trial, and for fair comparison, the same weight initializations are used for HLA-BDRNN and BPTT.

The competing rivals are recurrent neural networks, and their architectures and learning parameters are determined as follows: The IIR-MLP is a multilayered network [13], where the synaptic connections are implemented through IIR filters, including a moving average (MA) and an auto-regressive (AR) part. We selected a 1x8x1 IIR-MLP model with unit delays in the MA and the AR parts, both for the input-to-hidden and the hidden-to-output synaptic filters, respectively. The diagonal recurrent neural network (DRNN, [3]) has one hidden layer, containing self-recurrent neurons. A 1x8x1 DRNN model is selected. The weights of the IIR-MLP and the DRNN are randomly initialized in the range [-0.5, 0.5]. The learning rate is set to 0.01, selected as the best value after several training runs. The IIR-MLP and DRNN models are trained by BPTT, while the memory neural network (MNN, [9]) is trained using the real time recurrent learning (RTRL) method [7].
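The checking signal and the benchmark plant can be reproduced with the following sketch (Python; the constant levels 1.0 / -1.0 and the mixing coefficients 0.3, 0.1, 0.6 are the values this standard benchmark input takes in [6] and related work, assumed here since they did not fully survive extraction; zero initial conditions are likewise an assumption):

```python
import numpy as np

def checking_input(k):
    """Checking input u(k), k = 0..999, for the identification benchmark."""
    if k < 250:
        return np.sin(np.pi * k / 25)
    if k < 500:
        return 1.0
    if k < 750:
        return -1.0
    return (0.3 * np.sin(np.pi * k / 25) + 0.1 * np.sin(np.pi * k / 32)
            + 0.6 * np.sin(np.pi * k / 10))

def plant(u):
    """Simulate the benchmark system
    y_p(k+1) = [y_p(k) y_p(k-1) y_p(k-2) u(k-1) (y_p(k-2) - 1) + u(k)]
               / [1 + y_p(k-1)^2 + y_p(k-2)^2]
    with zero initial conditions; y[j] below stores y_p(j - 2)."""
    kf = len(u)
    y = np.zeros(kf + 3)
    for k in range(kf):
        u_prev = u[k - 1] if k > 0 else 0.0
        y[k + 3] = (y[k + 2] * y[k + 1] * y[k] * u_prev * (y[k] - 1.0)
                    + u[k]) / (1.0 + y[k + 1] ** 2 + y[k] ** 2)
    return y[3:]   # y_p(1), ..., y_p(kf)

u = np.array([checking_input(k) for k in range(1000)])
y = plant(u)
assert u.shape == y.shape == (1000,)
assert u[300] == 1.0 and u[600] == -1.0
assert np.all(np.isfinite(y))
```

In the parallel configuration described in the text, only u(k) would be fed to the network, and the simulated y_p serves as the desired output y^ in the MSE of (8).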
All network models and learning schemes are trained following a parallel-mode approach, with the exception of the MNN, where the series-parallel configuration is adopted, as reported in [9].

Table 1 hosts the comparative results attained after five training epochs over the entire data set, with the weight updates taking place at the end of each one of the ten batches. The results for the MNN are taken from [9]. As shown, the BDRNN trained by the HLA-BDRNN method exhibits the best performance among the competing schemes, with regard to both the average and, especially, the standard deviation of the checking data set error. The former criterion indicates the accuracy of the HLA-BDRNN algorithm while the latter one shows its robustness to weight initializations. The BPTT scheme for the BDRNN is considerably inferior to the HLA-BDRNN method regarding the accuracy and the generalization property. Furthermore, it has an error standard deviation almost twice as large as the one attained by HLA-BDRNN, leading to the conclusion that HLA-BDRNN accelerates the learning process for the BDRNN significantly, while exhibiting insensitivity to initial weight settings.

Table 1: Results of the comparative analysis, averaged over 100 independent trials with different weight initializations

Network type | Training method | Checking MSE Avg | Checking MSE StD | No. of weights
BDRNN        | HLA-BDRNN       |                  |                  |
BDRNN        | BPTT            |                  |                  |
IIR-MLP      | BPTT            |                  |                  |
DRNN         | BPTT            |                  |                  |
MNN          | RTRL            |                  |                  |

It should be pointed out that the proposed algorithm has been developed for Scaled Orthogonal BDRNNs. However, it can be easily modified to take into consideration the architectural differences of the Free-Form BDRNN (i.e. four tunable feedback weights at each block of neurons of the hidden layer).

5 Conclusion

A novel learning algorithm for training the special class of Block-Diagonal Recurrent Neural Networks has been proposed, entitled HLA-BDRNN. The hybrid method combines gradient descent and the random search technique of Simulated Annealing, and is tailored to the BDRNN by taking into account the temporal relations existing in this particular dynamic system. It should be mentioned that the algorithm can be easily adapted, with moderate modifications, to relevant architectures like the triangular recurrent neural network [11].

The learning scheme has been compared with a series of algorithms and recurrent networks in the context of a nonlinear system identification benchmark problem, where its learning characteristics have been highlighted.

Acknowledgements. The authors wish to acknowledge financial support provided by the Research Committee of the Technological Education Institute of Central Macedonia, under grant SAT/IC/ /12.

References

[1] C. Igel, M. Husken, Empirical Evaluation of the Improved RPROP Learning Algorithms, Neurocomputing, 50 (2003).

[2] C.-C. Ku, K.Y. Lee, Diagonal Recurrent Neural Networks for Dynamic Systems Control, IEEE Transactions on Neural Networks, 6 (1995), no. 1.

[3] R. Kumar, S. Srivastava, J.R.P. Gupta, Diagonal Recurrent Neural Network Based Adaptive Control of Nonlinear Dynamical Systems Using Lyapunov Stability Criteria, ISA Transactions, 67 (2017).

[4] P. Mastorocostas, C. Hilas, D. Varsamis, S. Dova, A Recurrent Neural Network-based Forecasting System for Telecommunications Call Volume, Applied Mathematics & Information Sciences, 7 (2013), no. 5.

[5] P. Mastorocostas, J.B. Theocharis, A Stable Learning Algorithm for Block-Diagonal Recurrent Neural Networks: Application to the Analysis of Lung Sounds, IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, 36 (2006), no. 2.

[6] K.S. Narendra, K. Parthasarathy, Identification and Control of Dynamical Systems using Neural Networks, IEEE Transactions on Neural Networks, 1 (1990), no. 1.

[7] S. Piche, Steepest Descent Algorithms for Neural Network Controllers and Filters, IEEE Transactions on Neural Networks, 5 (1994), no. 2.

[8] M. Riedmiller, H. Braun, A Direct Adaptive Method for Faster Backpropagation Learning: The RPROP Algorithm, IEEE International Conference on Neural Networks, (1993).

[9] P.S. Sastry, G. Santharam, K.P. Unnikrishnan, Memory Neuron Networks for Identification and Control of Dynamical Systems, IEEE Transactions on Neural Networks, 5 (1994), no. 2.

[10] S. Sivakumar, W. Robertson, W.J. Phillips, Online Stabilization of Block-Diagonal Recurrent Neural Networks, IEEE Transactions on Neural Networks, 10 (1999), no. 1.

[11] S. Sivakumar, Sh. Sivakumar, Marginally Stable Triangular Recurrent Neural Network Architecture for Time Series Prediction, IEEE Transactions on Cybernetics, (2017).

[12] N.K. Treadgold, T.D. Gedeon, Simulated Annealing and Weight Decay in Adaptive Learning: The SARPROP Algorithm, IEEE Transactions on Neural Networks, 9 (1998), no. 4.

[13] A.C. Tsoi, A.D. Back, Locally Recurrent Globally Feedforward Networks: A Critical Review of Architectures, IEEE Transactions on Neural Networks, 5 (1994), no. 2.

[14] P. Werbos, Beyond Regression: New Tools for Prediction and Analysis in the Behavioral Sciences, Ph.D. Thesis, Harvard Univ., 1974.

Received: December 9, 2017; Published: January 5, 2018


More information

Delay tomography for large scale networks

Delay tomography for large scale networks Deay tomography for arge scae networks MENG-FU SHIH ALFRED O. HERO III Communcatons and Sgna Processng Laboratory Eectrca Engneerng and Computer Scence Department Unversty of Mchgan, 30 Bea. Ave., Ann

More information

Research Article H Estimates for Discrete-Time Markovian Jump Linear Systems

Research Article H Estimates for Discrete-Time Markovian Jump Linear Systems Mathematca Probems n Engneerng Voume 213 Artce ID 945342 7 pages http://dxdoorg/11155/213/945342 Research Artce H Estmates for Dscrete-Tme Markovan Jump Lnear Systems Marco H Terra 1 Gdson Jesus 2 and

More information

COMPARISON OF SOME RELIABILITY CHARACTERISTICS BETWEEN REDUNDANT SYSTEMS REQUIRING SUPPORTING UNITS FOR THEIR OPERATIONS

COMPARISON OF SOME RELIABILITY CHARACTERISTICS BETWEEN REDUNDANT SYSTEMS REQUIRING SUPPORTING UNITS FOR THEIR OPERATIONS Avalable onlne at http://sck.org J. Math. Comput. Sc. 3 (3), No., 6-3 ISSN: 97-537 COMPARISON OF SOME RELIABILITY CHARACTERISTICS BETWEEN REDUNDANT SYSTEMS REQUIRING SUPPORTING UNITS FOR THEIR OPERATIONS

More information

NUMERICAL DIFFERENTIATION

NUMERICAL DIFFERENTIATION NUMERICAL DIFFERENTIATION 1 Introducton Dfferentaton s a method to compute the rate at whch a dependent output y changes wth respect to the change n the ndependent nput x. Ths rate of change s called the

More information

1 Convex Optimization

1 Convex Optimization Convex Optmzaton We wll consder convex optmzaton problems. Namely, mnmzaton problems where the objectve s convex (we assume no constrants for now). Such problems often arse n machne learnng. For example,

More information

Greyworld White Balancing with Low Computation Cost for On- Board Video Capturing

Greyworld White Balancing with Low Computation Cost for On- Board Video Capturing reyword Whte aancng wth Low Computaton Cost for On- oard Vdeo Capturng Peng Wu Yuxn Zoe) Lu Hewett-Packard Laboratores Hewett-Packard Co. Pao Ato CA 94304 USA Abstract Whte baancng s a process commony

More information

Chapter - 2. Distribution System Power Flow Analysis

Chapter - 2. Distribution System Power Flow Analysis Chapter - 2 Dstrbuton System Power Flow Analyss CHAPTER - 2 Radal Dstrbuton System Load Flow 2.1 Introducton Load flow s an mportant tool [66] for analyzng electrcal power system network performance. Load

More information

Optimal Guaranteed Cost Control of Linear Uncertain Systems with Input Constraints

Optimal Guaranteed Cost Control of Linear Uncertain Systems with Input Constraints Internatona Journa Optma of Contro, Guaranteed Automaton, Cost Contro and Systems, of Lnear vo Uncertan 3, no Systems 3, pp 397-4, wth Input September Constrants 5 397 Optma Guaranteed Cost Contro of Lnear

More information

Week 5: Neural Networks

Week 5: Neural Networks Week 5: Neural Networks Instructor: Sergey Levne Neural Networks Summary In the prevous lecture, we saw how we can construct neural networks by extendng logstc regresson. Neural networks consst of multple

More information

Multilayer Perceptron (MLP)

Multilayer Perceptron (MLP) Multlayer Perceptron (MLP) Seungjn Cho Department of Computer Scence and Engneerng Pohang Unversty of Scence and Technology 77 Cheongam-ro, Nam-gu, Pohang 37673, Korea seungjn@postech.ac.kr 1 / 20 Outlne

More information

CHAPTER III Neural Networks as Associative Memory

CHAPTER III Neural Networks as Associative Memory CHAPTER III Neural Networs as Assocatve Memory Introducton One of the prmary functons of the bran s assocatve memory. We assocate the faces wth names, letters wth sounds, or we can recognze the people

More information

On the Power Function of the Likelihood Ratio Test for MANOVA

On the Power Function of the Likelihood Ratio Test for MANOVA Journa of Mutvarate Anayss 8, 416 41 (00) do:10.1006/jmva.001.036 On the Power Functon of the Lkehood Rato Test for MANOVA Dua Kumar Bhaumk Unversty of South Aabama and Unversty of Inos at Chcago and Sanat

More information

Xin Li Department of Information Systems, College of Business, City University of Hong Kong, Hong Kong, CHINA

Xin Li Department of Information Systems, College of Business, City University of Hong Kong, Hong Kong, CHINA RESEARCH ARTICLE MOELING FIXE OS BETTING FOR FUTURE EVENT PREICTION Weyun Chen eartment of Educatona Informaton Technoogy, Facuty of Educaton, East Chna Norma Unversty, Shangha, CHINA {weyun.chen@qq.com}

More information

RESEARCH ARTICLE. Solving Polynomial Systems Using a Fast Adaptive Back Propagation-type Neural Network Algorithm

RESEARCH ARTICLE. Solving Polynomial Systems Using a Fast Adaptive Back Propagation-type Neural Network Algorithm Juy 8, 6 8:57 Internatona Journa of Computer Mathematcs poynomas Internatona Journa of Computer Mathematcs Vo., No., Month, 9 RESEARCH ARTICLE Sovng Poynoma Systems Usng a Fast Adaptve Back Propagaton-type

More information

Asymptotics of the Solution of a Boundary Value. Problem for One-Characteristic Differential. Equation Degenerating into a Parabolic Equation

Asymptotics of the Solution of a Boundary Value. Problem for One-Characteristic Differential. Equation Degenerating into a Parabolic Equation Nonl. Analyss and Dfferental Equatons, ol., 4, no., 5 - HIKARI Ltd, www.m-har.com http://dx.do.org/.988/nade.4.456 Asymptotcs of the Soluton of a Boundary alue Problem for One-Characterstc Dfferental Equaton

More information

Transfer Functions. Convenient representation of a linear, dynamic model. A transfer function (TF) relates one input and one output: ( ) system

Transfer Functions. Convenient representation of a linear, dynamic model. A transfer function (TF) relates one input and one output: ( ) system Transfer Functons Convenent representaton of a lnear, dynamc model. A transfer functon (TF) relates one nput and one output: x t X s y t system Y s The followng termnology s used: x y nput output forcng

More information

Nested case-control and case-cohort studies

Nested case-control and case-cohort studies Outne: Nested case-contro and case-cohort studes Ørnuf Borgan Department of Mathematcs Unversty of Oso NORBIS course Unversty of Oso 4-8 December 217 1 Radaton and breast cancer data Nested case contro

More information

PERFORMANCE COMPARISON BETWEEN BACK PROPAGATION, RPE AND MRPE ALGORITHMS FOR TRAINING MLP NETWORKS

PERFORMANCE COMPARISON BETWEEN BACK PROPAGATION, RPE AND MRPE ALGORITHMS FOR TRAINING MLP NETWORKS PERFORMANCE COMPARISON BETWEEN BACK PROPAGATION, RPE AND MRPE ALGORITHMS FOR TRAINING MLP NETWORKS Mohd Yusoff Mashor School of Electrcal and Electronc Engneerng, Unversty Scence Malaysa, Pera Branch Campus,

More information

Multigradient for Neural Networks for Equalizers 1

Multigradient for Neural Networks for Equalizers 1 Multgradent for Neural Netorks for Equalzers 1 Chulhee ee, Jnook Go and Heeyoung Km Department of Electrcal and Electronc Engneerng Yonse Unversty 134 Shnchon-Dong, Seodaemun-Ku, Seoul 1-749, Korea ABSTRACT

More information

IDENTIFICATION OF NONLINEAR SYSTEM VIA SVR OPTIMIZED BY PARTICLE SWARM ALGORITHM

IDENTIFICATION OF NONLINEAR SYSTEM VIA SVR OPTIMIZED BY PARTICLE SWARM ALGORITHM Journa of Theoretca and Apped Informaton Technoogy th February 3. Vo. 48 No. 5-3 JATIT & LLS. A rghts reserved. ISSN: 99-8645 www.att.org E-ISSN: 87-395 IDENTIFICATION OF NONLINEAR SYSTEM VIA SVR OPTIMIZED

More information

Online Classification: Perceptron and Winnow

Online Classification: Perceptron and Winnow E0 370 Statstcal Learnng Theory Lecture 18 Nov 8, 011 Onlne Classfcaton: Perceptron and Wnnow Lecturer: Shvan Agarwal Scrbe: Shvan Agarwal 1 Introducton In ths lecture we wll start to study the onlne learnng

More information

For now, let us focus on a specific model of neurons. These are simplified from reality but can achieve remarkable results.

For now, let us focus on a specific model of neurons. These are simplified from reality but can achieve remarkable results. Neural Networks : Dervaton compled by Alvn Wan from Professor Jtendra Malk s lecture Ths type of computaton s called deep learnng and s the most popular method for many problems, such as computer vson

More information

Existence of Two Conjugate Classes of A 5 within S 6. by Use of Character Table of S 6

Existence of Two Conjugate Classes of A 5 within S 6. by Use of Character Table of S 6 Internatonal Mathematcal Forum, Vol. 8, 2013, no. 32, 1591-159 HIKARI Ltd, www.m-hkar.com http://dx.do.org/10.12988/mf.2013.3359 Exstence of Two Conjugate Classes of A 5 wthn S by Use of Character Table

More information

Lecture 12: Discrete Laplacian

Lecture 12: Discrete Laplacian Lecture 12: Dscrete Laplacan Scrbe: Tanye Lu Our goal s to come up wth a dscrete verson of Laplacan operator for trangulated surfaces, so that we can use t n practce to solve related problems We are mostly

More information

Chapter 5. Solution of System of Linear Equations. Module No. 6. Solution of Inconsistent and Ill Conditioned Systems

Chapter 5. Solution of System of Linear Equations. Module No. 6. Solution of Inconsistent and Ill Conditioned Systems Numercal Analyss by Dr. Anta Pal Assstant Professor Department of Mathematcs Natonal Insttute of Technology Durgapur Durgapur-713209 emal: anta.bue@gmal.com 1 . Chapter 5 Soluton of System of Lnear Equatons

More information

Interference Alignment and Degrees of Freedom Region of Cellular Sigma Channel

Interference Alignment and Degrees of Freedom Region of Cellular Sigma Channel 2011 IEEE Internatona Symposum on Informaton Theory Proceedngs Interference Agnment and Degrees of Freedom Regon of Ceuar Sgma Channe Huaru Yn 1 Le Ke 2 Zhengdao Wang 2 1 WINLAB Dept of EEIS Unv. of Sc.

More information

3. Stress-strain relationships of a composite layer

3. Stress-strain relationships of a composite layer OM PO I O U P U N I V I Y O F W N ompostes ourse 8-9 Unversty of wente ng. &ech... tress-stran reatonshps of a composte ayer - Laurent Warnet & emo Aerman.. tress-stran reatonshps of a composte ayer Introducton

More information

The line method combined with spectral chebyshev for space-time fractional diffusion equation

The line method combined with spectral chebyshev for space-time fractional diffusion equation Apped and Computatona Mathematcs 014; 3(6): 330-336 Pubshed onne December 31, 014 (http://www.scencepubshnggroup.com/j/acm) do: 10.1164/j.acm.0140306.17 ISS: 3-5605 (Prnt); ISS: 3-5613 (Onne) The ne method

More information

This column is a continuation of our previous column

This column is a continuation of our previous column Comparson of Goodness of Ft Statstcs for Lnear Regresson, Part II The authors contnue ther dscusson of the correlaton coeffcent n developng a calbraton for quanttatve analyss. Jerome Workman Jr. and Howard

More information

Neural Networks & Learning

Neural Networks & Learning Neural Netorks & Learnng. Introducton The basc prelmnares nvolved n the Artfcal Neural Netorks (ANN) are descrbed n secton. An Artfcal Neural Netorks (ANN) s an nformaton-processng paradgm that nspred

More information

Numerical Investigation of Power Tunability in Two-Section QD Superluminescent Diodes

Numerical Investigation of Power Tunability in Two-Section QD Superluminescent Diodes Numerca Investgaton of Power Tunabty n Two-Secton QD Superumnescent Dodes Matta Rossett Paoo Bardea Ivo Montrosset POLITECNICO DI TORINO DELEN Summary 1. A smpfed mode for QD Super Lumnescent Dodes (SLD)

More information

A Solution of the Harry-Dym Equation Using Lattice-Boltzmannn and a Solitary Wave Methods

A Solution of the Harry-Dym Equation Using Lattice-Boltzmannn and a Solitary Wave Methods Appled Mathematcal Scences, Vol. 11, 2017, no. 52, 2579-2586 HIKARI Ltd, www.m-hkar.com https://do.org/10.12988/ams.2017.79280 A Soluton of the Harry-Dym Equaton Usng Lattce-Boltzmannn and a Soltary Wave

More information

Lossy Compression. Compromise accuracy of reconstruction for increased compression.

Lossy Compression. Compromise accuracy of reconstruction for increased compression. Lossy Compresson Compromse accuracy of reconstructon for ncreased compresson. The reconstructon s usually vsbly ndstngushable from the orgnal mage. Typcally, one can get up to 0:1 compresson wth almost

More information

Development of a General Purpose On-Line Update Multiple Layer Feedforward Backpropagation Neural Network

Development of a General Purpose On-Line Update Multiple Layer Feedforward Backpropagation Neural Network Master Thess MEE 97-4 Made by Development of a General Purpose On-Lne Update Multple Layer Feedforward Backpropagaton Neural Network Master Program n Electrcal Scence 997 College/Unversty of Karlskrona/Ronneby

More information

Code_Aster. Identification of the model of Weibull

Code_Aster. Identification of the model of Weibull Verson Ttre : Identfcaton du modèle de Webull Date : 2/09/2009 Page : /8 Responsable : PARROT Aurore Clé : R70209 Révson : Identfcaton of the model of Webull Summary One tackles here the problem of the

More information

ANSWERS. Problem 1. and the moment generating function (mgf) by. defined for any real t. Use this to show that E( U) var( U)

ANSWERS. Problem 1. and the moment generating function (mgf) by. defined for any real t. Use this to show that E( U) var( U) Econ 413 Exam 13 H ANSWERS Settet er nndelt 9 deloppgaver, A,B,C, som alle anbefales å telle lkt for å gøre det ltt lettere å stå. Svar er gtt . Unfortunately, there s a prntng error n the hnt of

More information

CHALMERS, GÖTEBORGS UNIVERSITET. SOLUTIONS to RE-EXAM for ARTIFICIAL NEURAL NETWORKS. COURSE CODES: FFR 135, FIM 720 GU, PhD

CHALMERS, GÖTEBORGS UNIVERSITET. SOLUTIONS to RE-EXAM for ARTIFICIAL NEURAL NETWORKS. COURSE CODES: FFR 135, FIM 720 GU, PhD CHALMERS, GÖTEBORGS UNIVERSITET SOLUTIONS to RE-EXAM for ARTIFICIAL NEURAL NETWORKS COURSE CODES: FFR 35, FIM 72 GU, PhD Tme: Place: Teachers: Allowed materal: Not allowed: January 2, 28, at 8 3 2 3 SB

More information

VQ widely used in coding speech, image, and video

VQ widely used in coding speech, image, and video at Scalar quantzers are specal cases of vector quantzers (VQ): they are constraned to look at one sample at a tme (memoryless) VQ does not have such constrant better RD perfomance expected Source codng

More information

Kernel Methods and SVMs Extension

Kernel Methods and SVMs Extension Kernel Methods and SVMs Extenson The purpose of ths document s to revew materal covered n Machne Learnng 1 Supervsed Learnng regardng support vector machnes (SVMs). Ths document also provdes a general

More information

On the Multicriteria Integer Network Flow Problem

On the Multicriteria Integer Network Flow Problem BULGARIAN ACADEMY OF SCIENCES CYBERNETICS AND INFORMATION TECHNOLOGIES Volume 5, No 2 Sofa 2005 On the Multcrtera Integer Network Flow Problem Vassl Vasslev, Marana Nkolova, Maryana Vassleva Insttute of

More information

Linear Regression Analysis: Terminology and Notation

Linear Regression Analysis: Terminology and Notation ECON 35* -- Secton : Basc Concepts of Regresson Analyss (Page ) Lnear Regresson Analyss: Termnology and Notaton Consder the generc verson of the smple (two-varable) lnear regresson model. It s represented

More information

CONTRAST ENHANCEMENT FOR MIMIMUM MEAN BRIGHTNESS ERROR FROM HISTOGRAM PARTITIONING INTRODUCTION

CONTRAST ENHANCEMENT FOR MIMIMUM MEAN BRIGHTNESS ERROR FROM HISTOGRAM PARTITIONING INTRODUCTION CONTRAST ENHANCEMENT FOR MIMIMUM MEAN BRIGHTNESS ERROR FROM HISTOGRAM PARTITIONING N. Phanthuna 1,2, F. Cheevasuvt 2 and S. Chtwong 2 1 Department of Electrcal Engneerng, Faculty of Engneerng Rajamangala

More information

Approximate merging of a pair of BeÂzier curves

Approximate merging of a pair of BeÂzier curves COMPUTER-AIDED DESIGN Computer-Aded Desgn 33 (1) 15±136 www.esever.com/ocate/cad Approxmate mergng of a par of BeÂzer curves Sh-Mn Hu a,b, *, Rou-Feng Tong c, Tao Ju a,b, Ja-Guang Sun a,b a Natona CAD

More information

Wavelet chaotic neural networks and their application to continuous function optimization

Wavelet chaotic neural networks and their application to continuous function optimization Vol., No.3, 04-09 (009) do:0.436/ns.009.307 Natural Scence Wavelet chaotc neural networks and ther applcaton to contnuous functon optmzaton Ja-Ha Zhang, Yao-Qun Xu College of Electrcal and Automatc Engneerng,

More information

WAVELET-BASED IMAGE COMPRESSION USING SUPPORT VECTOR MACHINE LEARNING AND ENCODING TECHNIQUES

WAVELET-BASED IMAGE COMPRESSION USING SUPPORT VECTOR MACHINE LEARNING AND ENCODING TECHNIQUES WAVELE-BASED IMAGE COMPRESSION USING SUPPOR VECOR MACHINE LEARNING AND ENCODING ECHNIQUES Rakb Ahmed Gppsand Schoo of Computng and Informaton echnoogy Monash Unversty, Gppsand Campus Austraa. Rakb.Ahmed@nfotech.monash.edu.au

More information

CSci 6974 and ECSE 6966 Math. Tech. for Vision, Graphics and Robotics Lecture 21, April 17, 2006 Estimating A Plane Homography

CSci 6974 and ECSE 6966 Math. Tech. for Vision, Graphics and Robotics Lecture 21, April 17, 2006 Estimating A Plane Homography CSc 6974 and ECSE 6966 Math. Tech. for Vson, Graphcs and Robotcs Lecture 21, Aprl 17, 2006 Estmatng A Plane Homography Overvew We contnue wth a dscusson of the major ssues, usng estmaton of plane projectve

More information

Simulated Power of the Discrete Cramér-von Mises Goodness-of-Fit Tests

Simulated Power of the Discrete Cramér-von Mises Goodness-of-Fit Tests Smulated of the Cramér-von Mses Goodness-of-Ft Tests Steele, M., Chaselng, J. and 3 Hurst, C. School of Mathematcal and Physcal Scences, James Cook Unversty, Australan School of Envronmental Studes, Grffth

More information

Multilayer neural networks

Multilayer neural networks Lecture Multlayer neural networks Mlos Hauskrecht mlos@cs.ptt.edu 5329 Sennott Square Mdterm exam Mdterm Monday, March 2, 205 In-class (75 mnutes) closed book materal covered by February 25, 205 Multlayer

More information

NON-CENTRAL 7-POINT FORMULA IN THE METHOD OF LINES FOR PARABOLIC AND BURGERS' EQUATIONS

NON-CENTRAL 7-POINT FORMULA IN THE METHOD OF LINES FOR PARABOLIC AND BURGERS' EQUATIONS IJRRAS 8 (3 September 011 www.arpapress.com/volumes/vol8issue3/ijrras_8_3_08.pdf NON-CENTRAL 7-POINT FORMULA IN THE METHOD OF LINES FOR PARABOLIC AND BURGERS' EQUATIONS H.O. Bakodah Dept. of Mathematc

More information

Monica Purcaru and Nicoleta Aldea. Abstract

Monica Purcaru and Nicoleta Aldea. Abstract FILOMAT (Nš) 16 (22), 7 17 GENERAL CONFORMAL ALMOST SYMPLECTIC N-LINEAR CONNECTIONS IN THE BUNDLE OF ACCELERATIONS Monca Purcaru and Ncoeta Adea Abstract The am of ths paper 1 s to fnd the transformaton

More information

MODEL TUNING WITH THE USE OF HEURISTIC-FREE GMDH (GROUP METHOD OF DATA HANDLING) NETWORKS

MODEL TUNING WITH THE USE OF HEURISTIC-FREE GMDH (GROUP METHOD OF DATA HANDLING) NETWORKS MODEL TUNING WITH THE USE OF HEURISTIC-FREE (GROUP METHOD OF DATA HANDLING) NETWORKS M.C. Schrver (), E.J.H. Kerchoffs (), P.J. Water (), K.D. Saman () () Rswaterstaat Drecte Zeeand () Deft Unversty of

More information

Using Immune Genetic Algorithm to Optimize BP Neural Network and Its Application Peng-fei LIU1,Qun-tai SHEN1 and Jun ZHI2,*

Using Immune Genetic Algorithm to Optimize BP Neural Network and Its Application Peng-fei LIU1,Qun-tai SHEN1 and Jun ZHI2,* Advances n Computer Scence Research (ACRS), volume 54 Internatonal Conference on Computer Networks and Communcaton Technology (CNCT206) Usng Immune Genetc Algorthm to Optmze BP Neural Network and Its Applcaton

More information

Non-linear Canonical Correlation Analysis Using a RBF Network

Non-linear Canonical Correlation Analysis Using a RBF Network ESANN' proceedngs - European Smposum on Artfcal Neural Networks Bruges (Belgum), 4-6 Aprl, d-sde publ., ISBN -97--, pp. 57-5 Non-lnear Canoncal Correlaton Analss Usng a RBF Network Sukhbnder Kumar, Elane

More information

Maximizing the number of nonnegative subsets

Maximizing the number of nonnegative subsets Maxmzng the number of nonnegatve subsets Noga Alon Hao Huang December 1, 213 Abstract Gven a set of n real numbers, f the sum of elements of every subset of sze larger than k s negatve, what s the maxmum

More information

ON AUTOMATIC CONTINUITY OF DERIVATIONS FOR BANACH ALGEBRAS WITH INVOLUTION

ON AUTOMATIC CONTINUITY OF DERIVATIONS FOR BANACH ALGEBRAS WITH INVOLUTION European Journa of Mathematcs and Computer Scence Vo. No. 1, 2017 ON AUTOMATC CONTNUTY OF DERVATONS FOR BANACH ALGEBRAS WTH NVOLUTON Mohamed BELAM & Youssef T DL MATC Laboratory Hassan Unversty MORO CCO

More information

Multi-layer neural networks

Multi-layer neural networks Lecture 0 Mult-layer neural networks Mlos Hauskrecht mlos@cs.ptt.edu 5329 Sennott Square Lnear regresson w Lnear unts f () Logstc regresson T T = w = p( y =, w) = g( w ) w z f () = p ( y = ) w d w d Gradent

More information

A new P system with hybrid MDE- k -means algorithm for data. clustering. 1 Introduction

A new P system with hybrid MDE- k -means algorithm for data. clustering. 1 Introduction Wesun, Lasheng Xang, Xyu Lu A new P system wth hybrd MDE- agorthm for data custerng WEISUN, LAISHENG XIANG, XIYU LIU Schoo of Management Scence and Engneerng Shandong Norma Unversty Jnan, Shandong CHINA

More information

Typical Neuron Error Back-Propagation

Typical Neuron Error Back-Propagation x Mutayer Notaton 1 2 2 2 y Notaton ayer of neuron abeed 1,, N neuron n ayer = vector of output from neuron n ayer nput ayer = x (the nput pattern) output ayer = y (the actua output) = weght between ayer

More information

Boostrapaggregating (Bagging)

Boostrapaggregating (Bagging) Boostrapaggregatng (Baggng) An ensemble meta-algorthm desgned to mprove the stablty and accuracy of machne learnng algorthms Can be used n both regresson and classfcaton Reduces varance and helps to avod

More information

Chapter 6. Rotations and Tensors

Chapter 6. Rotations and Tensors Vector Spaces n Physcs 8/6/5 Chapter 6. Rotatons and ensors here s a speca knd of near transformaton whch s used to transforms coordnates from one set of axes to another set of axes (wth the same orgn).

More information

Neuro-Adaptive Design II:

Neuro-Adaptive Design II: Lecture 37 Neuro-Adaptve Desgn II: A Robustfyng Tool for Any Desgn Dr. Radhakant Padh Asst. Professor Dept. of Aerospace Engneerng Indan Insttute of Scence - Bangalore Motvaton Perfect system modelng s

More information

DISTRIBUTED PROCESSING OVER ADAPTIVE NETWORKS. Cassio G. Lopes and Ali H. Sayed

DISTRIBUTED PROCESSING OVER ADAPTIVE NETWORKS. Cassio G. Lopes and Ali H. Sayed DISTRIBUTED PROCESSIG OVER ADAPTIVE ETWORKS Casso G Lopes and A H Sayed Department of Eectrca Engneerng Unversty of Caforna Los Angees, CA, 995 Ema: {casso, sayed@eeucaedu ABSTRACT Dstrbuted adaptve agorthms

More information

Application of Particle Swarm Optimization to Economic Dispatch Problem: Advantages and Disadvantages

Application of Particle Swarm Optimization to Economic Dispatch Problem: Advantages and Disadvantages Appcaton of Partce Swarm Optmzaton to Economc Dspatch Probem: Advantages and Dsadvantages Kwang Y. Lee, Feow, IEEE, and Jong-Bae Par, Member, IEEE Abstract--Ths paper summarzes the state-of-art partce

More information

An Effective Training Method For Deep Convolutional Neural Network

An Effective Training Method For Deep Convolutional Neural Network An Effectve Tranng Method For Deep Convoutona Neura Network Yang Jang 1*, Zeyang Dou 1*, Qun Hao 1,, Je Cao 1, Kun Gao 1, X Chen 3 1. Schoo of Optoeectronc, Bejng Insttute of Technoogy. Bejng, Chna, 100081.

More information

Optimization of JK Flip Flop Layout with Minimal Average Power of Consumption based on ACOR, Fuzzy-ACOR, GA, and Fuzzy-GA

Optimization of JK Flip Flop Layout with Minimal Average Power of Consumption based on ACOR, Fuzzy-ACOR, GA, and Fuzzy-GA Journa of mathematcs and computer Scence 4 (05) - 5 Optmzaton of JK Fp Fop Layout wth Mnma Average Power of Consumpton based on ACOR, Fuzzy-ACOR, GA, and Fuzzy-GA Farshd Kevanan *,, A Yekta *,, Nasser

More information

Physics 5153 Classical Mechanics. D Alembert s Principle and The Lagrangian-1

Physics 5153 Classical Mechanics. D Alembert s Principle and The Lagrangian-1 P. Guterrez Physcs 5153 Classcal Mechancs D Alembert s Prncple and The Lagrangan 1 Introducton The prncple of vrtual work provdes a method of solvng problems of statc equlbrum wthout havng to consder the

More information