A Recurrent Neural Network based Forecasting System for Telecommunications Call Volume


Appl. Math. Inf. Sci. 7, No. 5, 1643-1650 (2013)
Applied Mathematics & Information Sciences - An International Journal

Paris Mastorocostas, Constantinos Hilas, Dimitris Varsamis and Stergiani Dova

Department of Informatics and Communications, Technological Education Institute of Serres, Serres, Greece
Corresponding author e-mail: mast@teiser.gr

Received: 27 Dec. 2012, Revised: 29 Apr. 2013, Accepted: 2 May 2013, Published online: 1 Sep. 2013

Abstract: A recurrent neural network based forecasting system for telecommunications call volume is proposed in this work. In particular, the forecaster is a Block Diagonal Recurrent Neural Network with internal feedback. The model's performance is evaluated by use of real world telecommunications data, where an extensive comparative analysis with a series of existing forecasters is conducted, including both traditional models as well as neural and fuzzy approaches.

Keywords: telecommunications data volume forecasting, neural modeling, block diagonal recurrent neural network

1 Introduction

During the last twenty years the proliferation of mobile phones, along with the explosive growth in telecommunications services, has turned this scientific field into an important and developing industry. Like any other business, carriers seek profit maximization and cost reduction by managing network traffic effectively, by optimizing their infrastructure usage and by predicting future trends. In order to achieve these objectives, telecommunications managers and operation officers are in continuous need of accurate forecasting models of the call volume.

In this perspective, this work aims at addressing the problem of call volume forecasting for the case of a large organization: a large Greek University is investigated, where new telephone numbers are added daily and a varying demand for outgoing trunks exists. The University's database stores the total number, as well as the number of the national, the international and the mobile calls per employee and per month. It has been noticed that the classification of calls into different categories reveals distinct patterns between destinations.

Initially, traditional forecasting methods - mostly statistical - were applied to this particular problem. These methods can be divided into three major categories. The first method employed is the Naive Forecast 1 (NF1) [1], which uses the most recent observation as a forecast for the next time interval. A method that takes into account the seasonal factors is the Linear Extrapolation with Seasonal Adjustment (LESA, [2]): after seasonality has been removed from the original data, linear extrapolation is applied in order to forecast the future values of the series by employing the trend cycle component. Finally, the projected trend cycle component is adjusted, making use of the identified seasonal factors. LESA M assumes multiplicative seasonality, while LESA ADD is based on additive seasonality.

The second classic group of time series analysis techniques includes the exponential smoothing methods, where a particular observation of the time series is expressed as a weighted sum of the previous observations. The weights for the previous data values constitute a geometric series and become smaller as the observations move further into the past. In processes without trend, Simple Exponential Smoothing (SES) is applied, while linear trend and seasonal data are accommodated by Holt's [3] and Winters' [4] methods, respectively. Additionally, multiplicative seasonal models (Winters MS) as well as additive seasonal models (Winters AS) exist [5].

The last category concerns time series that exhibit damped trend, which refers to a regression component for the trend in the updating equation, expressed by means of a dampening factor. In these cases, some modifications of SES can be applied in order to deal with complex types of trend. An exponential smoothing model with damped trend and additive seasonality (DAMP AS) and its multiplicative seasonality counterpart (DAMP MS) also exist, while a damped trend model can also be fitted to time series with no seasonality (DAMP NoS) [5].
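As a minimal illustration of the exponential smoothing family discussed above, the following Python sketch implements Simple Exponential Smoothing; the smoothing constant and the toy series are illustrative choices made for this text, not settings or data from the study.

```python
import numpy as np

def simple_exponential_smoothing(y, alpha=0.3):
    """One-step-ahead SES forecasts.

    Each forecast is a weighted sum of past observations whose weights
    alpha * (1 - alpha)**j decay geometrically with the age j of the observation.
    """
    forecasts = np.empty(len(y))
    forecasts[0] = y[0]  # initialize with the first observation
    for k in range(1, len(y)):
        forecasts[k] = alpha * y[k - 1] + (1 - alpha) * forecasts[k - 1]
    return forecasts

if __name__ == "__main__":
    series = np.array([120., 132., 101., 134., 90., 130., 140., 118.])
    print(simple_exponential_smoothing(series, alpha=0.3))
```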

Finally, the Auto Regressive Integrated Moving Average method (ARIMA) and the Seasonal ARIMA (SARIMA), which analyze stationary univariate time series data, have been very popular [6]. The method presumes weak stationarity, equally spaced intervals or observations, and at least 30 to 50 observations. The well established methods mentioned above have been studied in [2]. Linear models are also suggested by the ITU Recommendation E.507 for trend forecasting in telecommunications data [7]. It has been shown that damped trend models and SARIMA are the most effective, when applied to the problem at hand, compared to the rest of the traditional methods.

Computational intelligence methods were first applied in [8,9,10], where fuzzy and neurofuzzy systems were employed. In [8] a static fuzzy model is presented (OLS-FFM, Orthogonal Least Squares based Fuzzy Forecasting Model), where an automated model building process, based on the Orthogonal Least Squares algorithm, performs input selection, determines the number of fuzzy rules, which are of Takagi-Sugeno-Kang type [11], forms the consequent parts of the fuzzy rules and tunes the consequent parameters. Recurrent neurofuzzy forecasters are developed in [9] and [10]. In [9] the Locally Recurrent Neurofuzzy Forecasting System (LR-NFFS) is presented: the consequent part of each rule consists of a three layer recurrent neural network, with internal feedback at the neurons of the hidden and output layers, and a single input common to the premise and consequent parts. In [10] the Recurrent Neurofuzzy Forecaster (ReNFFOR) is introduced, which is a respective recurrent system of reduced complexity, with unit feedback at the hidden layer's neurons. All three models exhibited significantly ameliorated prediction capabilities with respect to the traditional statistical methods, with the recurrent forecasters requiring only the value of the most recent observation, since they are capable of identifying the temporal relations via their internal feedback connections.

A sparse recurrent neural network, where the feedback connections are restricted to lie between pairs of state variables, is suggested in [12] and is referred to as the Block Diagonal Recurrent Neural Network (BDRNN). It is a two layer network, with a feedback layer comprising the block diagonal links and an output layer where the state variables are combined to formulate the network's output. The BDRNN architecture offers a significant feature: it constitutes a suitable choice for modeling nonlinear processes whose overall dynamics can be decomposed into a set of lower order dynamics. Under these conditions, the blocks of the BDRNN can be used to model these sub-dynamics by properly adapting the respective block weights.

In this perspective, a forecasting system based on the BDRNN is proposed in this work. Its performance is compared with familiar forecasting approaches, like a series of seasonally adjusted linear extrapolation methods, exponential smoothing methods and the SARIMA method, along with the aforementioned fuzzy and neural forecasters.

The rest of the paper is organized as follows: in Section 2, a brief presentation of the BDRNN is given and the modeling method is described. Section 3 hosts the experimental results and a comparative analysis, while the concluding remarks are given in Section 4.

2 The forecaster's structure and the modeling method

2.1 BDRNN

The block diagonal recurrent neural network is a two layer network. The hidden layer consists of pairs of neurons (blocks); there are feedback connections between the neurons of each pair, introducing dynamics to the network, while the output layer is static. Therefore, the BDRNN can be considered as a special form of locally recurrent globally feedforward network [13,14]. The implementation of the proposed algorithm is straightforward, with no preconditions required. The configuration of the BDRNN is presented in Fig. 1, where a single input - single output (SISO) BDRNN is shown, since a SISO network will be used in the sequel. For the sake of simplicity, a BDRNN with four blocks of neurons is shown.

[Fig. 1: Configuration of the Block Diagonal Recurrent Neural Network.]

The operation of a SISO BDRNN with N neurons at the hidden layer is described by the following set of state equations:

$x(k) = f_a\big(W\,x(k-1) + B\,u(k)\big)$    (1a)
$y(k) = f_b\big(C\,x(k)\big)$    (1b)

where $f_a$ is an N-element vector including the neuron activation functions of the hidden layer, and $f_b$ is the activation function of the output layer. The activation functions are both chosen to be the sigmoid function $f(z) = \frac{1 - e^{-2z}}{1 + e^{-2z}}$. $u(k)$ is the input of the network at time $k$. $x(k) = [x_i(k)]$ is an N-element vector comprising the outputs of the hidden layer; in particular, $x_i(k)$ is the output of the i-th hidden neuron at time $k$. $y(k)$ is the output of the network at time $k$. $B = [b_i]$ and $C = [c_i]$ are N-element input and output weight vectors, respectively. $W = [w_{i,j}]$ is the $N \times N$ block diagonal feedback matrix. In particular,

$w_{i,j} \begin{cases} \neq 0 & \text{if } i = j \\ \neq 0 & \text{if } i = j - 1 \text{ and } i \text{ is odd} \\ \neq 0 & \text{if } i = j + 1 \text{ and } i \text{ is even} \\ = 0 & \text{otherwise} \end{cases}$

The feedback matrix $W$ is block diagonal, $W = \mathrm{diag}\big(W^{(1)}, \ldots, W^{(N/2)}\big)$; each diagonal element, corresponding to a block of recurrent neurons, is a $2 \times 2$ submatrix of the form

$W^{(i)} = \begin{bmatrix} w_{2i-1,2i-1} & w_{2i-1,2i} \\ w_{2i,2i-1} & w_{2i,2i} \end{bmatrix}, \quad i = 1, 2, \ldots, N/2$    (2)

The above equation describes the general case of the BDRNN, called the BDRNN with free form submatrices. A special case of the BDRNN consists of scaled orthogonal submatrices, taking the following form:

$W^{(i)} = \begin{bmatrix} w_{2i-1,2i-1} & w_{2i-1,2i} \\ -w_{2i-1,2i} & w_{2i-1,2i-1} \end{bmatrix} \equiv \begin{bmatrix} w_i^{(1)} & w_i^{(2)} \\ -w_i^{(2)} & w_i^{(1)} \end{bmatrix}, \quad i = 1, 2, \ldots, N/2$    (3)

From (2) and (3) it becomes evident that the Free Form BDRNN consists of feedback submatrices with four distinct elements, thus providing a greater degree of freedom compared to the Scaled Orthogonal BDRNN, which has two weights at each feedback submatrix. Nevertheless, as discussed in [12], the latter network exhibits superior modeling capabilities compared to the Free Form BDRNN. In view of the above, the state equations (1) for the case of the Scaled Orthogonal BDRNN are written in the following form:

$x_{2i-1}(k) = f_a\big(b_{2i-1}\,u(k) + w_i^{(1)}\,x_{2i-1}(k-1) + w_i^{(2)}\,x_{2i}(k-1)\big), \quad i = 1, 2, \ldots, N/2$    (4a)
$x_{2i}(k) = f_a\big(b_{2i}\,u(k) - w_i^{(2)}\,x_{2i-1}(k-1) + w_i^{(1)}\,x_{2i}(k-1)\big), \quad i = 1, 2, \ldots, N/2$    (4b)
$y(k) = f_b\Big(\sum_{j=1}^{N} c_j\,x_j(k)\Big)$    (4c)
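To make the state equations (4a)-(4c) concrete, the following Python sketch (written for this text, not taken from the original) computes the forward pass of a SISO Scaled Orthogonal BDRNN; the network size and the weight values in the usage example are arbitrary illustrative choices.

```python
import numpy as np

def bdrnn_forward(u, w1, w2, b, c):
    """Forward pass of a SISO Scaled Orthogonal BDRNN, eqs. (4a)-(4c).

    u  : input sequence, shape (T,)
    w1 : feedback weights w_i^(1), one per block, shape (N/2,)
    w2 : feedback weights w_i^(2), one per block, shape (N/2,)
    b  : input weights, shape (N,)
    c  : output weights, shape (N,)
    """
    f = np.tanh                      # f(z) = (1 - e^(-2z)) / (1 + e^(-2z)) = tanh(z)
    n_blocks = len(w1)
    x_prev = np.zeros(2 * n_blocks)  # hidden state x(0) = 0 (assumed initial condition)
    y = np.zeros(len(u))
    for k in range(len(u)):
        x = np.zeros_like(x_prev)
        for i in range(n_blocks):
            o, e = 2 * i, 2 * i + 1  # x_{2i-1} and x_{2i} in the paper's 1-based indexing
            x[o] = f(b[o] * u[k] + w1[i] * x_prev[o] + w2[i] * x_prev[e])   # (4a)
            x[e] = f(b[e] * u[k] - w2[i] * x_prev[o] + w1[i] * x_prev[e])   # (4b)
        y[k] = f(c @ x)              # (4c)
        x_prev = x
    return y

# Example usage with arbitrary weights (3 blocks, i.e. N = 6 hidden neurons)
rng = np.random.default_rng(0)
u = rng.standard_normal(12)
y = bdrnn_forward(u, rng.standard_normal(3), rng.standard_normal(3),
                  rng.standard_normal(6), rng.standard_normal(6))
```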

2.2 The modeling method

In the literature there exist two learning schemes especially designed for this kind of neural network [12,15]. However, they are both designed for ensuring network stability: in [12] a simple gradient descent algorithm is used, with the addition of a feedforward neural network, operating in parallel, which aims at keeping learning stable. A similar task is performed more efficiently in [15], without the need for an additional network, by transforming the whole learning process into a constrained optimization problem. In this work the stability issue is not investigated, since the learning process has an adaptation scheme for the magnitude of the weight changes, thus confining the weight space.

The learning algorithm is based on Resilient Backpropagation (RPROP, [16]). The RPROP algorithm was originally developed for static neural networks and constitutes one of the best performing first order learning methods for neural networks, since it exhibits improved performance characteristics by remedying the drawbacks inherent to gradient descent learning [17]. A variation of RPROP for recurrent models is given in [18], where a similar behavior has been reported. In the case of the BDRNN, the algorithm is modified in order to take into consideration the special features of the BDRNN, requiring calculation of the error gradients for the feedback weights.

Let us consider a training data set of $k_f$ input-output pairs. The Mean Squared Error is selected as the error measure, where $y_d(k)$ is the actual observation:

$E = \sum_{k=1}^{k_f} \big(y(k) - y_d(k)\big)^2$    (5)

$\frac{\partial^+ E(t)}{\partial \theta_i}$ and $\frac{\partial^+ E(t-1)}{\partial \theta_i}$ are the ordered derivatives [19] of $E$ with respect to a network weight $\theta_i$ ($\theta \in \{b, c, w^{(1)}, w^{(2)}\}$) at the present, $t$, and the preceding, $t-1$, epochs, respectively. The learning method is described as follows:

(a) For all weights $\theta_i$, initialize the step sizes $\Delta_i(1) = \Delta_0$.

Repeat

(b) For all weights $\theta_i$, compute the error gradient $\frac{\partial^+ E(t)}{\partial \theta_i}$.

(c) For all weights $\theta_i$, update the step sizes:

(c.1) If $\frac{\partial^+ E(t)}{\partial \theta_i} \cdot \frac{\partial^+ E(t-1)}{\partial \theta_i} > 0$ then $\Delta_i(t) = \min\big\{\eta^+ \cdot \Delta_i(t-1),\ \Delta_{max}\big\}$    (6a)

(c.2) Else if $\frac{\partial^+ E(t)}{\partial \theta_i} \cdot \frac{\partial^+ E(t-1)}{\partial \theta_i} < 0$ then $\Delta_i(t) = \max\big\{\eta^- \cdot \Delta_i(t-1),\ \Delta_{min}\big\}$    (6b)

(c.3) Else $\Delta_i(t) = \Delta_i(t-1)$    (6c)

(d) Update the weights $\theta_i$:

$\theta_i(t+1) = \theta_i(t) + \Delta\theta_i(t), \qquad \Delta\theta_i(t) = -\,\mathrm{sgn}\!\left(\frac{\partial^+ E(t)}{\partial \theta_i}\right) \cdot \Delta_i(t)$    (7)

Until convergence

where the step sizes are bounded by $\Delta_{min}$, $\Delta_{max}$ in order to avoid overflow/underflow problems of floating point variables. The increase and attenuation factors are usually set to $\eta^+ \in [1.1, 1.6]$ and $\eta^- \in [0.4, 0.95]$, respectively. All four parameters are determined using the trial and error approach.

The adaptation mechanism described above has the advantage of correlating the step sizes not with the size of the derivatives but with their signs. Hence, whenever a parameter moves along a direction reducing E (the derivatives at successive epochs have the same sign), its step size is increased independently of the size of the derivative. In this way, the step sizes can sufficiently increase when needed, even at the final stage of the learning process when the sizes of the derivatives are rather small. Additionally, when changes in the sign of the derivative occur, the step size is diminished to prevent the error measure from oscillating.
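A minimal Python sketch of steps (a)-(d) follows; the gradient values are assumed to be supplied externally (for example by the recursion described in the next paragraphs), and the default parameter values are illustrative picks from the ranges quoted above, not the ones used in the paper.

```python
import numpy as np

def rprop_epoch(theta, grad, prev_grad, step,
                eta_plus=1.2, eta_minus=0.5, step_min=1e-6, step_max=50.0):
    """One RPROP update over a flat parameter vector, eqs. (6a)-(6c) and (7).

    theta     : current weights
    grad      : ordered derivatives dE/dtheta at the current epoch
    prev_grad : ordered derivatives at the previous epoch
    step      : per-weight step sizes Delta_i(t-1)
    Returns the updated (theta, step); prev_grad should then be replaced by grad.
    """
    sign_change = grad * prev_grad
    step = np.where(sign_change > 0, np.minimum(eta_plus * step, step_max),   # (6a)
           np.where(sign_change < 0, np.maximum(eta_minus * step, step_min),  # (6b)
                    step))                                                    # (6c)
    theta = theta - np.sign(grad) * step                                      # (7)
    return theta, step
```

In the BDRNN case the flat vector theta would gather the input weights b, the output weights c and the feedback weights w^(1), w^(2).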

As mentioned in [20], due to the temporal relations existing in a dynamic system, the extraction of the ordered partial derivatives is not straightforward and is accomplished via a set of recursive equations. Calculation of the error gradients is based on the use of Lagrange multipliers, as shown below:

$\frac{\partial^+ E}{\partial b_{2i-1}} = \sum_k \big\{ \lambda_{x,2i-1}(k)\, u(k)\, f_a'(k, 2i-1) \big\}$    (8a)

$\frac{\partial^+ E}{\partial b_{2i}} = \sum_k \big\{ \lambda_{x,2i}(k)\, u(k)\, f_a'(k, 2i) \big\}$    (8b)

$\frac{\partial^+ E}{\partial c_{2i-1}} = \sum_k \big\{ \lambda_y(k)\, x_{2i-1}(k)\, f_b'(k) \big\}$    (9a)

$\frac{\partial^+ E}{\partial c_{2i}} = \sum_k \big\{ \lambda_y(k)\, x_{2i}(k)\, f_b'(k) \big\}$    (9b)

$\frac{\partial^+ E}{\partial w_i^{(1)}} = \sum_k \big\{ \lambda_{x,2i-1}(k)\, x_{2i-1}(k-1)\, f_a'(k, 2i-1) + \lambda_{x,2i}(k)\, x_{2i}(k-1)\, f_a'(k, 2i) \big\}$    (10a)

$\frac{\partial^+ E}{\partial w_i^{(2)}} = \sum_k \big\{ \lambda_{x,2i-1}(k)\, x_{2i}(k-1)\, f_a'(k, 2i-1) - \lambda_{x,2i}(k)\, x_{2i-1}(k-1)\, f_a'(k, 2i) \big\}$    (10b)

with the Lagrange multipliers derived as follows:

$\lambda_y(k) = 2\,\big(y(k) - y_d(k)\big)$    (11)

$\lambda_{x,2i-1}(k) = \lambda_y(k)\, c_{2i-1}\, f_b'(k) + \lambda_{x,2i-1}(k+1)\, w_i^{(1)}\, f_a'(k+1, 2i-1) - \lambda_{x,2i}(k+1)\, w_i^{(2)}\, f_a'(k+1, 2i)$    (12a)

$\lambda_{x,2i}(k) = \lambda_y(k)\, c_{2i}\, f_b'(k) + \lambda_{x,2i-1}(k+1)\, w_i^{(2)}\, f_a'(k+1, 2i-1) + \lambda_{x,2i}(k+1)\, w_i^{(1)}\, f_a'(k+1, 2i)$    (12b)

where $f_a'(k+1, i)$ and $f_b'(k)$ are the derivatives of $x_i(k+1)$ and $y(k)$ with respect to their arguments. Equations (12) are backward difference equations that can be solved for $k = k_f - 1, \ldots, 1$ using the boundary conditions:

$\lambda_{x,2i-1}(k_f) = \lambda_y(k_f)\, c_{2i-1}\, f_b'(k_f)$    (13a)

$\lambda_{x,2i}(k_f) = \lambda_y(k_f)\, c_{2i}\, f_b'(k_f)$    (13b)

It should be noted that the learning method is developed for the SISO Scaled Orthogonal BDRNN. Nevertheless, the suggested method is general and also applicable to multiple input - multiple output BDRNNs of any form, provided that slight modifications are made, based on the structural differences.
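Under the reconstruction of equations (8)-(13) given above, the gradient computation could be organized as in the following Python sketch; it recomputes the forward pass internally so as to be self-contained, and all variable names are ours rather than the paper's.

```python
import numpy as np

def bdrnn_gradients(u, y_d, w1, w2, b, c):
    """Gradients of E = sum_k (y(k) - y_d(k))^2 for a SISO Scaled Orthogonal
    BDRNN, following the Lagrange-multiplier recursion (8)-(13).
    Shapes as in bdrnn_forward; f_a = f_b = tanh, so f'(z) = 1 - f(z)**2.
    """
    n_blocks, T = len(w1), len(u)
    x = np.zeros((T + 1, 2 * n_blocks))   # x[k] is the state at time k; x[0] = 0
    y = np.zeros(T + 1)
    for k in range(1, T + 1):             # forward pass, eqs. (4a)-(4c)
        for i in range(n_blocks):
            o, e = 2 * i, 2 * i + 1
            x[k, o] = np.tanh(b[o] * u[k-1] + w1[i] * x[k-1, o] + w2[i] * x[k-1, e])
            x[k, e] = np.tanh(b[e] * u[k-1] - w2[i] * x[k-1, o] + w1[i] * x[k-1, e])
        y[k] = np.tanh(c @ x[k])
    dfa = 1.0 - x ** 2                     # f_a'(k, j)
    dfb = 1.0 - y ** 2                     # f_b'(k)
    lam_y = 2.0 * (y - np.concatenate(([0.0], y_d)))   # eq. (11); entry 0 unused
    lam_x = np.zeros_like(x)
    for k in range(T, 0, -1):              # backward recursion, eqs. (12)-(13)
        for i in range(n_blocks):
            o, e = 2 * i, 2 * i + 1
            lam_x[k, o] = lam_y[k] * c[o] * dfb[k]
            lam_x[k, e] = lam_y[k] * c[e] * dfb[k]
            if k < T:
                lam_x[k, o] += lam_x[k+1, o] * w1[i] * dfa[k+1, o] - lam_x[k+1, e] * w2[i] * dfa[k+1, e]
                lam_x[k, e] += lam_x[k+1, o] * w2[i] * dfa[k+1, o] + lam_x[k+1, e] * w1[i] * dfa[k+1, e]
    # gradient sums, eqs. (8)-(10)
    g_b = np.array([np.sum(lam_x[1:, j] * u * dfa[1:, j]) for j in range(2 * n_blocks)])
    g_c = np.array([np.sum(lam_y[1:] * x[1:, j] * dfb[1:]) for j in range(2 * n_blocks)])
    g_w1 = np.array([np.sum(lam_x[1:, 2*i] * x[:-1, 2*i] * dfa[1:, 2*i]
                            + lam_x[1:, 2*i+1] * x[:-1, 2*i+1] * dfa[1:, 2*i+1]) for i in range(n_blocks)])
    g_w2 = np.array([np.sum(lam_x[1:, 2*i] * x[:-1, 2*i+1] * dfa[1:, 2*i]
                            - lam_x[1:, 2*i+1] * x[:-1, 2*i] * dfa[1:, 2*i+1]) for i in range(n_blocks)])
    return g_b, g_c, g_w1, g_w2
```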

3 Experimental results

3.1 Data presentation - accuracy measures

The call data come from the Call Detail Records (CDR) of the Private Branch Exchange (PBX) of a large University with more than  employees and students, and an extended telecommunications infrastructure with more than  telephones. The data set covers a period of 10 years, January 1998 to December 2007, and consists of the monthly calls to national and mobile destinations. Calls to national destinations comprise almost half the volume of the total outgoing calls from the campus. On the other hand, calls to mobile destinations are subject to higher tariffs and they demonstrate an increasing trend during the last fifteen years. According to the International Telecommunication Union (ITU), mobile services attained a penetration rate of  % for Greece in 2009 [21].

The data set is divided into two subsets: the training set, which is used to perform parameter tuning, and the testing set, which is used for the evaluation of the forecasts. The training set is chosen to be 9 years long (108 observations) and the validation set 1 year long (12 observations). Due to the variation of the number of days belonging to different months, i.e. February has 28 while January has 31 days, all data are normalized according to ($X_k$ is the actual value):

$y_d(k) = X_k \cdot \dfrac{365/12}{\text{no. of days in month } k}$    (14)

Due to its recurrent nature, the forecaster is trained in parallel mode. Since the testing set is not used in model fitting, the testing set's forecasts are genuine forecasts and can be used to evaluate the forecasting ability of each model. The forecasting accuracy can be evaluated by means of three accuracy measures ($k_f$ is the size of the data set and $y(k)$ is the forecaster's output). The smaller the value of each statistic, the better the fit of the method to the observed data.

The Root Mean Squared Error (RMSE):

$RMSE = \sqrt{\dfrac{1}{k_f} \sum_{k=1}^{k_f} \big(y(k) - y_d(k)\big)^2}$    (15a)

The Mean Absolute Percentage Error (MAPE):

$MAPE = \dfrac{100}{k_f} \sum_{k=1}^{k_f} \dfrac{|y_d(k) - y(k)|}{y_d(k)}$    (15b)

The Theil's U-statistic, which allows a relative comparison of formal methods with input-output approaches and also squares the errors involved, so that large errors are given much more weight than small errors. It is given by:

$U = \sqrt{\dfrac{\sum_{k=1}^{k_f-1} \left(\dfrac{y(k+1) - y_d(k+1)}{y_d(k)}\right)^2}{\sum_{k=1}^{k_f-1} \left(\dfrac{y_d(k+1) - y_d(k)}{y_d(k)}\right)^2}}$    (15c)

The time series of national and mobile calls are hosted in Fig. 2a and 2b, respectively. It is evident that there exists a distinct seasonal pattern, which is made prevalent by the minimum that occurs in August. Moreover, the number of calls to mobile destinations shows an increasing trend, which comports with reports on mobile services penetration [21].

[Fig. 2: Monthly number of outgoing calls to (a) national and (b) mobile destinations, with the training and validation sets indicated.]
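For completeness, a small Python sketch of the normalization (14) and the accuracy measures (15a)-(15c) is given below; the sample series is invented for illustration and the function names are ours.

```python
import numpy as np

def normalize_monthly(x, days_in_month):
    """Eq. (14): rescale monthly counts to an average-month length (365/12 days)."""
    return x * (365.0 / 12.0) / days_in_month

def rmse(y, y_d):
    return np.sqrt(np.mean((y - y_d) ** 2))                    # eq. (15a)

def mape(y, y_d):
    return 100.0 * np.mean(np.abs(y_d - y) / y_d)              # eq. (15b)

def theil_u(y, y_d):
    num = np.sum(((y[1:] - y_d[1:]) / y_d[:-1]) ** 2)          # eq. (15c), forecast errors
    den = np.sum(((y_d[1:] - y_d[:-1]) / y_d[:-1]) ** 2)       # errors of the no-change forecast
    return np.sqrt(num / den)

# Example on a toy 12-month series
y_d = np.array([310., 295., 320., 305., 280., 260., 240., 150., 290., 330., 340., 325.])
y = y_d * (1 + 0.05 * np.sin(np.arange(12)))                   # illustrative forecasts
print(rmse(y, y_d), mape(y, y_d), theil_u(y, y_d))
```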

3.2 Comparative analysis

In order to investigate whether the recurrent neural system is capable of discovering the temporal dependences of both time series of outgoing calls, a single input - single output structure has been selected. The input is the number of the national/mobile calls of the previous month ($u(k) = y_d(k-1)$). Several BDRNNs with different structural characteristics are examined and the selection of the final model parameter combination is based on the criteria of (a) effective prediction of the call volume and (b) forecasters of moderate complexity. The selected structural characteristics and the learning parameter values of the final models are given in Table 1.

[Table 1: Structural characteristics and learning parameters - for each type of calls (national, mobile): number of blocks, η+, η−, Δmin, Δmax, Δ0 and training epochs.]

In order to compare the proposed forecaster with existing established forecasters that were applied to this particular problem in the past, a comparative analysis with fourteen other models is conducted, on the basis of the accuracy measures mentioned in Subsection 3.1. Three of the models come from the field of computational intelligence (the static fuzzy system described in [8] and the recurrent neurofuzzy systems LR-NFFS and ReNFFOR), while the other eleven are well established statistical forecasting models. The results for each one of these models are presented in Table 2; bold numbers indicate the best fit. The performance of the competing rivals is taken from the corresponding references.

[Table 2: Comparative analysis (testing data set) - RMSE, MAPE and Theil's U for national and for mobile calls, for each of the following models: Proposed Forecaster, LR-NFFS, ReNFFOR, OLS-FFM, NF1, LESA M, LESA ADD, SES, Holt's Linear, Winters MS, Winters AS, Damped NoS, Damped MS, Damped AS, SARIMA.]

The best fit models for each call data category are depicted in the following plots. We choose to present the forecasts produced by the proposed BDRNN model, along with its closest fuzzy/neurofuzzy and its closest classic competitors.

In Fig. 3 the reader may see a comparison of the best fit models for the case of national calls. The plot reveals that the proposed model follows the variation of the actual time series closer than its competitors, a fact that is also evident from its better performance statistics. In the same plot the 95% upper (UCL) and lower (LCL) confidence levels are depicted. These were estimated during the SARIMA fitting process. It should also be stressed that all three forecasts fit well within these confidence intervals and would bear scrutiny with even tighter confidence. According to Fildes et al., damped trend models are considered a benchmark forecasting method for all others to beat [22]. In this sense the proposed BDRNN model exhibits exceptional performance.

[Fig. 3: Comparison of the forecasting ability of the proposed BDRNN forecaster with the best representatives of the two rival model families and the observed number of calls to national destinations. The 95% confidence interval is also depicted.]

Visual observation of Fig. 4 reveals the differences between the proposed forecaster and its best rivals for the case of mobile calls. The BDRNN gives the better forecast, in the sense that it follows the pattern of the evolution of the series more closely, as it identifies all minima and maxima. The 95% confidence intervals for the forecasts were estimated during the fitting process of the Damped MS model.

4 Conclusions

In this paper a recurrent neural network has been proposed and has been evaluated by having been applied

on real world telecommunications data. The forecaster is a two layer network, whose hidden layer consists of pairs of interconnected neurons with internal feedback. Its modeling qualities have been investigated through a comparative analysis with a series of well established forecasting models and recent fuzzy and neurofuzzy forecasters.

Two different types of calls, according to their destination, have been examined. These are calls to national and calls to mobile destinations. They were chosen because the former represents more than 50% of the outgoing call volume and the latter corresponds to more than 1/3 of the telecommunications costs for the organization under study.

According to the results, the proposed forecaster is a promising computational intelligence approach to the problem of telecommunications call volume forecasting, since it is capable of capturing the time series dynamics through its internal feedback connections. Therefore it does not require prior knowledge of the order of the time series or computationally expensive input selection algorithms. Moreover, its structure is quite simpler than those of fully recurrent networks or networks with output feedback. From the analysis it is concluded that it can cope with the problem better than its rivals, either the traditional forecasting methods or the computational intelligence ones.

Forecasting may act as a valuable tool for telecommunications managers and is used for network traffic management, infrastructure optimization and planning, and the scenario planning process. Making use of historical data, managers may predict future demand by creating a reasonably accurate forecast of the call volume. This may be used to make educated strategic decisions on how to charge and bill telecommunications services in their organization, either for profit maximization or for unnecessary cost reduction.

[Fig. 4: Comparison of the forecasting ability of the proposed BDRNN forecaster with the best representatives of the two rival model families and the observed number of calls to mobile destinations. The 95% confidence interval is also depicted.]

Acknowledgement

The authors would like to thank the members of the Aristotle University's Telecommunications Center for their contribution of anonymised data. The authors wish to acknowledge the financial support provided by the Research Committee of the Technological Education Institute of Serres under grant SAT/IC/ /10.

References

[1] S. Makridakis, S. Wheelwright, V. McGee, Forecasting: Methods and Applications, 3rd ed., Wiley, New York (1998).
[2] C. Hilas, S. Goudos, J. Sahalos, Seasonal Decomposition and Forecasting of Telecommunication Data: A Comparative Case Study, Technological Forecasting & Social Change, 73, (2006).
[3] C. Holt, Forecasting Trends and Seasonals by Exponentially Weighted Moving Averages, ONR Memorandum 52, Carnegie Institute of Technology, Pittsburgh (1957). Reprinted: International Journal of Forecasting, 20, 5-10 (2004).
[4] P. Winters, Forecasting Sales by Exponentially Weighted Moving Averages, Management Science, 6, (1960).
[5] E. Gardner Jr., Exponential Smoothing: The State of the Art, Journal of Forecasting, 4, 1-28 (1985).
[6] G. Box, G. Jenkins, Time Series Analysis: Forecasting and Control, 2nd ed., Holden-Day, San Francisco (1976).
[7] G. Madden, T. Joachim, Forecasting Telecommunications Data with Linear Models, Telecommunications Policy, 31, (2007).
[8] P. Mastorocostas, C. Hilas, S. Dova, D. Varsamis, Forecasting of Telecommunications Time-series via an Orthogonal Least Squares-based Fuzzy Model, Proceedings of the 21st IEEE International Conference on Fuzzy Systems, (2012).
[9] P. Mastorocostas, C. Hilas, A Computational Intelligence Forecasting System for Telecommunications Time Series, Engineering Applications of Artificial Intelligence, 25, (2012).
[10] P. Mastorocostas, C. Hilas, ReNFFor: A Recurrent Neurofuzzy Forecaster for Telecommunications Data, Neural Computing and Applications, (2012).
[11] T. Takagi, M. Sugeno, Fuzzy Identification of Systems and its Applications to Modeling and Control, IEEE Transactions on Systems, Man, and Cybernetics, 15, (1985).
[12] S. Sivakumar, W. Robertson, W. Phillips, On-Line Stabilization of Block-Diagonal Recurrent Neural Networks, IEEE Transactions on Neural Networks, 10, (1999).
[13] A. Tsoi, A. Back, Locally Recurrent Globally Feedforward Networks: A Critical Review of Architectures, IEEE Transactions on Neural Networks, 5, (1994).
[14] P. Frasconi, M. Gori, G. Soda, Local Feedback Multilayered Networks, Neural Computation, 4, (1992).

[15] P. Mastorocostas, J. Theocharis, A Stable Learning Method for Block-Diagonal Recurrent Neural Networks: Application to the Analysis of Lung Sounds, IEEE Transactions on Systems, Man, and Cybernetics - Part B, 36, (2006).
[16] M. Riedmiller, H. Braun, A Direct Adaptive Method for Faster Backpropagation Learning: The RPROP Algorithm, Proceedings of the IEEE International Joint Conference on Neural Networks, (1993).
[17] C. Igel, M. Husken, Empirical Evaluation of the Improved RPROP Learning Algorithms, Neurocomputing, 50, (2003).
[18] P. Mastorocostas, Resilient Back Propagation Learning Algorithm for Recurrent Fuzzy Neural Networks, IET Electronics Letters, 40, (2004).
[19] P. Werbos, Beyond Regression: New Tools for Prediction and Analysis in the Behavioral Sciences, Ph.D. Thesis, Harvard University, Cambridge, MA (1974).
[20] S. Piche, Steepest Descent Algorithms for Neural Network Controllers and Filters, IEEE Transactions on Neural Networks, 5, (1994).
[21] ITU-T ICTeye Reports, Available online, (2010).
[22] R. Fildes, K. Nikolopoulos, S. Crone, A. Syntetos, Forecasting and Operational Research: a Review, Journal of the Operational Research Society, 59, (2008).

Paris Mastorocostas was born in Thessaloniki, Greece. He received the Diploma and Ph.D. degrees in Electrical & Computer Engineering from the Aristotle University of Thessaloniki, Greece. Presently he serves as Professor at the Dept. of Informatics & Communications, Technological Education Institute of Serres, Greece. His research interests include fuzzy and neural systems with applications to identification and classification processes, data mining and scientific programming.

Constantinos Hilas was born in Thessaloniki, Greece. He received the B.Sc., M.Sc. and Ph.D. degrees from the Dept. of Physics, Aristotle University of Thessaloniki, Greece, and the M.Sc. degree in Information Systems from the University of Macedonia, Greece. Presently he serves as Assistant Professor at the Dept. of Informatics & Communications, Technological Education Institute of Serres, Greece. His research interests include the application of data mining techniques to communications & networking problems, forecasting, user behavior modeling and profiling, classification and characterization.

Dimitris Varsamis was born in Serres, Greece. He received the B.Sc. degree in Mathematics from the Dept. of Mathematics, University of Ioannina, Greece, and the M.Sc. in Theoretical Computer Science, Control and Systems Theory and the Ph.D. degree from the Dept. of Mathematics, Aristotle University of Thessaloniki, Greece. Presently he serves as Lecturer at the Dept. of Informatics & Communications, Technological Education Institute of Serres, Greece. His research interests include scientific programming, computational methods, control system theory and numerical analysis.

Stergiani Dova was born in Thessaloniki, Greece. She received the B.Sc. degree from the Dept. of Informatics & Communications, Technological Education Institute of Serres, Greece. Presently, she is completing her M.Sc. in Information Systems at the University of Macedonia, Greece. Her research interests include fuzzy systems, neural networks, algorithm analysis and scientific computing.


More information

A neural network with localized receptive fields for visual pattern classification

A neural network with localized receptive fields for visual pattern classification Unversty of Wollongong Research Onlne Faculty of Informatcs - Papers (Archve) Faculty of Engneerng and Informaton Scences 2005 A neural network wth localzed receptve felds for vsual pattern classfcaton

More information

A new P system with hybrid MDE- k -means algorithm for data. clustering. 1 Introduction

A new P system with hybrid MDE- k -means algorithm for data. clustering. 1 Introduction Wesun, Lasheng Xang, Xyu Lu A new P system wth hybrd MDE- agorthm for data custerng WEISUN, LAISHENG XIANG, XIYU LIU Schoo of Management Scence and Engneerng Shandong Norma Unversty Jnan, Shandong CHINA

More information

Adaptive and Iterative Least Squares Support Vector Regression Based on Quadratic Renyi Entropy

Adaptive and Iterative Least Squares Support Vector Regression Based on Quadratic Renyi Entropy daptve and Iteratve Least Squares Support Vector Regresson Based on Quadratc Ren Entrop Jngqng Jang, Chu Song, Haan Zhao, Chunguo u,3 and Yanchun Lang Coege of Mathematcs and Computer Scence, Inner Mongoa

More information

Non-linear Canonical Correlation Analysis Using a RBF Network

Non-linear Canonical Correlation Analysis Using a RBF Network ESANN' proceedngs - European Smposum on Artfcal Neural Networks Bruges (Belgum), 4-6 Aprl, d-sde publ., ISBN -97--, pp. 57-5 Non-lnear Canoncal Correlaton Analss Usng a RBF Network Sukhbnder Kumar, Elane

More information

The Order Relation and Trace Inequalities for. Hermitian Operators

The Order Relation and Trace Inequalities for. Hermitian Operators Internatonal Mathematcal Forum, Vol 3, 08, no, 507-57 HIKARI Ltd, wwwm-hkarcom https://doorg/0988/mf088055 The Order Relaton and Trace Inequaltes for Hermtan Operators Y Huang School of Informaton Scence

More information

PERFORMANCE COMPARISON BETWEEN BACK PROPAGATION, RPE AND MRPE ALGORITHMS FOR TRAINING MLP NETWORKS

PERFORMANCE COMPARISON BETWEEN BACK PROPAGATION, RPE AND MRPE ALGORITHMS FOR TRAINING MLP NETWORKS PERFORMANCE COMPARISON BETWEEN BACK PROPAGATION, RPE AND MRPE ALGORITHMS FOR TRAINING MLP NETWORKS Mohd Yusoff Mashor School of Electrcal and Electronc Engneerng, Unversty Scence Malaysa, Pera Branch Campus,

More information

Global Sensitivity. Tuesday 20 th February, 2018

Global Sensitivity. Tuesday 20 th February, 2018 Global Senstvty Tuesday 2 th February, 28 ) Local Senstvty Most senstvty analyses [] are based on local estmates of senstvty, typcally by expandng the response n a Taylor seres about some specfc values

More information

Lower bounds for the Crossing Number of the Cartesian Product of a Vertex-transitive Graph with a Cycle

Lower bounds for the Crossing Number of the Cartesian Product of a Vertex-transitive Graph with a Cycle Lower bounds for the Crossng Number of the Cartesan Product of a Vertex-transtve Graph wth a Cyce Junho Won MIT-PRIMES December 4, 013 Abstract. The mnmum number of crossngs for a drawngs of a gven graph

More information

Correlation and Regression. Correlation 9.1. Correlation. Chapter 9

Correlation and Regression. Correlation 9.1. Correlation. Chapter 9 Chapter 9 Correlaton and Regresson 9. Correlaton Correlaton A correlaton s a relatonshp between two varables. The data can be represented b the ordered pars (, ) where s the ndependent (or eplanator) varable,

More information

Cyclic Codes BCH Codes

Cyclic Codes BCH Codes Cycc Codes BCH Codes Gaos Feds GF m A Gaos fed of m eements can be obtaned usng the symbos 0,, á, and the eements beng 0,, á, á, á 3 m,... so that fed F* s cosed under mutpcaton wth m eements. The operator

More information

Chapter 5. Solution of System of Linear Equations. Module No. 6. Solution of Inconsistent and Ill Conditioned Systems

Chapter 5. Solution of System of Linear Equations. Module No. 6. Solution of Inconsistent and Ill Conditioned Systems Numercal Analyss by Dr. Anta Pal Assstant Professor Department of Mathematcs Natonal Insttute of Technology Durgapur Durgapur-713209 emal: anta.bue@gmal.com 1 . Chapter 5 Soluton of System of Lnear Equatons

More information