Comparison of BPA and LMA methods for Takagi-Sugeno type MIMO Neuro-Fuzzy Network to forecast Electrical Load Time Series


Felix Pasila
Electrical Engineering Department, Petra Christian University, Surabaya - Indonesia
E-mail: felix@petra.ac.id

ABSTRACT

This paper describes an accelerated Backpropagation algorithm (BPA) that can be used to train the Takagi-Sugeno (TS) type multi-input multi-output (MIMO) neuro-fuzzy network efficiently. Another method, the accelerated Levenberg-Marquardt algorithm (LMA), is also compared with BPA. The training algorithms are efficient in the sense that they can bring the performance index of the network, such as the Sum Squared Error (SSE), Mean Squared Error (MSE), and Root Mean Squared Error (RMSE), down to the desired error goal much faster than the simple BPA or LMA. Finally, the above training algorithms are tested on a neuro-fuzzy modeling and forecasting application of electrical load time series.

Keywords: TS type MISO neuro-fuzzy network, accelerated Levenberg-Marquardt algorithm, accelerated Backpropagation algorithm, time-series forecasting

INTRODUCTION

This paper proposes two extended training algorithms for a multi-input multi-output (MIMO) Takagi-Sugeno type neuro-fuzzy (NF) network, and a neuro-fuzzy approach for the modeling and forecasting application of electrical load time series. The neuro-fuzzy approach attempts to exploit the merits of both neural-network and fuzzy-logic-based modeling techniques. For example, fuzzy models are based on fuzzy if-then rules and are, to a certain degree, transparent to interpretation and analysis, whereas the neural-network-based model has a unique learning ability. Here, the TS type MIMO neuro-fuzzy network is constructed by a multilayer feedforward network representation of the fuzzy logic system, as described in Section 2, and its training algorithms are described in Section 3. Simulation experiments and results are shown in Section 4, and brief concluding remarks are presented in Section 5.
NEURO-FUZZY SYSTEM SELECTION FOR FORECASTING

The common approach to numerical, data-driven neuro-fuzzy modeling and identification is to use a Takagi-Sugeno (TS) type fuzzy model along with continuously differentiable membership functions, such as the Gaussian function, differentiable operators for constructing the fuzzy inference mechanism, and defuzzification of the output data using the weighted average defuzzifier. The corresponding output inference can then be represented in a multilayer feedforward network structure.

Note: Discussion is expected before December 1st, 2007, and will be published in Jurnal Teknik Elektro volume 8, number 1, March 2008.

In principle, the neuro-fuzzy network architecture used here is identical to the multi-input single-output architecture of ANFIS [1], which is shown in Figure 1a. Jang introduced a two-input, one-output architecture with a Takagi-Sugeno type fuzzy model with two rules. The neuro-fuzzy model ANFIS incorporates a five-layer network to implement a Takagi-Sugeno type fuzzy system. Figure 1a shows the model of ANFIS, which can process a large number of fuzzy rules. It uses the least mean square method to determine the linear consequents of the Takagi-Sugeno rules. As a continuation of the ANFIS structure, a feedforward multi-input multi-output network was proposed by Palit and Babuška [2] and Palit and Popovic [3], as shown in Figure 1b.

[Figure 1a. ANFIS architecture with Takagi-Sugeno type fuzzy model with two rules: two inputs, one output; five layers from input membership functions (A1, A2, B1, B2) through degrees of fulfillment, normalization, rule consequents, and output summation]

Jurnal Teknik Elektro Vol. 7, No. 2, September 2007

[Figure 1b. Fuzzy system: MIMO feedforward Takagi-Sugeno type neuro-fuzzy network with n inputs, Gaussian membership functions G_{il}, degrees of fulfillment z_l, normalization b, and m outputs]

The NF model shown in Figure 1b is based on Gaussian membership functions. It uses TS type fuzzy rules, product inference, and weighted average defuzzification. The nodes in the first layer calculate the degree of membership of the numerical input values in the antecedent fuzzy sets. The product nodes (x) represent the antecedent conjunction operator, and the output of such a node is the corresponding degree of fulfillment, or firing strength, of the rule. The division sign (/), together with the summation nodes (Σ), forms the normalized degree of fulfillment (z_l / b) of the corresponding rule, which, after multiplication with the corresponding TS rule consequent (y_l), is used as input to the last summation part (Σ) that yields the defuzzified output value, which, being crisp, is directly compatible with the actual data.

Forecasting of time series is based on numerical input-output data. To apply this to the NF network, a TS type model with linear rule consequents (which reduces to a singleton model as a special case) is selected [4]. In this model, the number of membership functions (M) to be implemented for fuzzy partitioning of the input is taken equal to the number of a priori selected rules. In the next section, accelerated BPA and LMA will be applied to accelerate the convergence speed of the training and to avoid other inconveniences.

Neuro implementation of the Fuzzy Logic System

The fuzzy logic system (FLS) considered in Figure 1b can easily be reduced to a MISO-NF network by setting the number of outputs m = 1; the structure is then similar to Figure 1a. A Takagi-Sugeno (TS) type fuzzy model with Gaussian membership functions (GMFs), product inference rule, and a weighted average defuzzifier can be defined as in (1)-(4) (see [4]):

f_j = \sum_{l=1}^{M} y_l^j h_l   (1)

where

y_l^j = W_{l0}^j + W_{l1}^j x_1 + W_{l2}^j x_2 + ... + W_{ln}^j x_n   (2)

h_l = z_l / b,  with  b = \sum_{l=1}^{M} z_l   (3)

z_l = \prod_{i=1}^{n} G_{il}(x_i),  with  G_{il}(x_i) = \exp( -((x_i - c_{il}) / \sigma_{il})^2 )   (4)

The corresponding l-th rule of the above fuzzy logic system (FLS) can be written as

R_l: IF x_1 is G_{1l} AND ... AND x_n is G_{nl} THEN
     y_l = W_{l0} + W_{l1} x_1 + ... + W_{ln} x_n.   (5)
where x_i, with i = 1, 2, ..., n, are the n system inputs; f_j, with j = 1, 2, ..., m, are its m outputs; and G_{il}, with i = 1, 2, ..., n and l = 1, 2, ..., M, are the Gaussian membership functions of form (4), with the corresponding mean and variance parameters c_{il} and σ_{il} respectively, and with y_l as the output consequent of the l-th rule. It must be remembered that the Gaussian membership functions G_{il} actually represent linguistic terms such as low, medium, high, very high, etc. The rules as written in (5) are known as Takagi-Sugeno rules.

Figure 1b shows that the FLS can be represented as a multilayer feedforward network. Because of this neuro implementation of the Takagi-Sugeno-type FLS, the figure represents a Takagi-Sugeno type MIMO neuro-fuzzy network, where instead of the connection weights and biases of a neural network we have the mean c_{il} and variance σ_{il} parameters of the Gaussian membership functions, along with the W_{l0}, W_{li} parameters from the rule consequents, as the equivalent adjustable parameters of the network. If all these parameters of the NF network are properly selected, then the FLS can correctly approximate any nonlinear system based on the given data.

TRAINING ALGORITHMS FOR THE FEEDFORWARD NEURAL NETWORK

The MIMO feedforward NF network represented in Figure 1b can generally be trained using suitable training algorithms. Some standard training algorithms are the Backpropagation Algorithm

(BPA) and the Levenberg-Marquardt Algorithm (LMA). BPA, the simplest algorithm for neural network (NN) training, is a supervised learning technique used for training artificial neural networks. It was first described by Paul Werbos in 1974, and further developed by David E. Rumelhart, Geoffrey E. Hinton and Ronald J. Williams in 1986. This algorithm is a learning rule for multi-layered neural networks, credited to Rumelhart. It is most useful for feedforward networks (networks without feedback or, simply, networks that have no connections that loop). The term is an abbreviation for "backwards propagation of errors". BPA is used to calculate the gradient of the error of the network with respect to the network's modifiable weights. This gradient is almost always used in a simple stochastic gradient descent algorithm to find weights that minimize the error. In forecasting applications, pure BPA has a slow convergence speed in comparison with second-order training methods. That is why this algorithm needs to be improved using momentum and a modified error index. Alternatively, a more efficient training algorithm, such as the Levenberg-Marquardt algorithm (LMA), can also be used for training the MIMO feedforward systems [2].

BPA and Accelerated BPA

Let us assume that input-output data pairs (x_p, d_p), with x_p as the inputs and d_p as the outputs in the MIMO observation matrix, are given. The goal is to find an FLS f(x) as in equations (6)-(8), such that the performance index, the Sum Squared Error (SSE), following the equations explained by Palit and Popovic [4], defined for output j as

SSE_j = 0.5 \sum_{p} e_{jp}^2 = 0.5 E_j^T E_j   (6)

with E_j^T, E_j the transpose of the error vector and the column vector of errors e_{jp} from the FLS, and the total performance over all m outputs,

SSE_total = \sum_{j=1}^{m} SSE_j = SSE_1 + SSE_2 + ... + SSE_m   (7)

is minimized, where

e_{jp} = f_j(x_p) - d_{jp},  or for convenience  e_j = f_j - d_j   (8)

The problem is how BPA adjusts the parameters (W_{l0}, W_{li}) from the rule consequents and the mean c_{il} and variance σ_{il} parameters from the Gaussian membership functions so that the SSE is minimized.
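The FLS whose parameters BPA must adjust, i.e. the forward pass of equations (1)-(4), can be sketched in a few lines of NumPy. This is a minimal illustrative sketch, not the author's implementation; the function name and array layout are assumptions made here for clarity:

```python
import numpy as np

def ts_mimo_forward(x, c, sigma, W):
    """Forward pass of a TS-type MIMO neuro-fuzzy network (eqs. (1)-(4)).

    x     : (n,)        input vector
    c     : (n, M)      Gaussian means c_il
    sigma : (n, M)      Gaussian spreads sigma_il
    W     : (m, M, n+1) rule consequents [W_l0, W_l1, ..., W_ln] per output
    Returns f : (m,) crisp network outputs.
    """
    # eq. (4): membership degrees and rule firing strengths z_l
    G = np.exp(-(((x[:, None] - c) / sigma) ** 2))   # (n, M)
    z = G.prod(axis=0)                               # (M,)
    # eq. (3): normalized degrees of fulfillment h_l = z_l / b
    b = z.sum()
    h = z / b
    # eq. (2): linear rule consequents y_l^j = W_l0 + sum_i W_li * x_i
    y = W[:, :, 0] + W[:, :, 1:] @ x                 # (m, M)
    # eq. (1): weighted-average defuzzification f_j = sum_l y_l^j h_l
    return y @ h
```

With centered inputs and broad spreads, the output reduces to the average of the rule consequents, which is a quick sanity check on the normalization.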
Furthermore, the gradient steepest descent rule for training of the feedforward neural network is based on the recursive expressions:

W_{l0}(k+1) = W_{l0}(k) - η (∂SSE/∂W_{l0})   (9)

W_{li}(k+1) = W_{li}(k) - η (∂SSE/∂W_{li})   (10)

c_{il}(k+1) = c_{il}(k) - η (∂SSE/∂c_{il})   (11)

σ_{il}(k+1) = σ_{il}(k) - η (∂SSE/∂σ_{il})   (12)

where SSE is the performance function at the k-th iteration step, and W_{l0}(k+1), W_{li}(k+1), c_{il}(k+1), σ_{il}(k+1) are the free parameters updated at the next, (k+1)-th, iteration step; their starting values are randomly selected. The constant step size η, or learning rate, should be chosen properly, usually η << 1, with i = 1, 2, ..., n (n = number of inputs), j = 1, 2, ..., m (m = number of outputs), and l = 1, 2, ..., M (M = number of Gaussian membership functions).

Now, let us restate (13)-(16) based on Figure 1b. Clearly SSE depends on the network output f_j; therefore the performance function SSE depends on W_{l0}, W_{li} only through y_l in (14), and depends on c_{il}, σ_{il} only through z_l in (16):

f_j = \sum_{l=1}^{M} y_l^j h_l   (13)

y_l^j = W_{l0}^j + W_{l1}^j x_1 + W_{l2}^j x_2 + ... + W_{ln}^j x_n   (14)

h_l = z_l / b,  with  b = \sum_{l=1}^{M} z_l   (15)

z_l = \prod_{i=1}^{n} \exp( -((x_i - c_{il}) / \sigma_{il})^2 )   (16)

Therefore, the corresponding chain rules are:

∂SSE/∂W_{l0}^j = (∂SSE/∂f_j)(∂f_j/∂y_l^j)(∂y_l^j/∂W_{l0}^j)   (17)

∂SSE/∂W_{li}^j = (∂SSE/∂f_j)(∂f_j/∂y_l^j)(∂y_l^j/∂W_{li}^j) = (∂SSE/∂f_j)(∂f_j/∂y_l^j) x_i   (18)

∂SSE/∂c_{il} = \sum_{j=1}^{m} (∂SSE/∂z_l)(∂z_l/∂c_{il}) = \sum_{j=1}^{m} (∂SSE/∂f_j)(∂f_j/∂z_l)(∂z_l/∂c_{il})   (19)

∂SSE/∂σ_{il} = \sum_{j=1}^{m} (∂SSE/∂z_l)(∂z_l/∂σ_{il}) = \sum_{j=1}^{m} (∂SSE/∂f_j)(∂f_j/∂z_l)(∂z_l/∂σ_{il})   (20)

Equations (17)-(20) can then be written as

∂SSE/∂W_{l0}^j = (f_j - d_j)(z_l / b)   (21)

∂SSE/∂W_{li}^j = (f_j - d_j)(z_l / b) x_i   (22)

∂SSE/∂c_{il} = 2 A_l (z_l / b)(x_i - c_{il}) / σ_{il}^2   (23)

∂SSE/∂σ_{il} = 2 A_l (z_l / b)(x_i - c_{il})^2 / σ_{il}^3   (24)

For simplicity, we substitute the term A_l with

A_l = \sum_{j=1}^{m} (y_l^j - f_j)(f_j - d_j)   (25)

By substituting equations (21)-(24) into (9)-(12), the update rules for the free parameters can be written as

W_{l0}(k+1) = W_{l0}(k) - η (f_j - d_j)(z_l / b)   (26)

W_{li}(k+1) = W_{li}(k) - η (f_j - d_j)(z_l / b) x_i   (27)

c_{il}(k+1) = c_{il}(k) - η { 2 A_l h_l (x_i - c_{il}) / σ_{il}^2 }   (28)

σ_{il}(k+1) = σ_{il}(k) - η { 2 A_l h_l (x_i - c_{il})^2 / σ_{il}^3 }   (29)

where the term h_l is the normalized degree of fulfillment of the l-th rule:

h_l = z_l / b,  with  b = \sum_{l=1}^{M} z_l   (30)

The BPA training algorithm for TS-type MIMO networks is represented by equations (26) to (30), consistent with the linear TS fuzzy rules

y_l = W_{l0} + W_{l1} x_1 + W_{l2} x_2 + ... + W_{ln} x_n   (31)

In the case of BPA, to avoid possible oscillation in the final phase of training, a very small learning rate η is chosen. Therefore, BPA training needs a large number of training epochs. To accelerate BPA, momentum (an adaptive version of the learning rate) and a modified error index for the performance-function update can be applied. The additional momentum (mom) can be seen in equations (32)-(35), where the momentum constant is usually less than one (mom < 1):

W_{l0}(k+1) = W_{l0}(k) - (1 - mom) η {(f_j - d_j)(z_l / b)} + mom ΔW_{l0}(k)   (32)

W_{li}(k+1) = W_{li}(k) - (1 - mom) η {(f_j - d_j)(z_l / b) x_i} + mom ΔW_{li}(k)   (33)

c_{il}(k+1) = c_{il}(k) - (1 - mom) η { 2 A_l h_l (x_i - c_{il}) / σ_{il}^2 } + mom Δc_{il}(k)   (34)

σ_{il}(k+1) = σ_{il}(k) - (1 - mom) η { 2 A_l h_l (x_i - c_{il})^2 / σ_{il}^3 } + mom Δσ_{il}(k)   (35)

To improve the training performance of the feedforward network further, an additional modified-error-index term should be added to the modified BPA with momentum, as proposed by Xiaosong et al. [5]. This approach can be seen in equations (36)-(41):

SSE_m(w) = 0.5 γ \sum_{r=1}^{N} ( e_r(w) - e_avg(w) )^2   (36)

e_avg(w) = (1/N) \sum_{r=1}^{N} e_r(w)   (37)

Thus the new performance index can be written as

SSE_new(w) = SSE(w) + SSE_m(w)   (38)

where SSE(w) is the unmodified error performance as defined in (7), and the variable w represents the network free-parameter vector in general. The corresponding gradient now becomes

∇SSE_new(w) = ∇SSE(w) + ∇SSE_m(w)   (39)

∇SSE(w) = \sum_{r=1}^{N} e_r(w) ∇e_r(w)   (40)

∇SSE_m(w) = γ \sum_{r=1}^{N} ( e_r(w) - e_avg(w) ) ∇e_r(w)   (41)

The constant gamma γ should be defined properly.
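One batch update of the rule consequents under the gradient rules above can be sketched as follows. This is an illustrative simplification (plain gradient descent on the W parameters with a simple momentum term; the premise parameters c and σ would be updated analogously), and all names are hypothetical:

```python
import numpy as np

def bpa_update_consequents(W, x, d, c, sigma, eta=0.01, mom=0.05, prev_step=None):
    """One accelerated-BPA batch step for the consequents W.

    Gradients follow dSSE/dW_l0 = e_j * h_l and dSSE/dW_li = e_j * h_l * x_i,
    accumulated over the batch; the step adds mom * (previous step).
    x is (N, n) inputs, d is (N, m) targets; other shapes as in the forward pass.
    """
    dW = np.zeros_like(W)
    for p in range(x.shape[0]):
        G = np.exp(-(((x[p][:, None] - c) / sigma) ** 2))
        z = G.prod(axis=0)
        h = z / z.sum()                         # normalized fulfillment h_l
        y = W[:, :, 0] + W[:, :, 1:] @ x[p]     # rule consequents, (m, M)
        e = y @ h - d[p]                        # network error f - d, (m,)
        dW[:, :, 0] += e[:, None] * h
        dW[:, :, 1:] += (e[:, None, None] * h[None, :, None]
                         * x[p][None, None, :])
    if prev_step is None:
        prev_step = np.zeros_like(W)
    step = -eta * dW + mom * prev_step          # momentum-accelerated step
    return W + step, step
```

When the network already fits a sample exactly (zero error), the accumulated gradient vanishes and the step is zero, which is a convenient check.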
In practice, gamma is also small, γ << 1.

Accelerated LMA

To accelerate the convergence speed of neuro-fuzzy network training, which is slow with BPA, the Levenberg-Marquardt algorithm (LMA) was proposed and proved [4]. If a function V(w) is to be minimized with respect to the parameter vector w using Newton's method, the update of the parameter vector w is defined as

Δw = -[ ∇²V(w) ]^{-1} ∇V(w)   (42)

w(k+1) = w(k) + Δw   (43)

In equations (42)-(43), ∇²V(w) is the Hessian matrix and ∇V(w) is the gradient of V(w). If the function V(w) is taken to be the SSE function

V(w) = 0.5 \sum_{r=1}^{N} e_r^2(w)   (44)

then the gradient ∇V(w) and the Hessian matrix ∇²V(w) are generally defined as

∇V(w) = J^T(w) e(w)   (45)

∇²V(w) = J^T(w) J(w) + \sum_{r=1}^{N} e_r(w) ∇²e_r(w)   (46)

where the Jacobian matrix J(w) is

J(w) = [ ∂e_1/∂w_1  ...  ∂e_1/∂w_P ;  ... ;  ∂e_N/∂w_1  ...  ∂e_N/∂w_P ]   (47)

From (47), it is seen that the dimension of the Jacobian matrix is (N x P), where N is the number of training samples and P is the number of adjustable parameters in the network. For the Gauss-Newton method, the second term in (46) is assumed to be zero. Therefore, the update equation according to (42) will be

Δw = -[ J^T(w) J(w) ]^{-1} J^T(w) e(w)   (48)

Now let us see the LM modification of the Gauss-Newton method:

Δw = -[ J^T(w) J(w) + μ I ]^{-1} J^T(w) e(w)   (49)

where I is the (P x P) identity matrix, and the parameter μ is multiplied or divided by some factor whenever the iteration steps increase or decrease the value of V(w). Here, the update equation according to (43) is

w(k+1) = w(k) - [ J^T(w) J(w) + μ I ]^{-1} J^T(w) e(w)   (50)

It is important to note that for large μ the algorithm becomes the steepest descent algorithm with step size 1/μ, and for small μ it becomes the Gauss-Newton method. For faster convergence, and also to overcome possible traps at local minima and to reduce oscillation during the training [6], a small momentum term mom can be added as in BPA (practically, in electrical load forecasting, adding mom of around 5% to 10% gives better results), so that the final update (50) becomes

w(k+1) = w(k) - [ J^T(w) J(w) + μ I ]^{-1} J^T(w) e(w) + mom ( w(k) - w(k-1) )   (51)

Furthermore, as in BPA, Xiaosong et al. [5] also proposed adding a modified error index (MEI) term in order to improve training convergence. Similar to equations (39)-(41), the corresponding gradient with MEI can now be defined using the Jacobian matrix as

∇SSE_new(w) = J^T(w) e(w) + γ J^T(w) [ e(w) - e_avg ]   (52)

where e(w) is the column vector of errors, e_avg is the sum of the errors of each column divided by the number of training samples, and γ is a constant factor, γ << 1, that has to be chosen appropriately.

Now we come to the computation of the Jacobian matrices. This is the most difficult step in implementing LMA.
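Given the Jacobian J(w) and the error vector e(w), a single LMA update step with the small momentum term might look like the following sketch. The μ adaptation schedule (multiplying or dividing μ depending on whether V(w) rises or falls) is omitted, and the helper name is an assumption:

```python
import numpy as np

def lm_step(w, jac, err, mu=0.1, mom=0.05, prev_step=None):
    """One Levenberg-Marquardt update w <- w - (J^T J + mu I)^-1 J^T e,
    optionally accelerated by a momentum term on the previous step.

    jac : (N, P) Jacobian of the errors err with respect to w
    err : (N,)   error vector e(w)
    """
    H = jac.T @ jac                                   # Gauss-Newton Hessian approximation
    g = jac.T @ err                                   # gradient J^T e
    step = -np.linalg.solve(H + mu * np.eye(len(w)), g)
    if prev_step is not None:
        step += mom * prev_step                       # momentum term
    return w + step, step
```

For a linear problem e = A w - b with μ = 0 the step reduces to the Gauss-Newton step and lands on the least-squares solution in one iteration, which is a useful sanity check.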
We describe a simple technique to compute the Jacobian matrix from the backpropagation results. Taking into account equation (46), the gradient ∇V(W_{l0}) can be written as

∇V(W_{l0}) = ∂SSE/∂W_{l0} = ( z_l / b )^T ( f_j - d_j )   (53)

where f_j and d_j are the actual output of the Takagi-Sugeno type MIMO network and the corresponding desired output from the input-output training data matrix. Then, by comparing (53) with (45), where the gradient ∇V(w) is expressed as the transpose of the Jacobian matrix multiplied by the network's error vector,

∇V(w) = J^T(w) e(w)   (54)

the transpose of the Jacobian matrix and the Jacobian matrix for the parameter W_{l0} of the NF network can be written as

J^T(W_{l0}) = ( z_l / b )   (55)

J(W_{l0}) = [ z_l / b ]^T   (56)

with the prediction error of the fuzzy network

e = ( f_j - d_j )   (57)

But if the normalized prediction error of the NF network is considered, then instead of equations (55) and (56) we have

J^T(W_{l0}) = ( z_l )   (59)

J(W_{l0}) = [ z_l ]^T   (60)

This is because the normalized prediction error of the MIMO-NF network is

e_normalized = ( f_j - d_j ) / b   (61)

By using a similar technique and taking into account equation (54), the transpose of the Jacobian matrix and the Jacobian matrix for the parameter W_{li} of the NF network can be written as

J^T(W_{li}) = ( z_l / b ) x_i   (62)

J(W_{li}) = [ ( z_l / b ) x_i ]^T   (63)

Also, by considering the normalized prediction error from (61), equations (62)-(63) become

J^T(W_{li}) = ( z_l ) x_i   (64)

J(W_{li}) = [ z_l x_i ]^T   (65)

Now we come to the computation of the remaining parameters c_{il} and σ_{il}. Using an expression similar to the term A_l in equation (25), the equation has to be reorganized as follows:

D_l = ( y_l - f )   (66)

Combining equations (57) and (66), the term A_l explained before can be rewritten as

A_l = \sum_{j=1}^{m} D_l^j e_j = D_l^1 e_1 + D_l^2 e_2 + ... + D_l^m e_m   (67)

Let us define the terms D_l and e such that

A_l = D_l e   (68)

with e as the equivalent amount of error obtained from all the errors e_j of the MIMO network:

e = ( e_1 + e_2 + ... + e_m )   (69)

for p = 1, 2, ..., N, with N the number of training data. From (68), the term D_l can be determined as

D_l = A_l / e   (70)

This can also be written in matrix form using the pseudo-inverse as

D_l = A_l E^T ( E E^T )^{-1}   (71)

The terms E (the equivalent error vector), D_l and A_l are matrices of size (N x 1), (N x 1) and (N x 1) respectively. Now the matrix A_l can be replaced with the scalar product of e and D_l:

A_l = D_l e   (72)

With equation (72), we can write equations (23) and (24) as

∂SSE/∂c_{il} = 2 ( D_l e ) ( z_l / b ) ( x_i - c_{il} ) / σ_{il}^2   (73)

∂SSE/∂σ_{il} = 2 ( D_l e ) ( z_l / b ) ( x_i - c_{il} )^2 / σ_{il}^3   (74)

Now, by considering the normalized equivalent error as in (61), taking into account equation (54) and comparing it with (73) and (74) respectively, the transposed Jacobian matrices and the Jacobian matrices for the parameters c_{il} and σ_{il} can be computed as:

J^T(c_{il}) = 2 D_l z_l ( x_i - c_{il} ) / σ_{il}^2   (75)

J(c_{il}) = [ 2 D_l z_l ( x_i - c_{il} ) / σ_{il}^2 ]^T   (76)

J^T(σ_{il}) = 2 D_l z_l ( x_i - c_{il} )^2 / σ_{il}^3   (77)

J(σ_{il}) = [ 2 D_l z_l ( x_i - c_{il} )^2 / σ_{il}^3 ]^T   (78)

It is to be noted that the normalized prediction error is considered for the computation of the Jacobian matrices for the free parameters W_{l0} and W_{li}, whereas the normalized equivalent error is considered for the computation of the transposed Jacobian matrices and their Jacobian matrices for the free parameters mean c_{il} and variance σ_{il}.

Modeling of nonlinear dynamic time-series data

A time series with lead time L is forecast by predicting the values at time (t + L), based on the available time series data up to the point t.
To forecast the electrical load time series using the neuro-fuzzy approach, the time series data Y = { y(1), y(2), ..., y(N) } can be rearranged in a MIMO IO-like structure, where IO stands for the time series represented in Input and Output form. For the given time series modeling and forecasting application, the MIMO neuro-fuzzy predictor to be developed is supposed to operate with four inputs (n = 4) and three outputs (m = 3) only. Now, if the sampling interval and the lead time of the forecast are both taken to be one time unit, and if one time unit means one hour (this is only an example, because another time unit could equal 15 minutes), this MIMO system takes 4 hours as input data and predicts 3 hours in advance. Then, for each t >= 4, the input data represent a four-dimensional vector and the output data a three-dimensional vector, as described below:

I = [ y(t-3), y(t-2), y(t-1), y(t) ]   (79)

O = [ y(t+1), y(t+2), y(t+3) ]   (80)

Thus, in order to have sequential outputs in each row, the values of t run as 4, 7, 10, and so on, up to the last t with t + 3 <= N, so that the IO matrix forms a short-term and offline-batch model scenario, as in equation (81). In this equation, the first four columns of the IO matrix represent the four inputs to the network, whereas the last three columns represent the outputs from the neuro-fuzzy network.

IO = [ y(1) y(2) y(3) y(4) | y(5) y(6) y(7) ; y(4) y(5) y(6) y(7) | y(8) y(9) y(10) ; ... ]   (81)
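The rearrangement of the series into the IO matrix of eq. (81) (4 inputs, 3 sequential outputs, t stepping by 3) can be sketched as follows; the function name is an assumption:

```python
import numpy as np

def build_io_matrix(series, n_in=4, n_out=3):
    """Rearrange a load time series into the IO matrix of eq. (81):
    each row is [y(t-3), y(t-2), y(t-1), y(t) | y(t+1), y(t+2), y(t+3)],
    with t stepping by n_out so the outputs are sequential (t = 4, 7, 10, ...)."""
    rows = []
    t = n_in  # first usable (1-based) time index is t = 4
    while t + n_out <= len(series):
        rows.append(np.concatenate([series[t - n_in:t],   # inputs, eq. (79)
                                    series[t:t + n_out]]))  # outputs, eq. (80)
        t += n_out
    return np.asarray(rows)
```

On a series 1, 2, ..., 10 this yields two rows, [1 2 3 4 | 5 6 7] and [4 5 6 7 | 8 9 10], matching the overlap pattern of eq. (81).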

EXPERIMENTS AND RESULTS OF BPA AND LMA

The training and forecasting performances produced by the NF network are illustrated in Figures 2a-2c and 3a-3c, where the IO matrix is as shown in equation (81). There are 245 electrical load data used in training and forecasting, of which 150 data are for training and the remaining 95 data are for forecasting. Because the initial parameters are random, the initial SSE in BPA starts at a large value, and the final SSE_train becomes .87 only after a large number of epochs. In the other case of LMA, the initial SSE starts at 558. and goes down to a final SSE_train = 9.764 using far fewer epochs.

[Figure 2a. SSE vs. epochs of the TS-type MIMO NF network with BPA (n = 4 inputs, m = 3 outputs, M = 5)]

[Figure 2b. Training performance of the TS-type MIMO NF network with BPA (n = 4 inputs, m = 3 outputs, M = 5): NF output vs. actual electrical load, with training error; 150 training data]

[Figure 2c. Training and forecasting performance of the TS-type MIMO NF network with BPA (n = 4 inputs, m = 3 outputs, M = 5): final SSE_total = 8.664, with all 245 data]

[Figure 3a. SSE vs. epochs of the TS-type MIMO NF network with LMA (n = 4 inputs, m = 3 outputs, M = 5)]

Figure 3b illustrates that the proposed accelerated LMA brings the performance to a level similar to that of the proposed BPA using far fewer epochs, indicating a higher convergence speed in comparison with BPA. In addition, the performances in Figures 2 and 3 show that the training has only small oscillation because of the implementation of the Wildness Factor (WF), which permits only a fraction of one percent of oscillation. From Figure 3a, it can be seen that LMA trains much faster than BPA.
LMA needs only a small fraction of the epochs to achieve this performance, whereas BPA needs many more. See Table 1 for a detailed comparison between these two methods.
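The per-output SSE, MSE and RMSE figures of the kind compared in Table 1 can be computed with a small helper. The 0.5 factor follows the SSE definition in eq. (6); taking MSE = SSE/N is one common convention and an assumption here:

```python
import numpy as np

def error_metrics(f, d):
    """Per-output SSE, MSE and RMSE for predictions f and targets d,
    both of shape (N_samples, m_outputs)."""
    e = f - d
    sse = 0.5 * (e ** 2).sum(axis=0)   # eq. (6): 0.5 * sum of squared errors
    mse = sse / len(e)                 # per-sample average of the SSE
    rmse = np.sqrt(mse)
    return sse, mse, rmse
```

Summing the per-output SSE values then gives the SSE_total reported for the MIMO network.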

[Figure 3b. Training performance of the TS-type MIMO NF network with LMA (n = 4 inputs, m = 3 outputs, M = 5): initial SSE = 558., final SSE_train = 9.764, with 150 training data]

[Figure 3c. Training and forecasting performance of the TS-type MIMO NF network with LMA (n = 4 inputs, m = 3 outputs, M = 5): final SSE_total = .786]

In the results, SSE1, SSE2 and SSE3 indicate the sum squared error at outputs 1, 2 and 3, respectively, of the Takagi-Sugeno type MIMO NF network, and SSE_total indicates the sum of the SSE values contributed by all outputs. The same definition also applies to MSE1, MSE2, MSE3 and to RMSE1, RMSE2, RMSE3.

Table 1. Comparison between BPA and LMA on electrical load training and forecasting performance of the Takagi-Sugeno type MIMO-NF network with 4 inputs and 3 outputs (normalized data)

Training data (1 to 150):
  BPA: Initial SSE = ...; No. of epochs = ...; SSE_train = .87; SSE1 = .998; SSE2 = ...; SSE3 = ...; MSE_train = .58; MSE1 = .48; MSE2 = .6; MSE3 = .6; RMSE1 = .69; RMSE2 = .8; RMSE3 = .6
  LMA: Initial SSE = 558.; No. of epochs = ...; SSE_train = 9.764; SSE1 = .8777; SSE2 = .68; SSE3 = 5.75; MSE_train = .4; MSE1 = .5; MSE2 = .7; MSE3 = .9; RMSE1 = .59; RMSE2 = .4; RMSE3 = .5

Training and forecasting (1 to 245):
  BPA: SSE_total = 8.664; SSE1 = ...; SSE2 = .478; SSE3 = 4.67; MSE_total = .6; RMSE_total = .8
  LMA: SSE_total = .786; SSE1 = .5575; SSE2 = 6.64; SSE3 = .7; MSE_total = .9; RMSE_total = .6

Table 1 demonstrates the difference between using BPA and LMA in the TS-type MIMO-NF network. Note also that SSE = Sum Squared Error, MSE = Mean Squared Error, and RMSE = Root Mean Squared Error, as shown in equation (82):

SSE = 0.5 \sum_{n=1}^{N} e_n^2,  MSE = SSE / N,  RMSE = \sqrt{ MSE }   (82)

CONCLUDING REMARKS

In this paper, neuro-fuzzy approaches with two types of training algorithms have been presented for short-term forecasting of electrical load.
Performance results from Section 4 show that the NF network trained with LMA is very efficient in the modeling and prediction of various nonlinear dynamics, compared to BPA. An efficient training algorithm based on the combination of LMA with the additional modified error index extension (MEI) and an adaptive version of the learning rate (momentum) has been developed to train the Takagi-Sugeno type multi-input multi-output (MIMO) and multi-input single-output (MISO) neuro-fuzzy networks, improving the training performance.

Several issues need to be addressed in future work, including the transparency and interpretability of the generated fuzzy model (rules). For the latter issue, set-theoretic similarity measures [7] should be computed for each pair of fuzzy sets, and fuzzy sets that are highly similar should be merged into a single one.

REFERENCES

[1] Jang, J.-S.R., "ANFIS: Adaptive-Network-Based Fuzzy Inference System", IEEE Transactions on Systems, Man and Cybernetics, 23(3): 665-685, 1993.
[2] Palit, A.K., Babuška, R., "Efficient training algorithm for Takagi-Sugeno type neuro-fuzzy network", Proc. of FUZZ-IEEE, Melbourne, Australia, vol. 3: 1538-1543, 2001.
[3] Palit, A.K., Popovic, D., "Nonlinear combination of forecasts using ANN, FL and NF approaches", Proc. of FUZZ-IEEE, 2: 566-571, 2000.
[4] Palit, A.K., Popovic, D., Computational Intelligence in Time Series Forecasting: Theory and Engineering Applications, Springer, 2005.
[5] Xiaosong, D., Popovic, D., Schulz-Ekloff, G., "Oscillation resisting in the learning of backpropagation neural networks", Proc. of 3rd IFAC/IFIP, Belgium, 1995.
[6] Pasila, Felix, "Forecasting of Electrical Load using Takagi-Sugeno type MIMO Neuro-fuzzy network", Master Thesis, University of Bremen, 2006.
[7] Setnes, M., Babuška, R., Kaymak, U., "Similarity measures in fuzzy rule base simplification", IEEE Transactions on Systems, Man and Cybernetics, vol. 28: 770-775.


More information

Cyclic Codes BCH Codes

Cyclic Codes BCH Codes Cycc Codes BCH Codes Gaos Feds GF m A Gaos fed of m eements can be obtaned usng the symbos 0,, á, and the eements beng 0,, á, á, á 3 m,... so that fed F* s cosed under mutpcaton wth m eements. The operator

More information

The Application of BP Neural Network principal component analysis in the Forecasting the Road Traffic Accident

The Application of BP Neural Network principal component analysis in the Forecasting the Road Traffic Accident ICTCT Extra Workshop, Bejng Proceedngs The Appcaton of BP Neura Network prncpa component anayss n Forecastng Road Traffc Accdent He Mng, GuoXucheng &LuGuangmng Transportaton Coege of Souast Unversty 07

More information

Supplementary Material for Spectral Clustering based on the graph p-laplacian

Supplementary Material for Spectral Clustering based on the graph p-laplacian Sulementary Materal for Sectral Clusterng based on the grah -Lalacan Thomas Bühler and Matthas Hen Saarland Unversty, Saarbrücken, Germany {tb,hen}@csun-sbde May 009 Corrected verson, June 00 Abstract

More information

A total variation approach

A total variation approach Denosng n dgtal radograhy: A total varaton aroach I. Froso M. Lucchese. A. Borghese htt://as-lab.ds.unm.t / 46 I. Froso, M. Lucchese,. A. Borghese Images are corruted by nose ) When measurement of some

More information

Hidden Markov Model Cheat Sheet

Hidden Markov Model Cheat Sheet Hdden Markov Model Cheat Sheet (GIT ID: dc2f391536d67ed5847290d5250d4baae103487e) Ths document s a cheat sheet on Hdden Markov Models (HMMs). It resembles lecture notes, excet that t cuts to the chase

More information

An Accurate Heave Signal Prediction Using Artificial Neural Network

An Accurate Heave Signal Prediction Using Artificial Neural Network Internatonal Journal of Multdsclnary and Current Research Research Artcle ISSN: 2321-3124 Avalale at: htt://jmcr.com Mohammed El-Dasty 1,2 1 Hydrograhc Surveyng Deartment, Faculty of Martme Studes, Kng

More information

Segmentation Method of MRI Using Fuzzy Gaussian Basis Neural Network

Segmentation Method of MRI Using Fuzzy Gaussian Basis Neural Network Neural Informaton Processng - Letters and Revews Vol.8, No., August 005 LETTER Segmentaton Method of MRI Usng Fuzzy Gaussan Bass Neural Networ We Sun College of Electrcal and Informaton Engneerng, Hunan

More information

Generalized Linear Methods

Generalized Linear Methods Generalzed Lnear Methods 1 Introducton In the Ensemble Methods the general dea s that usng a combnaton of several weak learner one could make a better learner. More formally, assume that we have a set

More information

Fuzzy approach to solve multi-objective capacitated transportation problem

Fuzzy approach to solve multi-objective capacitated transportation problem Internatonal Journal of Bonformatcs Research, ISSN: 0975 087, Volume, Issue, 00, -0-4 Fuzzy aroach to solve mult-objectve caactated transortaton roblem Lohgaonkar M. H. and Bajaj V. H.* * Deartment of

More information

Example: Suppose we want to build a classifier that recognizes WebPages of graduate students.

Example: Suppose we want to build a classifier that recognizes WebPages of graduate students. Exampe: Suppose we want to bud a cassfer that recognzes WebPages of graduate students. How can we fnd tranng data? We can browse the web and coect a sampe of WebPages of graduate students of varous unverstes.

More information

ELMAN NEURAL NETWORK WITH ACCELERATED LMA TRAINING FOR EAST JAVA-BALI ELECTRICAL LOAD TIME SERIES DATA FORECASTING

ELMAN NEURAL NETWORK WITH ACCELERATED LMA TRAINING FOR EAST JAVA-BALI ELECTRICAL LOAD TIME SERIES DATA FORECASTING ELMA EURAL EWORK WIH ACCELERAED LMA RAIIG FOR EAS AVA-BALI ELECRICAL LOAD IME SERIES DAA FORECASIG F. Pasila,. Lesmana, H. Ferdinando 3,,3 Electrical Engineering Department, Petra Christian University

More information

For now, let us focus on a specific model of neurons. These are simplified from reality but can achieve remarkable results.

For now, let us focus on a specific model of neurons. These are simplified from reality but can achieve remarkable results. Neural Networks : Dervaton compled by Alvn Wan from Professor Jtendra Malk s lecture Ths type of computaton s called deep learnng and s the most popular method for many problems, such as computer vson

More information

EP523 Introduction to QFT I

EP523 Introduction to QFT I EP523 Introducton to QFT I Toc 0 INTRODUCTION TO COURSE Deartment of Engneerng Physcs Unversty of Gazante Setember 2011 Sayfa 1 Content Introducton Revew of SR, QM, RQM and EMT Lagrangan Feld Theory An

More information

Chapter Newton s Method

Chapter Newton s Method Chapter 9. Newton s Method After readng ths chapter, you should be able to:. Understand how Newton s method s dfferent from the Golden Secton Search method. Understand how Newton s method works 3. Solve

More information

Gradient Compared lp-lms Algorithms for Sparse System Identification

Gradient Compared lp-lms Algorithms for Sparse System Identification Gradent Comared l-lms Algorthms for Sarse System Identfcaton Yong Feng 1,, Jasong Wu, Ru Zeng, Lmn Luo, Huazhong Shu 1. School of Bologcal Scence and Medcal Engneerng, Southeast Unversty, Nanjng 10096,

More information

ANSWERS. Problem 1. and the moment generating function (mgf) by. defined for any real t. Use this to show that E( U) var( U)

ANSWERS. Problem 1. and the moment generating function (mgf) by. defined for any real t. Use this to show that E( U) var( U) Econ 413 Exam 13 H ANSWERS Settet er nndelt 9 deloppgaver, A,B,C, som alle anbefales å telle lkt for å gøre det ltt lettere å stå. Svar er gtt . Unfortunately, there s a prntng error n the hnt of

More information

The Chaotic Robot Prediction by Neuro Fuzzy Algorithm (2) = θ (3) = ω. Asin. A v. Mana Tarjoman, Shaghayegh Zarei

The Chaotic Robot Prediction by Neuro Fuzzy Algorithm (2) = θ (3) = ω. Asin. A v. Mana Tarjoman, Shaghayegh Zarei The Chaotc Robot Predcton by Neuro Fuzzy Algorthm Mana Tarjoman, Shaghayegh Zare Abstract In ths paper an applcaton of the adaptve neurofuzzy nference system has been ntroduced to predct the behavor of

More information

Quantum Runge-Lenz Vector and the Hydrogen Atom, the hidden SO(4) symmetry

Quantum Runge-Lenz Vector and the Hydrogen Atom, the hidden SO(4) symmetry Quantum Runge-Lenz ector and the Hydrogen Atom, the hdden SO(4) symmetry Pasca Szrftgser and Edgardo S. Cheb-Terrab () Laboratore PhLAM, UMR CNRS 85, Unversté Le, F-59655, France () Mapesoft Let's consder

More information

Supporting Information

Supporting Information Supportng Informaton The neural network f n Eq. 1 s gven by: f x l = ReLU W atom x l + b atom, 2 where ReLU s the element-wse rectfed lnear unt, 21.e., ReLUx = max0, x, W atom R d d s the weght matrx to

More information

Multi-dimensional Central Limit Theorem

Multi-dimensional Central Limit Theorem Mult-dmensonal Central Lmt heorem Outlne ( ( ( t as ( + ( + + ( ( ( Consder a sequence of ndependent random proceses t, t, dentcal to some ( t. Assume t = 0. Defne the sum process t t t t = ( t = (; t

More information

Algorithms for factoring

Algorithms for factoring CSA E0 235: Crytograhy Arl 9,2015 Instructor: Arta Patra Algorthms for factorng Submtted by: Jay Oza, Nranjan Sngh Introducton Factorsaton of large ntegers has been a wdely studed toc manly because of

More information

Short-Term Load Forecasting for Electric Power Systems Using the PSO-SVR and FCM Clustering Techniques

Short-Term Load Forecasting for Electric Power Systems Using the PSO-SVR and FCM Clustering Techniques Energes 20, 4, 73-84; do:0.3390/en40073 Artce OPEN ACCESS energes ISSN 996-073 www.mdp.com/journa/energes Short-Term Load Forecastng for Eectrc Power Systems Usng the PSO-SVR and FCM Custerng Technques

More information

Chapter 6. Rotations and Tensors

Chapter 6. Rotations and Tensors Vector Spaces n Physcs 8/6/5 Chapter 6. Rotatons and ensors here s a speca knd of near transformaton whch s used to transforms coordnates from one set of axes to another set of axes (wth the same orgn).

More information

Adaptive LRBP Using Learning Automata for Neural Networks

Adaptive LRBP Using Learning Automata for Neural Networks Adaptve LRBP Usng Learnng Automata for eura etworks *B. MASHOUFI, *MOHAMMAD B. MEHAJ (#, *SAYED A. MOTAMEDI and **MOHAMMAD R. MEYBODI *Eectrca Engneerng Department **Computer Engneerng Department Amrkabr

More information

MAE140 - Linear Circuits - Fall 13 Midterm, October 31

MAE140 - Linear Circuits - Fall 13 Midterm, October 31 Instructons ME140 - Lnear Crcuts - Fall 13 Mdterm, October 31 () Ths exam s open book. You may use whatever wrtten materals you choose, ncludng your class notes and textbook. You may use a hand calculator

More information

4DVAR, according to the name, is a four-dimensional variational method.

4DVAR, according to the name, is a four-dimensional variational method. 4D-Varatonal Data Assmlaton (4D-Var) 4DVAR, accordng to the name, s a four-dmensonal varatonal method. 4D-Var s actually a drect generalzaton of 3D-Var to handle observatons that are dstrbuted n tme. The

More information

Lecture Notes on Linear Regression

Lecture Notes on Linear Regression Lecture Notes on Lnear Regresson Feng L fl@sdueducn Shandong Unversty, Chna Lnear Regresson Problem In regresson problem, we am at predct a contnuous target value gven an nput feature vector We assume

More information

Short-Term Load Forecasting in Power Systems Using Adaptive Fuzzy Critic Based Neural Network

Short-Term Load Forecasting in Power Systems Using Adaptive Fuzzy Critic Based Neural Network Short-Term Load Forecastng n Power Systems Usng Adatve Fuzzy Crtc Based Neural Network Farzan Rashd Islamc Azad Unversty of Bushehr Bushehr, Iran Abstract: - Load forecastng consttutes an mortant tool

More information

Classification (klasifikácia) Feedforward Multi-Layer Perceptron (Dopredná viacvrstvová sieť) 14/11/2016. Perceptron (Frank Rosenblatt, 1957)

Classification (klasifikácia) Feedforward Multi-Layer Perceptron (Dopredná viacvrstvová sieť) 14/11/2016. Perceptron (Frank Rosenblatt, 1957) 4//06 IAI: Lecture 09 Feedforard Mult-Layer Percetron (Doredná vacvrstvová seť) Lubca Benuskova AIMA 3rd ed. Ch. 8.6.4 8.7.5 Classfcaton (klasfkáca) In machne learnng and statstcs, classfcaton s the roblem

More information

Dr. Shalabh Department of Mathematics and Statistics Indian Institute of Technology Kanpur

Dr. Shalabh Department of Mathematics and Statistics Indian Institute of Technology Kanpur Analyss of Varance and Desgn of Exerments-I MODULE II LECTURE - GENERAL LINEAR HYPOTHESIS AND ANALYSIS OF VARIANCE Dr. Shalabh Deartment of Mathematcs and Statstcs Indan Insttute of Technology Kanur 3.

More information

( ) 2 ( ) ( ) Problem Set 4 Suggested Solutions. Problem 1

( ) 2 ( ) ( ) Problem Set 4 Suggested Solutions. Problem 1 Problem Set 4 Suggested Solutons Problem (A) The market demand functon s the soluton to the followng utlty-maxmzaton roblem (UMP): The Lagrangean: ( x, x, x ) = + max U x, x, x x x x st.. x + x + x y x,

More information

Structure and Drive Paul A. Jensen Copyright July 20, 2003

Structure and Drive Paul A. Jensen Copyright July 20, 2003 Structure and Drve Paul A. Jensen Copyrght July 20, 2003 A system s made up of several operatons wth flow passng between them. The structure of the system descrbes the flow paths from nputs to outputs.

More information

6. Hamilton s Equations

6. Hamilton s Equations 6. Hamlton s Equatons Mchael Fowler A Dynamcal System s Path n Confguraton Sace and n State Sace The story so far: For a mechancal system wth n degrees of freedom, the satal confguraton at some nstant

More information

Multi-objective Optimal Block Transaction model based Transient Stability Evaluation

Multi-objective Optimal Block Transaction model based Transient Stability Evaluation Proceedngs of the 5th WSEAS Internatona Conference on Aed Comuter Scence, Hangzhou, Chna, Ar 6-8, 006 (907-9) Mut-objectve Otma Boc Transacton mode based Transent Stabty Evauaton Ru Ma, Hongwen Yan Changsha

More information

A finite difference method for heat equation in the unbounded domain

A finite difference method for heat equation in the unbounded domain Internatona Conerence on Advanced ectronc Scence and Technoogy (AST 6) A nte derence method or heat equaton n the unbounded doman a Quan Zheng and Xn Zhao Coege o Scence North Chna nversty o Technoogy

More information

Note 2. Ling fong Li. 1 Klein Gordon Equation Probablity interpretation Solutions to Klein-Gordon Equation... 2

Note 2. Ling fong Li. 1 Klein Gordon Equation Probablity interpretation Solutions to Klein-Gordon Equation... 2 Note 2 Lng fong L Contents Ken Gordon Equaton. Probabty nterpretaton......................................2 Soutons to Ken-Gordon Equaton............................... 2 2 Drac Equaton 3 2. Probabty nterpretaton.....................................

More information

Formulas for the Determinant

Formulas for the Determinant page 224 224 CHAPTER 3 Determnants e t te t e 2t 38 A = e t 2te t e 2t e t te t 2e 2t 39 If 123 A = 345, 456 compute the matrx product A adj(a) What can you conclude about det(a)? For Problems 40 43, use

More information

Model Reference Adaptive Temperature Control of the Electromagnetic Oven Process in Manufacturing Process

Model Reference Adaptive Temperature Control of the Electromagnetic Oven Process in Manufacturing Process RECENT ADVANCES n SIGNAL PROCESSING, ROBOTICS and AUTOMATION Model Reference Adatve Temerature Control of the Electromagnetc Oven Process n Manufacturng Process JIRAPHON SRISERTPOL SUPOT PHUNGPHIMAI School

More information

MAE140 - Linear Circuits - Winter 16 Midterm, February 5

MAE140 - Linear Circuits - Winter 16 Midterm, February 5 Instructons ME140 - Lnear Crcuts - Wnter 16 Mdterm, February 5 () Ths exam s open book. You may use whatever wrtten materals you choose, ncludng your class notes and textbook. You may use a hand calculator

More information

Multilayer Perceptrons and Backpropagation. Perceptrons. Recap: Perceptrons. Informatics 1 CG: Lecture 6. Mirella Lapata

Multilayer Perceptrons and Backpropagation. Perceptrons. Recap: Perceptrons. Informatics 1 CG: Lecture 6. Mirella Lapata Multlayer Perceptrons and Informatcs CG: Lecture 6 Mrella Lapata School of Informatcs Unversty of Ednburgh mlap@nf.ed.ac.uk Readng: Kevn Gurney s Introducton to Neural Networks, Chapters 5 6.5 January,

More information

Determining Transmission Losses Penalty Factor Using Adaptive Neuro Fuzzy Inference System (ANFIS) For Economic Dispatch Application

Determining Transmission Losses Penalty Factor Using Adaptive Neuro Fuzzy Inference System (ANFIS) For Economic Dispatch Application 7 Determnng Transmsson Losses Penalty Factor Usng Adaptve Neuro Fuzzy Inference System (ANFIS) For Economc Dspatch Applcaton Rony Seto Wbowo Maurdh Hery Purnomo Dod Prastanto Electrcal Engneerng Department,

More information

10-701/ Machine Learning, Fall 2005 Homework 3

10-701/ Machine Learning, Fall 2005 Homework 3 10-701/15-781 Machne Learnng, Fall 2005 Homework 3 Out: 10/20/05 Due: begnnng of the class 11/01/05 Instructons Contact questons-10701@autonlaborg for queston Problem 1 Regresson and Cross-valdaton [40

More information

Machine Remaining Useful Life Prediction Based on Adaptive Neuro-Fuzzy and High-Order Particle Filtering

Machine Remaining Useful Life Prediction Based on Adaptive Neuro-Fuzzy and High-Order Particle Filtering Machne Remanng Useful Lfe Predcton Based on Adatve Neuro-Fuzzy and Hgh-Order Partcle Flterng Chaochao Chen, George Vachtsevanos, and Marcos E. Orchard 2 Georga Insttute of Technology, Atlanta, GA 30332

More information

Outline. EM Algorithm and its Applications. K-Means Classifier. K-Means Classifier (Cont.) Introduction of EM K-Means EM EM Applications.

Outline. EM Algorithm and its Applications. K-Means Classifier. K-Means Classifier (Cont.) Introduction of EM K-Means EM EM Applications. EM Algorthm and ts Alcatons Y L Deartment of omuter Scence and Engneerng Unversty of Washngton utlne Introducton of EM K-Means EM EM Alcatons Image Segmentaton usng EM bect lass Recognton n BIR olor lusterng

More information

AN EXTENDED MPC CONVERGENCE CONDITION

AN EXTENDED MPC CONVERGENCE CONDITION Latn Amercan Aled esearch 36:57-6 (6 AN EXENDED MPC CONVEGENCE CONDIION A. H. GONZÁLEZ and. L. MACHEI Insttuto de Desarrollo ecnológco ara la Industra uímca, INEC (UNL - CONICE alegon@cerde.gov.ar lmarch@cerde.gov.ar

More information

Lectures - Week 4 Matrix norms, Conditioning, Vector Spaces, Linear Independence, Spanning sets and Basis, Null space and Range of a Matrix

Lectures - Week 4 Matrix norms, Conditioning, Vector Spaces, Linear Independence, Spanning sets and Basis, Null space and Range of a Matrix Lectures - Week 4 Matrx norms, Condtonng, Vector Spaces, Lnear Independence, Spannng sets and Bass, Null space and Range of a Matrx Matrx Norms Now we turn to assocatng a number to each matrx. We could

More information

APPENDIX A Some Linear Algebra

APPENDIX A Some Linear Algebra APPENDIX A Some Lnear Algebra The collecton of m, n matrces A.1 Matrces a 1,1,..., a 1,n A = a m,1,..., a m,n wth real elements a,j s denoted by R m,n. If n = 1 then A s called a column vector. Smlarly,

More information

Translational Equations of Motion for A Body Translational equations of motion (centroidal) for a body are m r = f.

Translational Equations of Motion for A Body Translational equations of motion (centroidal) for a body are m r = f. Lesson 12: Equatons o Moton Newton s Laws Frst Law: A artcle remans at rest or contnues to move n a straght lne wth constant seed there s no orce actng on t Second Law: The acceleraton o a artcle s roortonal

More information

IV. Performance Optimization

IV. Performance Optimization IV. Performance Optmzaton A. Steepest descent algorthm defnton how to set up bounds on learnng rate mnmzaton n a lne (varyng learnng rate) momentum learnng examples B. Newton s method defnton Gauss-Newton

More information

Chapter 5. Solution of System of Linear Equations. Module No. 6. Solution of Inconsistent and Ill Conditioned Systems

Chapter 5. Solution of System of Linear Equations. Module No. 6. Solution of Inconsistent and Ill Conditioned Systems Numercal Analyss by Dr. Anta Pal Assstant Professor Department of Mathematcs Natonal Insttute of Technology Durgapur Durgapur-713209 emal: anta.bue@gmal.com 1 . Chapter 5 Soluton of System of Lnear Equatons

More information

Naïve Bayes Classifier

Naïve Bayes Classifier 9/8/07 MIST.6060 Busness Intellgence and Data Mnng Naïve Bayes Classfer Termnology Predctors: the attrbutes (varables) whose values are used for redcton and classfcaton. Predctors are also called nut varables,

More information

Mechanics Physics 151

Mechanics Physics 151 Mechancs hyscs 151 Lecture Canoncal Transformatons (Chater 9) What We Dd Last Tme Drect Condtons Q j Q j = = j, Q, j, Q, Necessary and suffcent j j for Canoncal Transf. = = j Q, Q, j Q, Q, Infntesmal CT

More information

Transfer Functions. Convenient representation of a linear, dynamic model. A transfer function (TF) relates one input and one output: ( ) system

Transfer Functions. Convenient representation of a linear, dynamic model. A transfer function (TF) relates one input and one output: ( ) system Transfer Functons Convenent representaton of a lnear, dynamc model. A transfer functon (TF) relates one nput and one output: x t X s y t system Y s The followng termnology s used: x y nput output forcng

More information

T-Norm of Yager Class of Subsethood Defuzzification: Improving Enrolment Forecast in Fuzzy Time Series

T-Norm of Yager Class of Subsethood Defuzzification: Improving Enrolment Forecast in Fuzzy Time Series Gadng Busness and Management Journal Vol. 10 No. 2, 1-11, 2006 T-Norm of Yager Class of Subsethood Defuzzfcaton: Imrovng Enrolment Forecast n Fuzzy Tme Seres Nazrah Raml Faculty of Informaton Technology

More information

Backpropagation Based Training Algorithm for Takagi - Sugeno Type MIMO Neuro-Fuzzy Network to Forecast Electrical Load Time Series

Backpropagation Based Training Algorithm for Takagi - Sugeno Type MIMO Neuro-Fuzzy Network to Forecast Electrical Load Time Series Backpropagato Based Trag Agorthm for Takag - Sugeo Type IO Neuro-Fuzzy Network to Forecast Eectrca Load Tme Seres Aoy Kumar Pat, ember, IEEE, ad Gerhard Doedg deeg GmbH, Kurfuersteaee - 30, D-8 Breme,

More information

Solution of Linear System of Equations and Matrix Inversion Gauss Seidel Iteration Method

Solution of Linear System of Equations and Matrix Inversion Gauss Seidel Iteration Method Soluton of Lnear System of Equatons and Matr Inverson Gauss Sedel Iteraton Method It s another well-known teratve method for solvng a system of lnear equatons of the form a + a22 + + ann = b a2 + a222

More information

[WAVES] 1. Waves and wave forces. Definition of waves

[WAVES] 1. Waves and wave forces. Definition of waves 1. Waves and forces Defnton of s In the smuatons on ong-crested s are consdered. The drecton of these s (μ) s defned as sketched beow n the goba co-ordnate sstem: North West East South The eevaton can

More information

A General Class of Selection Procedures and Modified Murthy Estimator

A General Class of Selection Procedures and Modified Murthy Estimator ISS 684-8403 Journal of Statstcs Volume 4, 007,. 3-9 A General Class of Selecton Procedures and Modfed Murthy Estmator Abdul Bast and Muhammad Qasar Shahbaz Abstract A new selecton rocedure for unequal

More information

THE SUMMATION NOTATION Ʃ

THE SUMMATION NOTATION Ʃ Sngle Subscrpt otaton THE SUMMATIO OTATIO Ʃ Most of the calculatons we perform n statstcs are repettve operatons on lsts of numbers. For example, we compute the sum of a set of numbers, or the sum of the

More information

Mixture of Gaussians Expectation Maximization (EM) Part 2

Mixture of Gaussians Expectation Maximization (EM) Part 2 Mture of Gaussans Eectaton Mamaton EM Part 2 Most of the sldes are due to Chrstoher Bsho BCS Summer School Eeter 2003. The rest of the sldes are based on lecture notes by A. Ng Lmtatons of K-means Hard

More information

The Dirac Equation for a One-electron atom. In this section we will derive the Dirac equation for a one-electron atom.

The Dirac Equation for a One-electron atom. In this section we will derive the Dirac equation for a one-electron atom. The Drac Equaton for a One-electron atom In ths secton we wll derve the Drac equaton for a one-electron atom. Accordng to Ensten the energy of a artcle wth rest mass m movng wth a velocty V s gven by E

More information

Difference Equations

Difference Equations Dfference Equatons c Jan Vrbk 1 Bascs Suppose a sequence of numbers, say a 0,a 1,a,a 3,... s defned by a certan general relatonshp between, say, three consecutve values of the sequence, e.g. a + +3a +1

More information

SPATIAL KINEMATICS OF GEARS IN ABSOLUTE COORDINATES

SPATIAL KINEMATICS OF GEARS IN ABSOLUTE COORDINATES SATIAL KINEMATICS OF GEARS IN ABSOLUTE COORDINATES Dmtry Vasenko and Roand Kasper Insttute of Mobe Systems (IMS) Otto-von-Guercke-Unversty Magdeburg D-39016, Magdeburg, Germany E-ma: Dmtr.Vasenko@ovgu.de

More information

EEE 241: Linear Systems

EEE 241: Linear Systems EEE : Lnear Systems Summary #: Backpropagaton BACKPROPAGATION The perceptron rule as well as the Wdrow Hoff learnng were desgned to tran sngle layer networks. They suffer from the same dsadvantage: they

More information

A Neuro-Fuzzy System on System Modeling and Its. Application on Character Recognition

A Neuro-Fuzzy System on System Modeling and Its. Application on Character Recognition A Neuro-Fuzzy System on System Modelng and Its Applcaton on Character Recognton C. J. Chen 1, S. M. Yang 2, Z. C. Wang 3 1 Department of Avaton Servce Management Alethea Unversty Tawan, ROC 2,3 Department

More information

Hopfield Training Rules 1 N

Hopfield Training Rules 1 N Hopfeld Tranng Rules To memorse a sngle pattern Suppose e set the eghts thus - = p p here, s the eght beteen nodes & s the number of nodes n the netor p s the value requred for the -th node What ll the

More information

Erratum: A Generalized Path Integral Control Approach to Reinforcement Learning

Erratum: A Generalized Path Integral Control Approach to Reinforcement Learning Journal of Machne Learnng Research 00-9 Submtted /0; Publshed 7/ Erratum: A Generalzed Path Integral Control Approach to Renforcement Learnng Evangelos ATheodorou Jonas Buchl Stefan Schaal Department of

More information

A DIMENSION-REDUCTION METHOD FOR STOCHASTIC ANALYSIS SECOND-MOMENT ANALYSIS

A DIMENSION-REDUCTION METHOD FOR STOCHASTIC ANALYSIS SECOND-MOMENT ANALYSIS A DIMESIO-REDUCTIO METHOD FOR STOCHASTIC AALYSIS SECOD-MOMET AALYSIS S. Rahman Department of Mechanca Engneerng and Center for Computer-Aded Desgn The Unversty of Iowa Iowa Cty, IA 52245 June 2003 OUTLIE

More information

Nonlinear Robust Regression Using Kernel Principal Component Analysis and R-Estimators

Nonlinear Robust Regression Using Kernel Principal Component Analysis and R-Estimators IJCSI Internatona Journa of Comuter Scence Issues, Vo. 8, Issue 5, o, Setember 0 ISS (Onne: 694-084 www.ijcsi.org 75 onnear Robust Regresson Usng Kerne rnca Comonent Anayss and R-Estmators Anton Wbowo

More information

Nonextensibility of energy in Tsallis statistics and the zeroth law of

Nonextensibility of energy in Tsallis statistics and the zeroth law of Nonextensbty of energy n Tsas statstcs and the zeroth a of thermodynamcs onge Ou and Jncan hen* T Word Laboratory, P. O. 870, eng 00080, Peoe s Reubc of hna and Deartment of Physcs, Xamen nversty, Xamen

More information

Using Genetic Algorithms in System Identification

Using Genetic Algorithms in System Identification Usng Genetc Algorthms n System Identfcaton Ecaterna Vladu Deartment of Electrcal Engneerng and Informaton Technology, Unversty of Oradea, Unverstat, 410087 Oradea, Româna Phone: +40259408435, Fax: +40259408408,

More information

From Biot-Savart Law to Divergence of B (1)

From Biot-Savart Law to Divergence of B (1) From Bot-Savart Law to Dvergence of B (1) Let s prove that Bot-Savart gves us B (r ) = 0 for an arbtrary current densty. Frst take the dvergence of both sdes of Bot-Savart. The dervatve s wth respect to

More information

Logistic regression with one predictor. STK4900/ Lecture 7. Program

Logistic regression with one predictor. STK4900/ Lecture 7. Program Logstc regresson wth one redctor STK49/99 - Lecture 7 Program. Logstc regresson wth one redctor 2. Maxmum lkelhood estmaton 3. Logstc regresson wth several redctors 4. Devance and lkelhood rato tests 5.

More information

n α j x j = 0 j=1 has a nontrivial solution. Here A is the n k matrix whose jth column is the vector for all t j=0

n α j x j = 0 j=1 has a nontrivial solution. Here A is the n k matrix whose jth column is the vector for all t j=0 MODULE 2 Topcs: Lnear ndependence, bass and dmenson We have seen that f n a set of vectors one vector s a lnear combnaton of the remanng vectors n the set then the span of the set s unchanged f that vector

More information