An Evolutionary Method of Neural Network in System Identification


International Journal of Intelligent Information Systems 2016; 5(5)

An Evolutionary Method of Neural Network in System Identification

Shuming T. Wang 1, Chi-Yen Shen 1, Yu-Ju Chen 2, Chuo-Yean Chang 3, Rey-Chue Hwang 1, *

1 Department of Electrical Engineering, I-Shou University, Kaohsiung City, Taiwan, R.O.C.
2 Department of Information Management, Cheng Shiu University, Kaohsiung City, Taiwan, R.O.C.
3 Department of Electrical Engineering, Cheng Shiu University, Kaohsiung City, Taiwan, R.O.C.

Email address: smwang@isu.edu.tw (Shuming T. Wang), cyshen@isu.edu.tw (Chi-Yen Shen), ychen@csu.edu.tw (Yu-Ju Chen), cychang@csu.edu.tw (Chuo-Yean Chang), rchwang@isu.edu.tw (Rey-Chue Hwang)

* Corresponding author

To cite this article: Shuming T. Wang, Chi-Yen Shen, Yu-Ju Chen, Chuo-Yean Chang, Rey-Chue Hwang. An Evolutionary Method of Neural Network in System Identification. International Journal of Intelligent Information Systems. Vol. 5, No. 5, 2016.

Received: September 9, 2016; Accepted: October 2, 2016; Published: October 20, 2016

Abstract: This paper presents an evolutionary method for calculating the important degree (ID) of each individual input variable of a well-trained neural network (NN). The importance of each input variable of the neural network can be distinguished according to the ID value obtained. In this research, several linear and nonlinear system identifications were first studied and simulated. The simulation results show that the proposed evolutionary method is quite promising and accurate for the estimation of a system's parameters. In other words, the proposed method could be used for data mining in real applications. In order to verify this inference, the evaporation process of thin film, a real case of industrial application, was also studied. Again, the results show that the proposed method indeed has superiority and potential in the area of data mining.

Keywords: Evolutionary, Important Degree, Neural Network, System Identification

1. Introduction

It is well known that system identification is a method which can identify the mathematical model of an unknown system from the measurements of the system's inputs and outputs. It has been employed in many areas such as industrial processes, control systems, economic forecasting, social science and so on. Many classes of system identification have been studied, such as Volterra and Wiener series [1], NARMAX models [2] and neural networks (NN) [3-7]. Generally, steps like model hypothesis, estimation of parameters and system verification are indispensable for identifying an unknown system.

Recently, data mining has been widely used in the applications of signal processing and system identification. It is the process of analyzing data from different perspectives and then extracting the hidden information from a large database. It can help the researcher to grasp useful information that might otherwise be ignored or missed in the process of data analysis. The tasks of data mining generally include four classes, which are classification, clustering, regression, and association rule learning [8-14]. Among them, regression analysis consists of graphic and analytic methods. It aims to obtain the relationships between the response variable (output) and the other predictor variables (inputs) of an unknown system. Its goal is to express the response variable as a function of the predictor variables. The major uses of regression methodology include model specification and parameter estimation [15]. Model specification is a very important step in the application of system identification. Its objective is the assessment of the relative values of individual predictor variables on the prediction of the response.
Basically, all relevant variables in the database must be analyzed in order to find which variables have the higher correlation with the response. Thus, the parameter estimation of the model is necessary for the regression analysis. Due to its powerful learning and adaptive capabilities, the NN model has been widely studied in the area of data mining, and many research articles have been proposed [16-22]. NN has the ability to deal with many linear or nonlinear modeling problems.

Through learning from the data provided, NN is able to generate a mapping between input and output pairs, bypassing complicated statistical analysis steps. In general, the applications of the NN model in data mining can be classified into four major categories: (1) feed-forward NN, which is mainly used in the areas of prediction and pattern recognition; (2) feedback NN, which is mainly used for associative memory and optimization calculation; (3) self-organization NN, which is mainly used for cluster analysis; (4) random NN, which has the advantages of associative memory and image processing [23].

In this research, NN modeling versus regression analysis was studied. The assessment of the relative values of individual predictor variables (inputs) on the prediction of the response (output) was analyzed based on the NN technique. The supervised-learning feed-forward NN was mainly studied, and the error back-propagation (BP) learning algorithm was used as the training rule of the NN model. The details of the NN structure and its learning algorithm are described in Section 2. Section 3 presents some relevant experiments and results. At last, a conclusion is given in Section 4.

2. NN Model and Its Learning Algorithm

The NN structure commonly known as a multi-layered feed-forward net is used in this study. A three-layered feed-forward NN model, as shown in Figure 1, is the selected topology. Each layer is connected to the layer above it in a feed-forward manner, in the sense that there is no feedback from the same layer or a layer above. All connections have a multiplying weight associated with them. Training is equivalent to finding the proper weights for all connections such that a desired output is generated for a given input set.

Figure 1. A three-layered feed-forward neural network.

In the NN model, the nonlinear activation function of all nodes is the sigmoid function. Its mathematical form is expressed as the following equation and its diagram is shown in Figure 2.

f(v_j) = 1 / (1 + e^(-v_j)),  v_j = Σ_i ω_ji x_i    (1)

where ω_ji is the strength of the connection between node j and node i in the layer below, and x_i is the value of node i.

Figure 2. The diagram of the sigmoid function.

Here, we use a simple three-layered NN model to describe the evolutionary method we propose for calculating the important degree (ID) of each individual input variable of a well-trained NN [24-25]. From Figure 1, it can be clearly seen that the hidden node outputs (H_1, H_2, ..., H_n) and the network output (Y) can be expressed as the following nonlinear forms of the inputs (x_1, x_2, ..., x_m).

Y = 1 / (1 + exp(-(Σ_{j=1}^{n} ω_j H_j + θ_o)))    (2)

H_j = 1 / (1 + exp(-(Σ_{i=1}^{m} ω_ji x_i + θ_j)))    (3)

Y = 1 / (1 + exp(-(Σ_{j=1}^{n} ω_j (1 / (1 + exp(-(Σ_{i=1}^{m} ω_ji x_i + θ_j)))) + θ_o)))    (4)

where ω_ji is the strength of the connection between hidden node j and input node i, ω_j is the strength of the connection between hidden node j and the output node Y, and θ_o and θ_j are bias terms. The sigmoid function is an increasing function whose output values lie within the range [0, 1]; thus, the following relationships can be inferred.

H_j ∝ Σ_{i=1}^{m} ω_ji x_i + θ_j    (5)

Y ∝ Σ_{j=1}^{n} ω_j H_j + θ_o    (6)

Y ∝ Σ_{j=1}^{n} ω_j (Σ_{i=1}^{m} ω_ji x_i + θ_j) + θ_o    (7)

According to the inferences of Eqs. (5), (6) and (7), the important degree (ID) and the percentage important degree (PID) of input x_i to output Y are defined by

ID_i = Σ_{k=1}^{NT} Σ_j ω_j(k) ω_ji(k) x_i(k)    (8)

PID_i = ID_i / Σ_{i=1}^{m} ID_i    (9)

where NT is the total number of input data and m is the number (category) of input variables.
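
As a concrete illustration of Eqs. (8) and (9), the following is a minimal sketch of how ID and PID might be computed from the weights of a trained three-layered sigmoid network. The function name, the NumPy array layout, and the absolute value (added so that positive and negative samples do not cancel for zero-mean inputs) are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def importance_degrees(W_in, w_out, X):
    """Important degree (ID) and percentage important degree (PID) per input.

    W_in  : (n_hidden, n_inputs) weights omega_ji between inputs and hidden nodes
    w_out : (n_hidden,) weights omega_j between hidden nodes and the output
    X     : (NT, n_inputs) the NT data points x_i(k)

    Treating the sigmoid as monotonic (Eqs. (5)-(7)), the contribution of input i
    is approximated by sum_j omega_j * omega_ji * x_i(k), accumulated over the
    NT data points as in Eq. (8).
    """
    proxy = w_out @ W_in                   # sum_j omega_j * omega_ji, shape (n_inputs,)
    ID = np.abs(proxy * X).sum(axis=0)     # absolute value is an illustrative choice
    PID = 100.0 * ID / ID.sum()            # Eq. (9), expressed as a percentage
    return ID, PID
```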

3. Experiments and Results

3.1. Linear and Nonlinear Regression Models

The aim of regression is to express the response variable as a function of the predictor variables. For linear regression, the response variable is supposed to have a linear relationship with the predictor variables. Denoting the response variable by Y and the m predictor variables by x_1, x_2, ..., x_m, the linear relationship takes the form

Y = a_0 + a_1 x_1 + a_2 x_2 + ... + a_m x_m + ε    (10)

In this expression, a_0, a_1, a_2, ..., a_m are unknown parameters to be determined and ε is the noise term.

Nonlinear regression is a form of regression analysis in which the response variable is modeled by a function which is a nonlinear combination of the predictor variables. For instance, a nonlinear function is given as

Y = a_0 + a_1 x_1^2 + a_2 x_2 / (1 + exp(x_1)) + ε    (11)

In this expression, the relationships between Y and (x_1, x_2) are nonlinear. In general, it is very difficult to obtain an exact closed-form expression for the unknown model in nonlinear regression analysis. The data are usually fitted by a method of successive approximations. The present adaptive methods are very successful in linear regression. The application of these methods to nonlinear regression, however, requires several simplifications, and only an approximate model can be obtained.

3.2. NN vs. Linear Regression Model

A linear regression model is given as [24]

Y = x_1 + 0.1 x_2 + 0.01 x_3    (12)

The ratios of the coefficients among the input variables are x_1:x_2 = 10:1, x_1:x_3 = 100:1, and x_2:x_3 = 10:1. That means the degree of importance of x_1 to Y is 10 times that of x_2 to Y, and 100 times that of x_3 to Y, if x_1, x_2 and x_3 have the same statistical distributions. In our studies, Eq. (12) was first considered and 500 points were generated from the equation. Then, this linear model is assumed to be an unknown system which needs to be identified. In the system identification process, five variables, x_1, x_2, x_3, x_4 and x_5, are collected and assumed to be the possible influencing factors. The five variables are generated as uniformly distributed random numbers with mean zero and variance three, taking values in the range [-3, 3]. Table 1 lists the statistics of (x_1, x_2, x_3, x_4, x_5).

Table 1. The statistics of (x_1, x_2, x_3, x_4, x_5).
Variable   Range     Mean   Variance
x_1        -3 ~ 3
x_2        -3 ~ 3
x_3        -3 ~ 3
x_4        -3 ~ 3
x_5        -3 ~ 3

A NN with size m-4-1 was used to model this linear equation. The number of input nodes (m) of the network depends on the variables used. In this study, 400 points were used for the NN's training and 100 points were used for testing. Table 2 lists the testing mean absolute errors (MAE) of the NN modeling with different input variables. From the values of the MAE, we conclude that the NN learned this unknown system efficiently.

Table 2. The MAEs of NN modeling with different input variables.
NN Size           3-4-1             4-4-1                  5-4-1
Input Variables   (x_1, x_2, x_3)   (x_1, x_2, x_3, x_4)   (x_1, x_2, x_3, x_4, x_5)
MAE

The corresponding ID value of each input variable x_i in the well-trained NN models was then calculated. Table 3 lists the ID and PID values calculated for the NN models with three, four and five inputs, respectively. From the results shown in the table, a clear pattern can be observed. For the 3-4-1 NN, the PID ratio of (x_1, x_2, x_3) is (1:0.110:0.010), and the 4-4-1 and 5-4-1 networks give almost the same ratios. It is clearly found that the PID ratios of (x_1, x_2, x_3) in the three NN models are all very close to the ratio (1:0.1:0.01) of the coefficients of (x_1, x_2, x_3) in the original linear equation (12).
The ID and PID values of x_4 and x_5 are much smaller than those of the other variables. This phenomenon shows that the nonlinear NN model is capable of evaluating the importance degree of each input variable to the output, and x_4 and x_5 can be treated as useless terms or noise due to their low ID and PID values.

Table 3. ID and PID values calculated by three NN models.
NN Size   x_1 ID (PID)   x_2 ID (PID)   x_3 ID (PID)   x_4 ID (PID)   x_5 ID (PID)
3-4-1     (89.265%)      (9.846%)       (0.889%)
4-4-1     (88.766%)      (0.067%)       (.03%)         (0.53%)
5-4-1     (89.036%)      (9.872%)       (0.952%)       (0.02%)        (0.2%)
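
As a rough illustration of how this first experiment might be reproduced, the sketch below generates 500 points from Eq. (12), trains a 5-4-1 sigmoid network, and extracts ID and PID in the spirit of Eqs. (8) and (9). It uses scikit-learn's MLPRegressor as a stand-in for the BP-trained network of the paper; the solver, iteration budget, random seeds and the absolute value in the ID accumulation are illustrative assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# 500 points from the linear system of Eq. (12): Y = x1 + 0.1*x2 + 0.01*x3.
# Five candidate inputs are drawn uniformly from [-3, 3]; x4 and x5 are irrelevant.
X = rng.uniform(-3.0, 3.0, size=(500, 5))
y = X[:, 0] + 0.1 * X[:, 1] + 0.01 * X[:, 2]

X_train, y_train = X[:400], y[:400]      # 400 points for training
X_test, y_test = X[400:], y[400:]        # 100 points for testing

# 5-4-1 network with sigmoid (logistic) hidden nodes trained by back-propagation.
net = MLPRegressor(hidden_layer_sizes=(4,), activation='logistic',
                   solver='adam', max_iter=5000, random_state=0)
net.fit(X_train, y_train)
print('test MAE:', np.mean(np.abs(net.predict(X_test) - y_test)))

# ID/PID in the spirit of Eqs. (8)-(9): accumulate sum_j w_j * w_ji * x_i(k)
# over the training points (the absolute value keeps the symmetric inputs from
# cancelling; an illustrative choice).
W_in = net.coefs_[0].T            # (n_hidden, n_inputs), i.e. the omega_ji
w_out = net.coefs_[1].ravel()     # (n_hidden,), i.e. the omega_j
ID = np.abs((w_out @ W_in) * X_train).sum(axis=0)
PID = 100.0 * ID / ID.sum()
print('PID (%):', np.round(PID, 2))   # x1 should dominate; x4, x5 should be near zero
```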

In order to provide further evidence of the validity of the proposed method, 500 points were regenerated. Here x_1, x_2, x_3, x_4 and x_5 are uniformly distributed random numbers with different ranges. Table 4 lists the statistics of (x_1, x_2, x_3, x_4, x_5).

Table 4. The statistics of (x_1, x_2, x_3, x_4, x_5).
Variable   Range       Mean
x_1        -1 ~ 1
x_2        -10 ~ 10
x_3        -5 ~ 5
x_4        -1 ~ 1
x_5        -5 ~ 5

From the statistics in Table 4, the average absolute value of x_2 should be 10 times the average absolute value of x_1, and the average absolute value of x_3 should be 5 times the average absolute value of x_1. Multiplying by the coefficient of each variable in Eq. (12), the important degrees of x_1, x_2 and x_3 to Y should be 1:1:0.05. Table 5 lists the testing mean absolute errors (MAE) of the NN modeling with different input variables. From the values of the MAE, we again conclude that the NN learned the system efficiently.

Table 5. The MAEs of NN modeling with different input variables.
NN Size           3-4-1             4-4-1                  5-4-1
Input Variables   (x_1, x_2, x_3)   (x_1, x_2, x_3, x_4)   (x_1, x_2, x_3, x_4, x_5)
MAE

Again, we calculated the ID and PID values from the three well-trained NN models. Table 6 lists the ID and PID values calculated by the NN models with three, four and five inputs, respectively. From the results shown in the table, the PID ratio of (x_1, x_2, x_3) is (1:0.982:0.049) for the 3-4-1 NN model; for the 4-4-1 NN model, the PID ratio of (x_1, x_2, x_3, x_4) is (1:0.990:0.0498:0.0027); for the 5-4-1 NN model, the corresponding PID ratio of (x_1, x_2, x_3) is (1:0.985:0.0475). It is clearly found that the ratios of (x_1, x_2, x_3) in the three NN models are all very close to the ratio (1:1:0.05) of (x_1, x_2, x_3) derived from the original linear equation. The ID and PID values of x_4 and x_5 are much smaller than those of x_1, x_2 and x_3.

Table 6. ID and PID values calculated by three NN models.
NN Size   x_1 ID (PID)   x_2 ID (PID)   x_3 ID (PID)   x_4 ID (PID)   x_5 ID (PID)
3-4-1     (49.24%)       (48.36%)       (2.40%)
4-4-1     (48.95%)       (48.47%)       (2.44%)        (0.3%)
5-4-1     (49.0%)        (48.38%)       (2.33%)        (0.03%)        (0.5%)

As in the previous study, this phenomenon shows that the nonlinear NN model is able to evaluate the important degree of each input variable to the output, and x_4 and x_5 are inferred to be useless terms or noise due to their low ID and PID values.

3.3. NN vs. Nonlinear Regression Model

In this section, nonlinear system identification by the NN model is studied. A nonlinear equation is given by [24]

Y = x_1 + x_2 + x_1 x_2 x_3    (13)

500 points were generated from the equation. x_1, x_2, x_3, x_4 and x_5 are assumed to be the possible influencing factors. All five variables are uniformly distributed random numbers with the statistics listed in Table 7.

Table 7. The statistics of (x_1, x_2, x_3, x_4, x_5).
Variable   Range   Mean   Variance
x_1
x_2
x_3
x_4
x_5

In our study, 400 points were used for the NN's training and 100 points were used for testing. Table 8 presents the MAEs of the NN modeling with different input variables. It shows that the NN learned the system efficiently.

Table 8. The MAEs of NN modeling with different input variables.
Input Variables   (x_1, x_2, x_3)   (x_1, x_2, x_3, x_4)   (x_1, x_2, x_3, x_4, x_5)
MAE

Table 9 presents the ID and PID values calculated by the NN models with three, four and five inputs, respectively. From the results shown in the table, the ratios of the ID and PID of (x_1, x_2, x_3) in the three NN models are almost equal. As in the linear system studies, the ID and PID values of x_4 and x_5 are much smaller than those of x_1, x_2 and x_3. In other words, the variables (x_1, x_2, x_3) can be concluded to be the real useful inputs to Y, and (x_4, x_5) can be inferred to be noise terms.
Table 9. ID and PID values calculated by three NN models.
NN Inputs   x_1 ID (PID)   x_2 ID (PID)   x_3 ID (PID)   x_4 ID (PID)   x_5 ID (PID)
3 inputs    (46.03%)       (45.57%)       (8.3%)
4 inputs    (45.47%)       (46.2%)        (8.33%)        (0.08%)
5 inputs    (45.62%)       (46.03%)       (8.8%)         (0.0%)         (0.06%)

In fact, unlike linear modeling, no clear information about the important degree of each individual input variable to the output can be obtained analytically in nonlinear modeling. However, from the PID results shown in Table 9, we find that the important degrees of x_1 and x_2 are almost the same. This phenomenon can be observed from Eq. (13) when both x_1 and x_2 have the same statistical distribution.
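
The nonlinear case follows the same recipe; only the target function changes. The sketch below again uses scikit-learn's MLPRegressor as a stand-in for the BP-trained network, together with an assumed positive input range of [1, 2], illustrative seeds, and the product-form reading of Eq. (13) given above; it trains networks with three, four and five candidate inputs so that the stability of the PID ranking reported in Table 9 can be checked.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def pid_from_net(net, X):
    """PID (in percent) from a trained one-hidden-layer MLPRegressor, per Eqs. (8)-(9)."""
    W_in = net.coefs_[0].T           # omega_ji, shape (n_hidden, n_inputs)
    w_out = net.coefs_[1].ravel()    # omega_j, shape (n_hidden,)
    ID = np.abs((w_out @ W_in) * X).sum(axis=0)   # absolute value is an illustrative choice
    return 100.0 * ID / ID.sum()

rng = np.random.default_rng(1)
X = rng.uniform(1.0, 2.0, size=(500, 5))                 # assumed positive input range
y = X[:, 0] + X[:, 1] + X[:, 0] * X[:, 1] * X[:, 2]      # target of Eq. (13); x4, x5 unused

# Offer 3, 4 and 5 candidate inputs in turn; the PID ranking should stay consistent.
for m in (3, 4, 5):
    net = MLPRegressor(hidden_layer_sizes=(4,), activation='logistic',
                       solver='adam', max_iter=5000, random_state=2)
    net.fit(X[:400, :m], y[:400])
    print(f'{m}-4-1 network, PID (%):', np.round(pid_from_net(net, X[:400, :m]), 2))
```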

Similar to the previous study, nonlinear system identification by the NN model is studied further. A nonlinear equation is given by [25]

y = 11*sin(π*x_1*sin(π*x_3/2)/2) + 4*sin(π*x_2/2)    (14)

1000 points were generated from the equation and the system was then assumed to be unknown. 800 points were used for the NN's training and 200 points were used for testing. Four variables, x_1, x_2, x_3 and x_4, are assumed to be the possible influencing factors of the system's output y. The four input variables are uniformly distributed random numbers with the statistics listed in Table 10.

Table 10. The statistics of (x_1, x_2, x_3, x_4).
Variable   Range   Mean   Variance
x_1        0 ~ 1
x_2        0 ~ 1
x_3        0 ~ 1
x_4        0 ~ 1

A NN of size 4-5-1 was used to perform the model identification. Table 11 presents the simulation results, including the training and test MAEs of the NN and the ID and PID calculations of (x_1, x_2, x_3, x_4) for the NN modeling with four inputs.

Table 11. The simulation results of NN modeling for the nonlinear equation.
Training MAE:          Test MAE:
Variables   x_1       x_2       x_3       x_4
ID
PID         53.63%    28.53%    17.21%    0.63%

In this study, since the ranges of (x_1, x_2, x_3, x_4) are all within [0, 1], the angles of the sine functions are within the range [0, π/2]. From the nonlinear Eq. (14), it can be observed that the important degree of x_2 relative to x_1 should be about 0.5. From the simulation results, the PID ratio of (x_1, x_2, x_3, x_4) is (1:0.5320:0.3209:0.0117). The PID ratio of (x_1, x_2, x_3) is very close to the ratio expected from the coefficients of (x_1, x_2, x_3) in the original nonlinear equation. The ID and PID values of x_4 are very small; thus, x_4 can be treated as a useless term or noise.

In the above studies of NN versus nonlinear regression models, the simulation results show that the NN model is also capable of obtaining the important degree of each input variable to the system output. In order to further prove this point of view, an example of real industrial system identification is studied in the following section.

3.4. NN vs. Industrial System

It is known that the touch panel (TP) has become an indispensable part of many electronic appliances such as computers, mobile phones, ticket vendors, etc. Usually, a thin plate with a coating film is stuck on the screen of the TP as a decoration and protection film. Basically, the film's coating process is accomplished by an evaporator under vacuum conditions. However, before the evaporation process takes place, the relevant control parameters of the evaporator need to be set precisely in order to ensure that the whole coating process can be accomplished successfully. Thus, these control parameters can be treated as the influencing factors of the evaporation process. Transmittance is an important optical property of the TP film. Its value is highly correlated with the influencing factors of the evaporation process. The relationship between the transmittance and its relevant influencing factors is very complex and nonlinear. Thus, how to use the NN to find the real influencing factors of the TP film's transmittance is the aim of the study in this section.

In our research, a TP film with a two-layer coating is studied. The coating targets are Cr and Cr2O3. The information collected for the experiments includes the value of the quartz crystal deposition monitor (x_1), the rotation speed of the holder (x_2), the substrate position of the panel placed (x_3), the Cr thickness (x_4), the Cr2O3 thickness (x_5) and the TP transmittance (y). The complex relationship between y and its possible influencing factors (x_1, x_2, x_3, x_4, x_5) is expected to be obtained by the NN model.
In order to fairly demonstrate the feasibility of the NN model and the proposed computational technique, two data sets, named 1-1 and 1-2, are re-organized randomly from the original database. For each data set, 100 points are used for the NN's training and 44 points are used for testing. In addition to the MAE, the mean absolute percentage error (MAPE) is also used as a measurement of NN performance.

Table 12. The error statistics performed by the five-input NN model.
Inputs: x_1, x_2, x_3, x_4, x_5
Data Set   MAE   MAPE
1-1              2.03%
1-2              2.18%
Avg.             2.10%

Table 13. ID and PID values calculated by the five-input NN model.

Training Data   x_1      x_2       x_3      x_4       x_5
ID (1-1)
ID (1-2)
Avg.
PID (1-1)       8.24%    2.77%     1.1%     45.5%     42.37%
PID (1-2)       4.27%    18.53%    0.42%    42.56%    34.22%
Avg.            6.26%    10.65%    0.76%    44.03%    38.30%

Test Data       x_1      x_2       x_3      x_4       x_5
ID (1-1)
ID (1-2)
Avg.
PID (1-1)       8.68%    3.93%     1.13%    44.67%    41.59%
PID (1-2)       4.34%    17.94%    0.45%    42.83%    34.44%
Avg.            6.51%    10.93%    0.79%    43.75%    38.01%

Table 12 lists the MAEs and MAPEs of the transmittance estimations performed by the five-input NN model. From the MAPEs shown, we conclude that the NN model was trained efficiently. The ID and PID values for all inputs were calculated and are listed in Table 13.
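
The text does not spell out the two error measures; the following is a minimal sketch using their standard definitions (with MAPE reported in percent), which is presumably what Tables 12, 14 and 15 report.

```python
import numpy as np

def mae(y_true, y_pred):
    """Mean absolute error of the transmittance estimates."""
    y_true, y_pred = np.asarray(y_true, dtype=float), np.asarray(y_pred, dtype=float)
    return np.mean(np.abs(y_true - y_pred))

def mape(y_true, y_pred):
    """Mean absolute percentage error, in percent (assumes y_true is never zero)."""
    y_true, y_pred = np.asarray(y_true, dtype=float), np.asarray(y_pred, dtype=float)
    return 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))
```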

From the PID values shown, it is clearly found that the Cr thickness (x_4) and the Cr2O3 thickness (x_5) are the two most important influencing factors for the TP transmittance. The value of the quartz crystal deposition monitor (x_1) and the rotation speed of the holder (x_2) also have certain impacts on the transmittance. Among these five input variables, the substrate position of the panel placed (x_3) can be ignored due to its small PID value. In order to prove this conclusion, the transmittance estimations by the NN model with different input combinations were redone. Table 14 lists the MAEs and MAPEs of the transmittance estimations performed by the NN model with different four-input combinations. Table 15 lists the MAEs and MAPEs of the transmittance estimations performed by the NN model with different three-input combinations. From the results shown in these tables, it is clearly found that the conclusion we made in accordance with the values of ID and PID is correct. In other words, the evolutionary NN technique we proposed is very promising and has great potential in real applications of system identification.

Table 14. The error statistics performed by the NN model with four inputs.
Inputs: x_1, x_2, x_4, x_5
Data Set   MAE   MAPE
1-1              1.76%
1-2              2.19%
Avg.             1.98%
Inputs: x_1, x_3, x_4, x_5
Data Set   MAE   MAPE
1-1              2.62%
1-2              3.45%
Avg.             3.03%
Inputs: x_2, x_3, x_4, x_5
Data Set   MAE   MAPE
1-1              2.59%
1-2              3.24%
Avg.             2.91%

Table 15. The error statistics performed by the NN model with three inputs.
Inputs: x_1, x_4, x_5
Data Set   MAE   MAPE
1-1              2.51%
1-2              2.77%
Avg.             2.64%
Inputs: x_2, x_4, x_5
Data Set   MAE   MAPE
1-1              1.74%
1-2              2.19%
Avg.             1.96%
Inputs: x_3, x_4, x_5
Data Set   MAE   MAPE
1-1              5.33%
1-2              5.17%
Avg.             5.25%

4. Conclusion

This research presents an evolutionary NN technique for the application of system identification. Linear and nonlinear systems and a real industrial system were studied and simulated. The study results clearly show that the proposed technique is feasible and advantageous in the application of system identification. The computational measures denoted ID and PID can extract the useful and important inputs to the system output. In other words, this study proposes a new system identification technique based on a supervised NN, and this technique has potential in the area of data mining with large databases.

In this research, the activation function for all nodes of the NN is the sigmoid function. The ID and PID values defined are also derived from the characteristics of the sigmoid function. Thus, it is believed that a NN model with any node transfer function which has the same characteristics as the sigmoid function certainly has a similar capability in the application of data mining. For instance, the hyperbolic tangent function is also a choice.

It is well known that a NN with the BP learning algorithm usually suffers from the local minimum problem in its learning process. An ill-trained NN model might have generalization and accuracy problems if the NN model really falls into a local minimum. This condition might cause incorrect ID and PID values to be calculated from the ill-trained NN model. Thus, we have to emphasize that the NN model must be trained efficiently to ensure that correct ID and PID information can be obtained.

Acknowledgements

This research was supported by the Ministry of Science and Technology, Taiwan, R.O.C. under Contract No. MOST E.

References

[1] M. Schetzen, "The Volterra and Wiener Theories of Nonlinear Systems". Wiley, 1980.

[2] S. A. Billings, "Nonlinear System Identification: NARMAX Methods in the Time, Frequency, and Spatio-Temporal Domains". Wiley, 2013.

[3] O. Nelles, "Nonlinear System Identification: From Classical Approaches to Neural Networks". Springer Verlag, 2001.
[4] M. Letta, Dynamic multivariate B-spline neural network design using orthogonal least squares algorithm for non-linear system identification, 2014 18th International Conference on System Theory, Control and Computing, ICSTCC 2014, 2014.

[5] K. J. Nidhil Wilfred, S. Sreeraj, B. Vijay, V. Bagyaveereswaran, System identification using artificial neural network, IEEE International Conference on Circuit, Power and Computing Technologies, ICCPCT 2015, 2015.

[6] Hector M. Romero Ugalde, J. C. Carmona, R. R. Juan, Victor M. Alvarado, J. Mantilla, Computational cost improvement of neural network models in black box nonlinear system identification, Neurocomputing, vol. 166, 2015.

[7] Leandro L. S. Linhares, José M. Araújo, Fábio M. U. Araújo, T. Yoneyama, A nonlinear system identification approach based on Fuzzy Wavelet Neural Network, Journal of Intelligent and Fuzzy Systems, vol. 28, no. 1, 2015.

[8] P. Adriaans, D. Zantinge, Data Mining, Addison-Wesley Longman, 1996.

[9] L. Guan, H. J. Lang, Data warehouse and data mining, Microcomputer Applications, vol. 5, no. 9, pp. 7-20, 1999.

[10] J. S. Feng, KDD and its applications, BaoGang Techniques, vol. 3, pp. 27-31, 1999.

[11] I. H. Witten, E. Frank, Data Mining: Practical Machine Learning, Morgan Kaufmann.

[12] J. Han and M. Kamber, Data Mining: Concepts and Techniques, Morgan Kaufmann, 2001.

[13] D. Hand, Principles of Data Mining, Massachusetts Institute of Technology, 2001.

[14] G. Wang, D. Huang, The summary of the data mining technology, Computer Application Technology, vol. 69, pp. 9-14.

[15] R. F. Gunst, R. L. Mason, Regression Analysis and Its Application: A Data-Oriented Approach, Marcel Dekker Inc., New York, 1980.

[16] G. Towell, J. W. Shavlik, The extraction of refined rules from knowledge-based neural networks, Machine Learning, vol. 13, pp. 71-101, 1993.

[17] H. Lu, R. Setiono, H. Liu, Effective data mining using neural network, IEEE Transactions on Knowledge and Data Engineering, vol. 8, no. 6, 1996.

[18] Z. Lu, L. Jing, The research of data mining based on neural networks, Computer Engineering and Application, vol. 3.

[19] S. Zhang, Research of rule extraction and classification algorithm based on neural network, Master dissertation, Harbin Engineering University, China.

[20] L. P. Duan, L. J. Zhou, Y. Wang, Data mining based on neural networks, Techniques of Automation & Applications, vol. 7, pp. 2-9.

[21] L. Li, B. Zhang, M. Yang, Data mining algorithm based on fuzzy neural network, Computer Engineering, vol. 33.

[22] X. Ni, Research of data mining based on neural networks, World Academy of Science, Engineering and Technology, vol. 39.

[23] S. Nirkhi, Potential use of artificial neural network in data mining, in Proc. 2nd Int. Conf. Computer and Automation (ICCAE), 2010, vol. 2.

[24] P. T. Hsu, The studies of data mining by using neural network, Master Thesis, I-Shou University, Taiwan, 2012.

[25] J. C. Chen, The practical study of neural network in data mining, Master Thesis, I-Shou University, Taiwan, 2013.
